The Cloud

The idea of ‘cloud’ computing has been around since the first web-based email services, but only in the past few years has the term become mainstream.
Here I look into software that runs somewhere other than on your PC.

After a few years on hold it’s great to be back at CITC, this time in the British Motor Museum. The video presentation covers a short (if speedy) introduction to Node-RED and its ability to integrate systems through their APIs.

Demo 2 is of particular note: it shows how a user who visits a malicious website can have their internet access revoked until a helpdesk agent has a chance to clean up their machine, at which point the agent can grant access again without ever having to log in to the firewall controlling that access.
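The video doesn’t dwell on the underlying calls, but as a rough illustration of the kind of API interaction involved – assuming a Palo Alto Networks firewall, a block rule built on a Dynamic Address Group matching a ‘quarantine’ tag, and a hypothetical API key – a sketch in PowerShell might look like this:

# Hypothetical sketch: quarantine a user's IP on a PAN-OS firewall by
# registering a tag that a Dynamic Address Group (and a block rule) matches on.
$firewall = "firewall.example.com"   # assumed management address
$apiKey   = "REDACTED"               # assumed PAN-OS API key

# User-ID XML payload registering the 'quarantine' tag against the user's IP
$payload = @"
<uid-message>
  <version>1.0</version>
  <type>update</type>
  <payload>
    <register>
      <entry ip="192.0.2.10">
        <tag><member>quarantine</member></tag>
      </entry>
    </register>
  </payload>
</uid-message>
"@

# Send it to the XML API - no interactive firewall login required
Invoke-RestMethod -Method Post -Uri "https://$firewall/api/" -Body @{
    type = "user-id"
    key  = $apiKey
    cmd  = $payload
}

Swapping <register> for <unregister> in the payload is the ‘grant access again’ step once the machine has been cleaned up.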


When setting up a GlobalProtect Portal/Gateway with Azure AD you may find you receive the error message below:

AADSTS700016: Application with identifier <Entity ID> was not found in the directory ‘<Directory ID>’.

The fix here is easy – the GlobalProtect client appends a :443 to the end of the portal address, so the Identifier (Entity ID) configured in Azure AD must include it too. This isn’t mentioned in the guide from Microsoft (https://docs.microsoft.com/en-us/azure/active-directory/saas-apps/palo-alto-networks-globalprotect-tutorial) but is in the guide from Palo Alto Networks (https://knowledgebase.paloaltonetworks.com/KCSArticleDetail?id=kA10g0000008U48CAE).

Interestingly the Reply URL doesn’t strictly require the :443 (although the Palo Alto guide includes it there too), but either way it’s an easy fix.
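As a worked example (assuming a portal at the hypothetical address vpn.example.com), the two values in the Azure AD enterprise application would look like:

  Identifier (Entity ID): https://vpn.example.com:443/SAML20/SP
  Reply URL (ACS):        https://vpn.example.com:443/SAML20/SP/ACS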

It’s been a while since my last post and I do hope to sort that out, but for now here’s another quick mention of some work with Ruckus Cloudpath.

Although Cloudpath is massively flexible in its design, I’ve come across a few niche cases where administrators would like a single DPSK pool (which is bound to a single SSID) but where different users have different expiry dates on their DPSKs. Thus far I’m planning on interacting with the API via Node-RED to update these entries as the provisioning process takes place – something for another blog post.

However, for those who are just getting to grips with the API (using PowerShell in my case) I hope the short example in this GitHub repo can be of use: https://github.com/jamesfed/RuckusCloudpathAPI.
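To give a feel for the shape of it – and to be clear, the endpoint paths and property names below are illustrative assumptions, so check the API documentation built into your own Cloudpath system – a call to update the expiry on a single DPSK might look something like this:

# Illustrative sketch only - endpoints and property names are assumptions;
# see the API documentation on your Cloudpath system for the real ones.
$cloudpath = "https://cloudpath.example.com"    # hypothetical server address
$cred      = Get-Credential                     # an API-enabled admin account

# Authenticate and capture a token for later calls (hypothetical endpoint)
$token = Invoke-RestMethod -Method Post -Uri "$cloudpath/admin/publicApi/token" -Body @{
    userName = $cred.UserName
    password = $cred.GetNetworkCredential().Password
}

# Update the expiry date on one DPSK entry (hypothetical endpoint and body)
Invoke-RestMethod -Method Put `
    -Uri "$cloudpath/admin/publicApi/dpskPools/POOL_GUID/dpsks/DPSK_GUID" `
    -Headers @{ Authorization = $token.token } `
    -ContentType "application/json" `
    -Body (@{ expirationDate = "2022-08-31T23:59:59Z" } | ConvertTo-Json)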

It’s a bit of an odd situation, but sometimes you might want to take information from a cloud service – in this case Cortex XDR from Palo Alto Networks – and pull it into an on-premises logging service. This guide looks at how to get this log data in, as well as how to parse it so that you can break out the individual fields in each log entry.

In looking at the documentation it appears that the logs are in the Common Event Format (CEF) but are then wrapped up in syslog for transmission. Although Graylog can absorb CEF directly, this additional layer of syslog means we have to take in the syslog and then send the event messages through a processing pipeline in Graylog to extract the CEF data.

So on to the guide, which assumes you are familiar with the operation of the Cortex XDR management console and Graylog (version 3.3 is shown). For simplicity the code snippet you’ll need, hosted on GitHub, is also shown below.

Code snippet from the screenshot sequence:
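The full snippet is in the GitHub gist shown in the screenshots, but the rough shape of the pipeline rule is along these lines – assuming the raw event lands in the message field and that the parse_cef() function from Graylog’s bundled CEF plugin is available (if your build lacks it, a grok/key_value extraction would stand in):

rule "Cortex XDR - extract CEF from syslog"
when
  contains(to_string($message.message), "CEF:")
then
  // drop the syslog header by capturing everything from "CEF:" onwards
  let cef = regex("(CEF:.*)", to_string($message.message));
  // parse the CEF payload and merge its fields onto the message
  set_fields(parse_cef(to_string(cef["0"]), true));
end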

In configuring the Microsoft Intune Certificate Connector and attempting to issue certificates to your clients via Intune, you might run into the error message below.

IssuePfx – COMException: System.Runtime.InteropServices.COMException (0x80094800): The requested certificate template is not supported by this CA. (Exception from HRESULT: 0x80094800)
at CERTENROLLLib.IX509CertificateRequestPkcs10V2.InitializeFromTemplateName(X509CertificateEnrollmentContext Context, String strTemplateName)
at Microsoft.Management.Services.NdesConnector.MicrosoftCA.GetCertificate(PfxRequestDataStorage pfxRequestData, String& certificate, String& password)

Failed to issue Pfx certificate for Device ID 24c2445e-6cd2-4629-a942-081bdaca9b12 :

In short, when configuring the certificate template name to be used you’ve probably entered the ‘Template display name’ instead of the ‘Template name’ – note the difference in the screenshot, where the template name doesn’t include any spaces.
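If you’re not sure what the CA actually calls the template, running the built-in certutil tool on the issuing CA lists every template it can issue, with the short ‘Template name’ appearing before the colon on each line:

  certutil -CATemplates

The same value is also shown in the ‘Template name’ field on the General tab of the template in the Certificate Templates console.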

Given the complexity of this feature, I’ve found the guide at this link really handy when setting it up in the past:

https://techcommunity.microsoft.com/t5/intune-customer-success/support-tip-configuring-and-troubleshooting-pfx-pkcs/ba-p/516450

In putting together a small RDS (Session Based) environment on Server 2016 today I kept running across the error message below during the installation.

Failed: Unable to install the role services.

After much back and forth between forums and Event Viewer it turned out our default policy of disabling TLS 1.0 on servers was the issue. Enabling TLS 1.0 (through the registry or with the fantastic IIS Crypto – https://www.nartac.com/Products/IISCrypto) ended up sorting the issue for us.
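For reference, here is a minimal sketch of the registry change in PowerShell – the same keys IIS Crypto flips through its GUI:

# Re-enable TLS 1.0 for inbound (server-side) connections via Schannel
$key = "HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server"
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name "Enabled" -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $key -Name "DisabledByDefault" -Value 0 -PropertyType DWord -Force | Out-Null
# A reboot is required before Schannel picks up the change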

One of my favourite features of PowerShell is the Invoke-RestMethod cmdlet, which (among a great many other things) can download the data from an RSS feed. One application I’ve found for this is staying on top of security bulletins from organisations like Adobe and Drupal.

However, just downloading the data from the feed and kicking it out in an email isn’t quite good enough for my needs, so the script below gets its data from a CSV which contains the URL of each feed as well as some extra details to inject into any email notification (e.g. a link to the guide on how to deploy Adobe updates).
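As a minimal sketch of the approach (the full Get-Rss script attached below does far more; the CSV column names here are assumptions):

# Minimal sketch - assumed CSV columns: Name,Url,Notes
$feeds = Import-Csv -Path .\feeds.csv
foreach ($feed in $feeds) {
    # Invoke-RestMethod returns one object per <item> in the RSS feed
    Invoke-RestMethod -Uri $feed.Url | Select-Object -First 5 | ForEach-Object {
        [pscustomobject]@{
            Feed  = $feed.Name
            Title = $_.title   # on some feeds title/link arrive as XML or arrays;
            Link  = $_.link    # the full script normalises these (see the update note below)
            Notes = $feed.Notes
        }
    }
}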

In my production environment this script creates tickets on a Freshdesk helpdesk to log and manage any new update notifications. In the attached example below the script just fires off email notifications.

Have a look at the screenshot sequence below for more info!

  Get-Rss (4.0 KiB)

Update 09/05/2017 – v0.2 – now handles XML and arrays in the link and title objects (good for Reddit and Blogspot!)

As some readers may know I currently work in Higher Education, and while all of the business data is trivial to back up, providing any level of backup service to students and academics is significantly harder. The challenges include the myriad of operating systems in use (Windows/OSX/Linux), the fact that the devices being backed up are inherently ‘untrusted’ (i.e. owned by the individual), and that they are often on networks (be it eduroam/public/home) that have no direct connectivity back to the internal trusted network.

Most enterprise-class backup systems just aren’t suited to this kind of environment, either because they cannot be securely published through a firewall or because their licensing costs are exorbitant for the number of devices to be protected (a few file servers vs 500+ student-owned laptops).

One solution to this issue cropped up at a recent trade show where Synology were demonstrating their DiskStation Manager NAS software, which sets itself apart from the traditional enterprise backup solutions with…

  • Support for up to 16,000 users on high-end models (and 2,048 on the kind of model that we would consider using) with no extra licensing costs; users can have storage quotas set either by group or per user
  • Secure remote access (simply publish a single port, which can be protected by HTTPS for encryption in transit)
  • Home-grown backup clients for modern versions of Windows, OSX/macOS and Linux
  • On the point of OSX/macOS, the Synology backup client does not rely on Time Machine and so overcomes the issues associated with having to be on the same network as your backup device
  • The Btrfs file system, which auto-detects (and fixes) corrupted files through metadata checksums, along with extensive snapshot support
  • Up to 32 recovery points and real-time file protection (when connected to the DiskStation)

So time for some screenshots! Below we have the initial setup of DiskStation Manager and the installation of the client on a Windows PC.

Then restoring a file that has been deleted on the Windows PC; note that you can restore either individual files or entire folders to a point in time.

The same but for OSX…

So that’s all of the good; the only downside we have found thus far is that while shared folders can be protected with encryption, it is not possible to protect each individual home area (per user) with a unique encryption key, which opens up issues with data privacy. However, if you consider the following scenario…

  • A business needs to provide backup to remote workers
    • Those remote workers do not connect to the trusted network often
      • Perhaps they don’t like VPNs/DirectAccess (which rules out using Offline Files)
    • and those remote workers do not use a commercial ‘cloud’ service to protect their data
      • Perhaps trusting a 3rd party to host the data is not an option
    • The remote workers use OSX/macOS

…then using a Synology DiskStation should be a serious consideration for that business.