Category Archives: General

General posts, usually related to networking and administration.

Running Azure PowerShell Workflows without Azure Automation

While developing some Workflows recently, I was frustrated by something that seemed so simple to resolve. I’d log in to Azure, run my parallel Workflow to start up a set of VMs in a Resource Group, and get slammed with a boatload of errors.

This is the code I was running. It is deliberately not designed to run in Azure Automation: doing so requires a significant number of steps that I don’t want to perform, so I’m not using the RunAs account. It’s just a few lines of PowerShell designed to start up the VMs in a resource group quickly.
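
The shape of it was along these lines – a sketch rather than my exact script, with an illustrative resource group name:

```powershell
# Sketch: start every VM in a resource group in parallel.
workflow Start-RGVMs {
    param ([string]$ResourceGroupName)
    $VMs = Get-AzureRmVM -ResourceGroupName $ResourceGroupName
    foreach -parallel ($VM in $VMs) {
        # Each parallel child runs in its own session.
        Start-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VM.Name
    }
}

Login-AzureRmAccount
Start-RGVMs -ResourceGroupName 'MyResourceGroup'   # the children error out
```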

This would just blow up. The problem was clear enough: as a child of the parent, the parallel execution has no knowledge of my account being logged in, and I just wasn’t sure how to resolve that. Many moons ago I tinkered with using Save-AzureRmProfile to save myself from having to log in every time I wanted to do something in Azure, but I fell out of love with the solution since the saved profile never seemed to last very long. After some Googling, I happened upon an issue report on GitHub for Azure PowerShell that seemed relevant.

The issue report was answered by Mark Cowl, who advised that the child job has no knowledge of the profile and that, by passing the profile into the child job in the right place using Select-AzureRmProfile, you could work around the problem. The lightbulb came on and I altered my script and Workflow to:
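
In sketch form (the path, names and throttle value are illustrative, not my exact script), the fix amounts to saving the profile after logging in and selecting it again inside each parallel child:

```powershell
workflow Start-RGVMs {
    param ([string]$ResourceGroupName, [string]$ProfilePath)
    $null = Select-AzureRmProfile -Path $ProfilePath
    $VMs = Get-AzureRmVM -ResourceGroupName $ResourceGroupName
    foreach -parallel -throttlelimit 5 ($VM in $VMs) {
        # Each parallel child has no knowledge of the parent's login,
        # so the saved profile is selected again here.
        $null = Select-AzureRmProfile -Path $ProfilePath
        Start-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VM.Name
    }
}

Login-AzureRmAccount
Save-AzureRmProfile -Path "$env:TEMP\AzureProfile.json" -Force
Start-RGVMs -ResourceGroupName 'MyResourceGroup' -ProfilePath "$env:TEMP\AzureProfile.json"
```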

Simple when you think about it.

You’ll notice that the foreach -parallel has a throttlelimit set. I have my reasons for including this; yours may differ, so feel free to remove the throttle altogether or adjust it appropriately.



Set language, culture and timezone using PowerShell

As much as I’d like to say this is a perfect one-liner solution, it isn’t. Technically, it doesn’t entirely count as PowerShell either. This article is written for people deploying from Azure Marketplace images, which always get the US locale even when you need it to be GB. Clearly, if the image you deploy from has been customised, or is already based on the en-GB image of your OS, then you probably won’t have these problems, but when you deploy from the Marketplace image, you get en-US settings.

I’ve seen the cmdlets introduced in Windows 8 and Windows Server 2012 that allow you to set the language, culture and timezone, but these are lacking. The primary issue is that they provide no method to affect the Default user account, so any time a new user logs on, they get en-US language settings, which for anyone in the UK is a huge pain. The other issue is compatibility: not everyone is ready to deploy Windows Server 2016 or even Windows Server 2012, and so they are stuck on Windows Server 2008 R2, for better or worse.

If you’re deploying in Azure, Microsoft provide Marketplace images that make life very easy compared to rolling your own image and deploying from that. While Managed Disks are the first step towards removing the frustration of handling your own custom images, they’re brand new, and as with anything new, it takes time for them to bed in and become part of a well-honed process.

With all this background covered, there are undoubtedly full PowerShell solutions that will achieve the same as what I’m about to give, but they are also unquestionably more complicated to deploy and maintain.

To resolve the issue of deploying from Azure Marketplace images that have en-US set as default across the board, I implement a CustomScriptExtension for every VM build. I won’t cover the detail of the approach here, but I will provide a link to an Azure article that I contributed to, which covers using Custom Script Extensions, specifically with private Azure Blob Storage. I sent the pull request to update the document about 5 days ago but it’s not been accepted yet. The PR is here if you cannot see any mention of my changes in the article yet.

The solution I implement requires two files to be downloaded from private Blob Storage: one is a PowerShell script; the other, an XML file containing the language settings. The CustomScriptExtension then runs a command line which invokes the PowerShell script and changes the language settings and timezone, among other things.

I’ve deployed thousands of Azure VMs, and this CustomScriptExtension has been instrumental in saving days of manually changing settings. The following code is the PowerShell (well, OK, there’s some PowerShell in there) to initialise the VM with language, locale, culture and timezone, as well as to format/prepare any RAW data disks – which you might not want to do, depending on whether you require Storage Spaces/striping.
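
In sketch form, the script looks something like the following – the file and timezone names are assumptions, and the disk-formatting section uses the Storage cmdlets, which need Windows Server 2012 or later (on 2008 R2 you’d substitute diskpart):

```powershell
# Initialise-VM.ps1 - sketch; file and timezone names are illustrative
control.exe intl.cpl,,/f:"$PSScriptRoot\UKRegion.xml"   # apply en-GB settings from XML
tzutil.exe /s "GMT Standard Time"                       # set the timezone (2008 R2+)

# Format/prepare any RAW data disks - remove this section if you plan to
# use Storage Spaces/striping instead. Requires the Storage cmdlets (2012+).
Get-Disk | Where-Object { $_.PartitionStyle -eq 'RAW' } |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS -Confirm:$false
```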

You’ll notice that line 2 actually calls control.exe and imports an XML file called UKRegion.xml. Here’s the contents of that file – this is specifically coded for UK (en-GB) language, home location etc. so if you’re elsewhere in the world, you’ll need your own location’s codes and what not.
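
A sketch of what that file looks like for en-GB – the exact elements in my file may differ, but GeoID 242 is the code for the United Kingdom and 0809:00000809 is the UK keyboard layout:

```xml
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
    <gs:UserList>
        <gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
    </gs:UserList>
    <gs:UserLocale>
        <gs:Locale Name="en-GB" SetAsCurrent="true"/>
    </gs:UserLocale>
    <gs:SystemLocale Name="en-GB"/>
    <gs:InputPreferences>
        <gs:InputLanguageID Action="add" ID="0809:00000809" Default="true"/>
    </gs:InputPreferences>
    <gs:MUILanguagePreferences>
        <gs:MUILanguage Value="en-GB"/>
    </gs:MUILanguagePreferences>
    <gs:LocationPreferences>
        <gs:GeoID Value="242"/>
    </gs:LocationPreferences>
</gs:GlobalizationServices>
```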

If you’re not trying to automate Azure builds and tinkering with JSON ARM templates isn’t required, then you need only the PowerShell and XML as shown above. Simply copy the PowerShell code and XML code into two separate files (named appropriately), drop them onto the system you want to update and run the PowerShell. If you don’t want the disk format section, remove it. I should warn you that you DO need a reboot to fully apply the settings to the whole system.

For those of you doing Azure-based deployments who want to take advantage of the above during deployment using a CustomScriptExtension, the resource I use in my JSON Azure Resource Manager template looks similar to the following. I actually have all of my inputs parameterised, but you’ll want to see the information I’m actually passing in, so I’ve manually edited the resource below, substituting in the values that would be passed from the parameters section (which isn’t shown below!). There is also the Set-AzureRmCustomScriptExtension cmdlet if you’re building with Azure PowerShell.
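
A sketch of such a resource – the VM name, storage account, file names and placeholder key here are all illustrative substitutions, not my real values:

```json
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "MyVM/InitialiseVM",
  "apiVersion": "2015-06-15",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Compute/virtualMachines', 'MyVM')]"
  ],
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.8",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "fileUris": [
        "https://mystorageaccount.blob.core.windows.net/scripts/Initialise-VM.ps1",
        "https://mystorageaccount.blob.core.windows.net/scripts/UKRegion.xml"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "powershell.exe -ExecutionPolicy Unrestricted -File Initialise-VM.ps1",
      "storageAccountName": "mystorageaccount",
      "storageAccountKey": "<storage-account-key>"
    }
  }
}
```

The protectedSettings section keeps the storage account key (needed because the blobs are in private storage) out of the readable deployment output.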

“Why not use DSC!” I hear you scream. Absolutely, yes, you can definitely do that, but I can tell you for free that you will reach a fully configured solution that you can log on to and start using far quicker with this method than with DSC, which might have to install WMF, reboot, compile the MOF, configure the LCM and then apply the config. It’s all about the right tool at the right time, and for simple[?!] language settings and formatting disks, a CSE is the way to go as far as I’m concerned.


Clean up orphaned Azure resource manager disks

Firstly, let me apologise: this post isn’t about automatically cleaning up disks in Azure Resource Manager storage accounts; instead, it’s about obtaining the information you can use to understand which disks are orphaned and can or should be deleted.

When deleting a virtual machine on its own, I’m sure you’re aware that its disks are not automatically deleted as part of the process. The recommended best practice is to put every resource that shares the same lifecycle into the same resource group, which can then be deleted as a single entity, removing virtual machines, storage accounts, NICs, load balancers etc. in one hit.

There are occasions where it’s preferable to delete a single virtual machine on its own. Doing this leaves orphaned disks behind in your storage accounts, gathering dust (and costing money!) unless you manually delete them yourself. We all know that as we stand up and delete VMs in Azure, there are times when cleaning up just isn’t convenient, and the disks get left and forgotten about. Many moons later we decide it’s a good time to clean up those orphaned disks, but finding them manually or using Storage Explorer is a painful process when there are potentially hundreds of storage accounts in a subscription.

Now that Managed Disks have gone Generally Available, this problem will occur less and less for people using them, but for now, most people would benefit from understanding their VHD/storage account layout and usage.

I wrote the following PowerShell script to iterate over all (or some of) the storage accounts in a subscription, pulling out the details of any blob in any container ending with .vhd. Rather than trying to identify unleased blobs and automatically delete them (which is very dangerous when you have Azure Site Recovery protected objects), I chose to simply generate an object containing the information for every blob ending with .vhd. Those of us happy using PowerShell for filtering can then filter the object as we see fit, or just output to CSV and filter in Excel, for example.

You can target this at all storage accounts, some (via a regex match or like condition) or a specific account by adjusting the Get-AzureRmStorageAccount parameters, which act as the primary filtering mechanism. This script is non-destructive: it DOES NOT delete anything.
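
The script is along these lines – a sketch rather than my exact code, with the output file name and object properties as illustrative choices:

```powershell
# Sketch: inventory every .vhd blob across the subscription's storage
# accounts. Adjust Get-AzureRmStorageAccount's parameters to filter accounts.
$StorageAccounts = Get-AzureRmStorageAccount
$Results = foreach ($StorageAccount in $StorageAccounts) {
    $Containers = Get-AzureStorageContainer -Context $StorageAccount.Context
    foreach ($Container in $Containers) {
        Get-AzureStorageBlob -Container $Container.Name -Context $StorageAccount.Context |
            Where-Object { $_.Name -like '*.vhd' } |
            ForEach-Object {
                [PSCustomObject]@{
                    StorageAccount = $StorageAccount.StorageAccountName
                    Container      = $Container.Name
                    VhdName        = $_.Name
                    LeaseStatus    = $_.ICloudBlob.Properties.LeaseStatus
                    SizeGB         = [math]::Round($_.Length / 1GB, 2)
                    LastModified   = $_.LastModified
                }
            }
    }
}
$Results | Export-Csv -Path .\VhdInventory.csv -NoTypeInformation
```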

In my own tests, operating against ~90 storage accounts in a single subscription and retrieving data on ~600 VHD objects, the information was populated into the object and a CSV created in ~50 seconds. You can of course include additional information in the object by adjusting the object properties accordingly, but I would highly recommend that you do not use any cmdlets inside the foreach loop; instead, get the source information outside the loop and then filter for it from the object inside the foreach, as I do with $StorageAccount.

You can use the CSV to identify objects that are surplus to requirements, badly named, in the wrong account/type and act on the information as you desire, but don’t be tempted to auto-delete anything from its output unless you’re supremely confident you know what you’re doing and especially not if you use Azure Site Recovery to protect on-premises machines…doing so is a Bad Idea™

Before I disappear, and as a small bonus for reading this far: in my quest to gather this information elegantly, and before I properly investigated the ICloudBlob object’s properties, I also put together the following function which, given an HTTPS endpoint and the storage account key, uses the Azure Storage API to get the properties of a blob as response headers from an HTTP request. If nothing else, it shows how to interact with the Storage API via web requests in PowerShell and construct an authenticated request.
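
The function was along these lines – a sketch, with the Shared Key string-to-sign as I’d expect it for API version 2015-04-05, so treat the signing details as an assumption to verify rather than gospel:

```powershell
function Get-BlobProperties {
    param (
        [string]$Url,                # e.g. https://account.blob.core.windows.net/container/disk.vhd
        [string]$StorageAccountKey
    )
    $Uri = [uri]$Url
    $AccountName = $Uri.Host.Split('.')[0]
    $Date = (Get-Date).ToUniversalTime().ToString('R')

    # Shared Key signature: VERB, twelve (empty) standard header fields,
    # then the canonicalised x-ms-* headers and the canonicalised resource.
    $StringToSign = "HEAD`n" + ("`n" * 12) +
        "x-ms-date:$Date`nx-ms-version:2015-04-05`n" +
        "/$AccountName$($Uri.AbsolutePath)"

    $Hmac = New-Object System.Security.Cryptography.HMACSHA256
    $Hmac.Key = [Convert]::FromBase64String($StorageAccountKey)
    $Signature = [Convert]::ToBase64String(
        $Hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToSign)))

    $Headers = @{
        'x-ms-date'     = $Date
        'x-ms-version'  = '2015-04-05'
        'Authorization' = "SharedKey ${AccountName}:$Signature"
    }
    # A HEAD request returns the blob's properties as response headers.
    (Invoke-WebRequest -Uri $Url -Method Head -Headers $Headers -UseBasicParsing).Headers
}
```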

As a quick update, the above function can also easily become a replacement for Get-AzureStorageBlobContent by changing the method to GET and updating the return object. Why would you do this? Well, in my scenario I might be using Azure Automation and don’t want to download a file just to open it and read its contents. Get-AzureStorageBlobContent insists on downloading the file, and I don’t want it to do that; the function below uses the Storage REST API to get a file’s contents as an object. That’s obviously not something you’d do with a VHD, but a CSV or JSON block blob, for example, isn’t large, is parseable and sits in an object nicely.
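
A sketch of that GET variant – identical signing to the HEAD version, with the verb and return value changed (again an assumption of the original’s shape):

```powershell
function Get-BlobContent {
    param ([string]$Url, [string]$StorageAccountKey)
    $Uri = [uri]$Url
    $AccountName = $Uri.Host.Split('.')[0]
    $Date = (Get-Date).ToUniversalTime().ToString('R')
    $StringToSign = "GET`n" + ("`n" * 12) +
        "x-ms-date:$Date`nx-ms-version:2015-04-05`n" +
        "/$AccountName$($Uri.AbsolutePath)"
    $Hmac = New-Object System.Security.Cryptography.HMACSHA256
    $Hmac.Key = [Convert]::FromBase64String($StorageAccountKey)
    $Signature = [Convert]::ToBase64String(
        $Hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($StringToSign)))
    $Headers = @{
        'x-ms-date'     = $Date
        'x-ms-version'  = '2015-04-05'
        'Authorization' = "SharedKey ${AccountName}:$Signature"
    }
    # GET returns the blob body itself - fine for a small CSV/JSON block blob.
    (Invoke-WebRequest -Uri $Url -Method Get -Headers $Headers -UseBasicParsing).Content
}
```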


Censoring the UK – no porn please, we’re British #DEBill

Won’t somebody think of the children!

So they’ve finally done it. The UK Government has announced that they’re planning on censoring access to tens of thousands of websites featuring “unconventional” pornography and erotica, all in the name of saving the children. The implication here is that anything unconventional is considered illegal and must be blocked. Being gay was unconventional in the 40s and 50s, so people who were gay were persecuted and prosecuted. The man who helped end World War 2 – Alan Turing – was gay, and he eventually took his own life after being convicted of the “crime” of gross indecency based on his relationship with a young man.

Even as consenting adults – ADULTS! – the Government believe it is somehow their right to tell us what is conventional and therefore acceptable. It is delivered under the guise of protecting children when in reality, its purpose is to define what they consider to be acceptable/conventional by censoring. If it were really the Government’s stance to protect the children, they should be educating, not censoring. I don’t mean educating children, I mean educating adults. It is no longer acceptable that in the Internet Age there are adults out there with children who have no grasp whatsoever of how to protect THEIR OWN CHILDREN online yet provide them with iPads for Christmas.

Investigatory Powers Act 2016

A few days ago I was astonished as the IPBill made its way through parliament with barely a whimper from UK citizens. Seemingly, there are not enough people in the UK who care about their online privacy or who mind that the likes of the Food Standards Agency and their local Fire Service can now browse their Internet history. After all, if they’ve got nothing to hide, they’ve got nothing to fear, right? For now that may be true, but do you implicitly trust every person in the Civil Service who will now have access to your Internet history? If you have ever cleared your browser history – and I mean EVER – then you should have stood up to be counted as someone who opposed the #IPBill. That it has passed and is now all but law means you’ve lost your chance.

Consider a future where the Investigatory Powers Act 2016 has been in use for years:

You have a son who you know is gay though it is a closely guarded family secret because he doesn’t want it to be common knowledge. He has used the Internet in your home since he was a teenager when he came out to you. He has expressed an interest in joining the Fire service. He’s physically fit and the local Fire service are in desperate need of new recruits but his application is turned down. On pressing for a reason he’s told that he is incompatible with the Fire service and they won’t be progressing his application. You take up the challenge on your son’s behalf and speak with the Fire service to ask for an explanation. You are told that unless you want your son’s sexuality to be discussed in public, you should withdraw your challenge. How could you ever approach this without exposing your son’s sexuality – something that is very private to him?

The point I’m trying to make here is that while you think your Internet history is boring, the Government don’t think it’s boring – otherwise they wouldn’t want access to it. You have absolutely no idea how that information will be stored or accessed, by whom, when, or for what reason. This is not the only example I could give – consider an Ashley Madison style hack and data leak of billions of internet connection records. In the days after such a leak, the number of people committing suicide and filing for divorce would skyrocket, and the impact would be devastating. You may think that what I’m saying is all very “tin-foil-hat” like, but can you honestly say that this could or would not happen? If you think you can, I’m telling you you’re wrong, very, very wrong.

I work in IT (having worked in both public and private sectors) and I have zero confidence in any private company or public entity storing personal information regarding my personal browsing habits. I know that one day, that information will be leaked – so I must take steps to protect myself from Government monitoring.

Digital Economy Bill

As for porn blocking, the Government are entirely reliant on the public’s apathy and embarrassment about taking issue with their proposals, and they know nobody cares enough to stand up and be counted for fear of being labelled a pervert, or worse.

When I found out about the Government’s proposal to ban unconventional pornography, I mentioned it to my fiancée. Her reaction was absolutely priceless and is exactly what the Government are relying on to prevent much scrutiny, in both parliament and public. Scrunching up her face, seemingly disgusted that I should be concerned about something like this, she said: “Why do you care? Are you in to unconventional porn?” – the simple fact that I had raised it as a talking point was enough for her to form an opinion of me, despite knowing me for nine years and accepting my proposal.

We are all private people, and we don’t want anybody other than our sexual partners to know what our sexual preferences are. This attempt (and I sincerely hope it is only an attempt!) by the Government is to root out or silence people with “unconventional” sexual preferences. If they succeed in introducing censoring of “unconventional” pornography, they will only succeed in forcing the individuals with “unconventional” sexual interests underground, making them and the pornographers criminals in the same way that Alan Turing was considered a criminal, causing him to take his own life. Alan Turing is a man now celebrated in film and revered in the intelligence services – the same services that are now gathering as much personal information as possible on every person in the UK. Not only have they put in place the means by which they can identify people accessing “unconventional” pornography, they are making it illegal so as to enable them to prosecute for it.

I am so bitterly disappointed in the Government, and so bitterly disappointed in the public, that I have essentially lost all hope for Britain. The Government’s proposal to introduce age verification was doomed to fail from the moment it was suggested. They knew that implementing age verification would fail for many sites that don’t care about UK law, so the Government had a reason to suggest blocking non-compliant sites. In order to block any site that doesn’t introduce age verification they will need a whitelist, which absolutely WILL affect other sites not related to pornography. To have a whitelist they must classify pornography and decide what is acceptable and what is unconventional. Make no mistake, this is the Government telling you what is normal. Remember, the type of content they want to censor is not illegal, but they’re proposing to make it so, and any site that doesn’t introduce age verification for UK users will also be blocked – whether their content is “unconventional” or not.

Surveillance controls and absolute surveillance controls absolutely.

We are at the slippery precipice of a very, very dangerous fall, and if this amendment to the Digital Economy Bill passes through unscathed, blocking of “unconventional” porn is only the start. Do not sit back and tell yourself it doesn’t matter to you, or that your porn interests are conventional. It may not affect you, but it does affect hundreds of thousands, if not millions, of other law-abiding, normal people.

Stand up, be counted and oppose censorship in the Digital Economy Bill by writing to your MP and telling them to oppose the amendment. Censoring helps nobody.

PowerShell for Seasons

I wrote this little snippet to help me work out the current season (as in Spring, Summer, Autumn and Winter) with an accuracy of +/- 2 days. No, it’s not a method of calculating the exact time and date that the equinoxes and solstices will occur in a certain hemisphere, but it’s good enough. I should mention that this is for the Northern Hemisphere, so if you want one for the Southern Hemisphere, it should be pretty simple to work out what to change! The function will tell you the current season (if you don’t provide a date) or the season of a date you give it.

I looked at Wikipedia for the next dates of the equinoxes and solstices, and they all fall on or around the dates below with an accuracy of +/- 2 days. If you want hyper-accuracy, don’t use this.
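
A sketch of such a function – the boundary dates used (20 March, 21 June, 22 September, 21 December) are the approximate ones:

```powershell
function Get-Season {
    param ([datetime]$Date = (Get-Date))
    # Approximate Northern Hemisphere boundaries, accurate to +/- 2 days.
    $Year = $Date.Year
    if     ($Date -lt [datetime]::new($Year, 3, 20))  { 'Winter' }
    elseif ($Date -lt [datetime]::new($Year, 6, 21))  { 'Spring' }
    elseif ($Date -lt [datetime]::new($Year, 9, 22))  { 'Summer' }
    elseif ($Date -lt [datetime]::new($Year, 12, 21)) { 'Autumn' }
    else                                              { 'Winter' }
}

Get-Season                                      # current season
Get-Season -Date ([datetime]::new(2017, 7, 1))  # Summer
```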


Using Azure DNS for dynamic DNS

As a long-time user of DynDNS and a generally happy customer, I’ve paid for their dynamic DNS services to publish my regularly changing home IP address, keeping my website up and running using a rather badly named DynDNS zone, with CNAME records on my actual domain resolving to it.

I’ve avoided taking up their rather expensive Managed DNS service at $7 per month to provide a very simple service, but a recent job I’ve been working on has had me looking at all of Microsoft’s Azure services in detail. Despite it being in preview, I decided to take a closer look at Microsoft’s Azure DNS to see what it can do.

In essence, Microsoft Azure DNS allows you to host your domain zone and records in Azure, but not to purchase a domain. To use Azure DNS, you must use a domain registrar to purchase a domain (or have one already) and then delegate the zone to Azure DNS by changing the zone’s nameservers to Microsoft’s. This is usually very straightforward but differs between registrars. My domain registrar is 123-reg, so their nameserver update page looks as follows. Ignore the nameserver settings here; yours are likely different.


I’m not going to go through the process of signing up for Azure etc. There’s plenty of documentation out there covering the process in graphic detail.

As you know from the title of this post, the reason for doing this is to implement a dynamic DNS solution for your own domain using Azure DNS. For static IP addresses this is completely unnecessary, but in my circumstances – hosting my personal website from my own home network with a frequently changing IP address – it’s a necessity. Compared to DynDNS’ managed DNS services at $7 per month for their lowest bracket ($84 per year!), Azure DNS is an absolute STEAL, charging only 15 pence per zone, per month, for the first 25 zones. This pricing represents a 50% discount on production prices while the service is in preview, but even then, if I had 5 zones hosted in Azure, I’d spend less than 25% on DNS costs compared to DynDNS.

Just to reiterate, Azure DNS is in preview for now. For my purposes (personal use for my own blog) that’s fine. If people can’t reach my site for a short while, it isn’t the end of the world – that said, I don’t expect that to happen.

So, back to it. We’ve got a domain and we’ve delegated it to Azure DNS by changing the nameservers at our registrar’s site. So how do we use this for dynamic DNS? Azure DNS exposes three methods of configuration; unfortunately, a REST API isn’t one of them, but PowerShell is. A short while ago I created a PowerShell script that could update DynDNS records using pure PowerShell, so this can be used as a template for integrating with Azure DNS. The trade-off is the need to run a PowerShell script from inside your network. I’ve no doubt there’s a way to integrate this with Azure Automation using webhooks, and it may be a little extension project I use to expand on this article in the future, but for now, this is a script that must execute from inside the network with the IP address you will use to update the Azure DNS records.

The process itself is very simple: get the current IP address of your Internet connection, compare it with what’s set on the A record in the Azure DNS zone, and if it doesn’t match, update the record in the zone. Easy really.

Here’s a script that’ll get that done.
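
The following is a sketch of that script – the domain, resource group and IP lookup are illustrative (I actually ask my DD-WRT router for the WAN address, and a public echo service stands in for that here):

```powershell
# Sketch: dynamic DNS against Azure DNS. Assumes the AzureRM modules are
# installed and you have already run Login-AzureRmAccount.
$ZoneName      = 'example.com'        # illustrative
$ResourceGroup = 'DnsResourceGroup'   # illustrative
$RecordName    = '@'                  # the apex/root record

# 1. Get the Internet connection's current IP address.
$CurrentIp = (Invoke-RestMethod -Uri 'https://api.ipify.org?format=json').ip

# 2. Compare it with the A record currently set in the zone.
$RecordSet = Get-AzureRmDnsRecordSet -Name $RecordName -RecordType A `
    -ZoneName $ZoneName -ResourceGroupName $ResourceGroup
$ZoneIp = $RecordSet.Records[0].Ipv4Address

# 3. Update only if they differ, enforcing a 60 second TTL.
if ($CurrentIp -ne $ZoneIp) {
    $RecordSet.Records[0].Ipv4Address = $CurrentIp
    $RecordSet.Ttl = 60
    Set-AzureRmDnsRecordSet -RecordSet $RecordSet -Overwrite
}
```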

Initially it obtains the Internet connection’s IP address from the router. You could just as easily do this by obtaining the IP from something like my own tool – as it is, asking my own DD-WRT router is less expensive.

I then compare the current IP address with the value currently set on the record I’m interested in – in this case, just the apex/root record, not www.

If it isn’t the same, I bounce on to Azure DNS and forcefully update the record and enforce a 60 second TTL.

Using this method I can avoid paying for DynDNS services at all and rather than using CNAMEs for my domain names, I can use actual IP addresses.

– Lewis

Hive Active Heating PowerShell Control with PoSHive

Last week I announced PoSHue, my PowerShell 5 class for controlling and scripting Philips Hue lights – this week sees another announcement along the same lines.

I recently bought a Hive Active Heating system to remotely control my home’s heating and thought it would be pretty cool to be able to access that same level of control (and automation) using PowerShell.


PoSHive is the result. It’s another GitHub project meaning anyone can get a copy, fork it, branch and contribute.

UPDATE: It’s now also available from the PowerShell Gallery.


Install by simply using:
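
From a PowerShell 5 prompt:

```powershell
Install-Module -Name PoSHive
```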


When you want to ensure you have the latest version:
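
```powershell
Update-Module -Name PoSHive
```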

Why would I want access to Hive using PowerShell? The purpose of this class is to enable you to use PowerShell (v5) scripting to exert more powerful logic control over the state of your heating system. In its basic form, it allows you to set the heating mode and temperature of the system, including the Boost option and Holiday mode, and even to advance the system to the next event. PoSHive offers most (if not all) of the features exposed by the app and website, but using only PowerShell.

While thinking about use cases, I realised a different benefit of PoSHive: it opens the Hive ecosystem (heating only) to people with disabilities that make it hard or even impossible to use an app on a phone or the website, whether due to impaired sight or the fine motor control needed to use a mouse or point with a finger. PoSHive offers those users the chance to simply type commands and have the heating system respond to them. No fiddly apps or websites to contend with.

Here’s a basic example showing how to get the current temperature recorded by the thermostat. The first 4 lines of this script can even be included in your PowerShell profile, so all you need to type is $Hive.Login() and you’re logged on.
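
Something along these lines – the credentials are obviously placeholders, and the method names are as I use them in PoSHive, so check the GitHub readme for the exact signatures:

```powershell
Import-Module PoSHive
$Username = 'user@example.com'   # your Hive website credentials (placeholder)
$Password = 'MyHivePassword'
$Hive = [Hive]::new($Username, $Password)
$Hive.Login()
$Hive.GetTemperature($true)      # current thermostat temperature
```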



Please check out the project on GitHub and contribute if you can or just let me know how you use PoSHive in your script through the comments section below.

The project is not sanctioned by or affiliated with British Gas in any way and is based on the API data formats and responses I’ve observed on my own Hive Active Heating system. The class is designed to work only with the heating-only Hive system (I don’t have the hot water system, unfortunately) and is also unlikely to work with multi-zone Hive systems.


Philips Hue PowerShell

I’ve been quietly working on a little project (or two) of my own on GitHub since I got some Philips Hue lights a while back.

Philips makes accessing the bulbs programmatically very easy with the API that exists on the Bridge device but I wanted a scriptable solution to allow me to exert much more fine grained logic control over the states and colours of my lights.

Being pretty advanced with PowerShell (at least, I think I am), I set about writing a PowerShell interface (not a GUI) to allow me to access the properties and set the state of my Hue lights.

The result is a PowerShell 5 class that simplifies interaction with Philips Hue bulbs and lights, which I’ve dropped on to GitHub for use by any and all. I realise this is focussed purely on Windows users, but that’s what I am, and I use PowerShell extensively for other things too.

The project is called PoSHue and is located on GitHub.

It allows you to do things like this from PowerShell.
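
For illustration – the light name, Bridge IP and API key here are placeholders, and the method signatures are as I recall them from the readme, so verify against the project:

```powershell
Import-Module PoSHue
$Endpoint = '192.168.1.12'                    # your Hue Bridge IP (placeholder)
$UserID = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx' # API key from the Bridge (placeholder)
$Light = [HueLight]::new('Hall Lamp', $Endpoint, $UserID)
$Light.SwitchHueLight()        # toggle the light on/off
$Light.SetHueLight(150, 380)   # brightness and colour temperature (assumed signature)
```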

Feel free to have a look and see how you can use it. Just 4 lines and you’re off and running.


One example is something I’m using the classes for currently, and it’s logically quite complicated. The script executes on a schedule; that schedule is set by the previous execution and is obtained from an API call to a service providing sunset times. The script turns the lights on just before sunset, but only if me and/or my fiancée are home.

I then have a second script, executed by the “turn lights on if it’s sunset and people are home” script, which monitors whether we go out. If we do, the lights are turned off by this script and, so long as it’s before 23:00, the “turn lights on only if we’re home” script is executed again to wait for us to come home.

Basically, the scripts work in conjunction, cyclically, to ensure the lights don’t turn on before sunset and only turn on when we’re home; they also turn the lights off if we go out, but would turn them on again if we came home before 23:00.
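
In outline, the “lights on at sunset if home” logic looks something like this hypothetical sketch – the sunset API, the ping-the-phones presence check and all the names are assumptions for illustration, not my actual scripts:

```powershell
Import-Module PoSHue
$UserID = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'  # Hue Bridge API key (placeholder)

# Get tonight's sunset time from a public API (coordinates are illustrative).
$Response = Invoke-RestMethod -Uri 'https://api.sunrise-sunset.org/json?lat=51.5&lng=-0.1&formatted=0'
$Sunset = ([datetime]$Response.results.sunset).ToLocalTime()

# Presence check: is either of our phones responding on the LAN?
$SomeoneHome = (Test-Connection -ComputerName 'my-phone', 'her-phone' -Count 1 -Quiet) -contains $true

# Turn the light on just before sunset, but only if someone is home.
if ((Get-Date) -ge $Sunset.AddMinutes(-10) -and $SomeoneHome) {
    $Light = [HueLight]::new('Hall Lamp', '192.168.1.12', $UserID)
    $Light.SwitchHueLight('On')
}
```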

Let me know if you’d be interested in seeing the scripts and tasks (yes they’re scheduled tasks that monitor for return events from the scripts!) and I’ll see what I can do about packaging them up somewhere.