Continuing the topic of automation, PowerShell, and Azure DevOps, in this blog post I would like to share some ideas on implementing continuous deployment for Azure Automation accounts. Despite some slowdown in service development, Azure Automation is still widely used in many environments to reduce toil and make engineers’ lives a bit easier. It is a common tool for hybrid scenarios where parts of your process automation live on-premises or are hosted by another cloud provider.

Before I dig into the technical stuff, let me explain which Azure Automation pitfalls pushed me to write this article.

If you don’t want to listen to my grumbling, just skip the next section.

Side note

Although the idea of a service acting as a central hub for task automation is great, managing it is often not an enjoyable experience.

That can partially be explained by the fragmented nature of its components: Configuration management, Update management, and Process automation are not interconnected and are configured separately. The Configuration and Update parts are mostly about regular system administration of Azure VMs and on-premises servers. In contrast, Process automation can be applied to a much larger variety of tasks, either in the cloud or in the local network.

Another caveat is usually more about service design than about technical features. In many cases, engineers think in terms of centralized IT and deploy a single Automation account per subscription. Although it might seem easier to manage a single resource, over time it becomes a messy collection of scripts, configurations, and other assets that are either used for entirely unrelated purposes or connect resources from multiple services/applications, creating unwanted external dependencies.

To make things worse, Automation account assets such as DSC configurations, runbooks, modules, variables, and credentials are often created and configured manually, without proper versioning, testing, or dependency management. As I wrote in my post about DevOps in PowerShell automation, it is a paradox that, in many cases, the tools that should help implement DevOps practices are used in contradiction to them.

Technical challenge

Azure Automation components such as Configuration management and Update management are focused on specific, narrow tasks and generally don’t cause much trouble. Process automation, in contrast, is much more diverse in its application.

First of all, Azure Automation assets such as runbooks, PowerShell modules, variables, credentials, and schedules are managed in a somewhat inconsistent way. For example, there is a well-known and long-standing difficulty with module deployment to Azure Automation: it requires the module zip file to be accessible via a public URL. Secondly, if you employ Hybrid Workers for your automation scenarios, you have to manage and update the required PowerShell modules by your own means. Thirdly, to maintain module version consistency, you need to organize an update process for both the modules imported into an Automation account and the ones deployed onto the workers. All of that creates additional roadblocks when you try to work with Azure Automation according to DevOps practices such as versioning, automated testing, automated deployment, and configuration as code.
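For instance, a module import with the Az.Automation cmdlets boils down to a call like the sketch below; the resource, module, and version names are placeholders, and the PowerShell Gallery package endpoint is a common way to satisfy the public-URL requirement:

# A minimal sketch, assuming an authenticated Az session and existing
# resources; resource, module, and version names are placeholders.
$moduleName = 'SampleModule'
$moduleVersion = '1.2.0'

# The Gallery package endpoint serves the module package from a public
# URL, which is what the Automation import API expects.
New-AzAutomationModule -ResourceGroupName 'rg-automation' `
    -AutomationAccountName 'aa-demo' `
    -Name $moduleName `
    -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$moduleName/$moduleVersion"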

I have already written a few posts about using DevOps practices when working with PowerShell: creating a CI/CD pipeline for a custom PowerShell module and consuming it from Azure Artifacts. The next logical step was to extend the same approach to deploying and managing PowerShell-based assets in Azure Automation – modules, runbooks, and DSC configurations.

Tools in hand

Azure Automation has a rich API that can be accessed, for example, with the corresponding PowerShell cmdlets. Technically, that gives you plenty of options to create custom deployment scripts and invoke them in a deployment pipeline. However, the issue with such custom scripts is that you will have to copy and maintain them in multiple projects, provided that you don’t fall into the fallacy of using one Azure Automation account across your whole infrastructure.
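For example, a bare-bones runbook deployment step might look like the following sketch (an authenticated Az session is assumed, and the resource names are placeholders); it is exactly this kind of script that tends to get copy-pasted between projects:

# A hand-rolled deployment step; resource names are placeholders.
Import-AzAutomationRunbook -ResourceGroupName 'rg-automation' `
    -AutomationAccountName 'aa-demo' `
    -Name 'Invoke-SampleTask' `
    -Path '.\runbooks\Invoke-SampleTask.ps1' `
    -Type PowerShell `
    -Published `
    -Force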

Instead of implementing the deployment in a ‘quick and dirty’ way, I decided to look for something more reusable and easier to maintain. As I already had some experience with PSDepend for dependency management, I turned my attention to PSDeploy – a tool that simplifies PowerShell-based deployments with a declarative approach to defining deployment configurations. It is quite well documented and, more importantly, extensible.

It took me some time to figure out how the whole deployment process works in PSDeploy and to create the corresponding deployment scripts for Azure Automation runbooks, modules, and DSC configurations. The script logic for runbooks and DSC configurations is pretty simple and hardly needs an explanation. For modules, however, I had to implement deployment from three different sources – a local source, the public PowerShell Gallery, and a private feed – to make the solution more versatile. Initially, I even thought about automatically creating a Storage account to upload module files for handling local and private modules, but later decided to abandon that functionality, as it might create unwanted changes outside of the deployment target scope.

So, now I can define a deployment configuration like the following in source code. The original definition is project-specific, so treat this as an illustrative sketch: the ‘AzureAutomationRunbook’ and ‘AzureAutomationModule’ deployment type names and all resource names are assumptions, while the DSL structure is standard PSDeploy:
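# deploy.psdeploy.ps1 – an illustrative sketch; 'AzureAutomationRunbook'
# and 'AzureAutomationModule' stand in for the custom deployment types,
# and all resource names are placeholders.
Deploy SampleRunbook {
    By AzureAutomationRunbook {
        FromSource 'runbooks\Invoke-SampleTask.ps1'
        To 'aa-demo'
        WithOptions @{
            ResourceGroupName = 'rg-automation'
            RunbookType       = 'PowerShell'
        }
    }
}

Deploy SampleModule {
    By AzureAutomationModule {
        FromSource 'SampleModule'
        To 'aa-demo'
        WithOptions @{
            ResourceGroupName = 'rg-automation'
            ModuleVersion     = '1.2.0'
            Source            = 'PSGallery'
        }
    }
}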

And initiate the deployment with a single invocation of PSDeploy in a pipeline:

> Invoke-PSDeploy
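By default, Invoke-PSDeploy discovers the *.psdeploy.ps1 files under the current path and processes every deployment defined in them, so this pipeline step stays the same no matter how many assets the project accumulates.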

Sample project

To illustrate the concept, I created a sample project that can be used as a starter template when building your deployment pipeline for an Azure Automation account. The project contains a runbook and a DSC configuration to be deployed to an account, along with references to a few PowerShell modules. Also, for demo purposes, I introduced a dependency between the runbook and the modules. I didn’t put the source code for the modules into the same repository, as I prefer to treat modules as context-independent tools.

You can find more reasoning on that approach towards PowerShell modules in the great book “Learn PowerShell Scripting in a Month of Lunches” by Don Jones and Jeffery Hicks.

The DSC configuration deploys the same modules to the assigned nodes, which should give you a good understanding of how to keep the list of PowerShell modules installed on a Hybrid Worker in sync with the list and versions of the modules imported into an Automation account.
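To picture the mechanics, here is a minimal sketch of such a configuration that relies only on the built-in Script resource; the sample project may structure it differently, and the module name and version are placeholders that should mirror what the Automation account imports:

# A minimal sketch using the built-in Script resource; the module name
# and version are placeholders.
Configuration HybridWorkerModules {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        Script SampleModule {
            # Compliant only if the exact version is already installed
            TestScript = {
                $null -ne (Get-Module -ListAvailable -Name 'SampleModule' |
                    Where-Object Version -eq '1.2.0')
            }
            SetScript  = {
                Install-Module -Name 'SampleModule' -RequiredVersion '1.2.0' -Force
            }
            GetScript  = {
                @{ Result = (Get-Module -ListAvailable -Name 'SampleModule').Version -join ', ' }
            }
        }
    }
}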

Of course, nothing prevents you from going 100% Infrastructure as Code (IaC) for all the pieces. However, I haven’t created deployment scripts for Azure Automation assets such as schedules, variables, credentials, connections, and certificates, because they are usually environment-specific. Additionally, depending on your use case, it might be more reasonable to manage your infrastructure with one unified approach, such as ARM or Terraform templates, while treating PowerShell artifacts as parts of your application to be deployed along with it.

For the same reasons, I preferred not to assign DSC configurations to specific nodes during the deployment via PSDeploy. The deployment script will only create/update the configuration and compile it to be ready for use. If the configuration already exists and is assigned to nodes, the new version will be applied to those nodes automatically.
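In Az.Automation terms, that create/update-and-compile step boils down to something like this sketch (the resource and configuration names are placeholders):

# Create or update the configuration, then compile it so the new
# version is ready for the assigned nodes; names are placeholders.
Import-AzAutomationDscConfiguration -ResourceGroupName 'rg-automation' `
    -AutomationAccountName 'aa-demo' `
    -SourcePath '.\dsc\HybridWorkerModules.ps1' `
    -Published `
    -Force

Start-AzAutomationDscCompilationJob -ResourceGroupName 'rg-automation' `
    -AutomationAccountName 'aa-demo' `
    -ConfigurationName 'HybridWorkerModules'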

To conclude on this sample project: it should be treated as a tradeoff between manual and fully automated deployment rather than as a best practice. For example, in one of my work cases, I applied this approach in an environment where all existing automation scenarios were based on a single centralized account, and it was cost-ineffective to decouple and refactor them in one clean sweep.

To be continued

As I wrote at the beginning of this post, the Process automation part of Azure Automation has many different applications, and therefore there is no single right approach to designing it. For example, in simple cases, when I just want to validate a concept, I’m totally fine with deploying stuff through the Azure portal. Whereas for long-lasting, production-ready solutions, I prefer to spend an additional chunk of time on defining their configuration in code and validating its deployment.

Regarding automation in Azure as a broader topic, I’m currently looking into implementing new automation tasks with the help of Azure Functions. With their support for PowerShell Core and Hybrid Connections, Azure Functions can be a great alternative to traditional implementations of automation scenarios in Azure Automation. Plus, they make the management of PowerShell modules way easier. So, stay tuned and subscribe to my blog to get updates on that topic 👇