
Challenge 036 | Power Automate X Azure Functions

Recently, I needed to add some PowerShell to my automations. I started with Azure DevOps pipelines, as that is what I had been using before, but I needed something not specifically related to deployments, so it didn't feel right. I moved towards Azure Functions, which I had always wanted to look into but never did, until now. And guess what? It's easier than you might think! At least, I think so. So I want to share what I've learned.

Challenge Objectives

🎯 Learn more about cloud services

🎯 Extend your automation with PowerShell

🎯 Improve your Visual Studio Code skills

Introduction

If you have been working with Microsoft services, you probably have used PowerShell before, or have heard about it. It's a scripting language for managing servers, networks, and cloud environments.

There are multiple modules available, similar to Python packages if you are familiar with those. These modules contain sets of commands (cmdlets) that bring specific functionality. There are modules for Azure (Az), Exchange (ExchangeOnlineManagement), SharePoint (PnP.PowerShell), etc. The PowerShell Gallery neatly lists all the available modules and the cmdlets they contain.
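
To get a feel for how this works, here is a minimal sketch of installing a module from the PowerShell Gallery and listing the cmdlets it provides (using PnP.PowerShell as the example):

# Install a module from the PowerShell Gallery for the current user
Install-Module -Name PnP.PowerShell -Scope CurrentUser

# List the cmdlets the module provides
Get-Command -Module PnP.PowerShell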

PowerShell modules run on .NET. Modern modules run on .NET Core, which is cross platform (Windows, Linux, macOS) and is supported from PowerShell 6.0 onwards. There are still quite a lot of modules that run on .NET Framework, which is Windows only. To run those, you need PowerShell 5.1 or earlier. You may forget which .NET version goes with which PowerShell version, but what you should remember is that PowerShell is not always backward compatible, and that older (legacy) modules therefore require a different PowerShell version.
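
You can quickly check what you are running yourself; the Core edition runs on .NET Core, the Desktop edition on .NET Framework:

# Show the PowerShell version and edition ("Core" = .NET Core, "Desktop" = .NET Framework)
$PSVersionTable.PSVersion
$PSVersionTable.PSEdition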

A fundamental aspect of PowerShell is that it runs on a machine. If you call an API, which is essentially what you are doing when you use an action in Power Automate, the compute is taken care of by the API provider. You can simply ask for information (a GET call), or tell it to do something (like a POST call). With PowerShell, you need to provide the compute yourself.

A great way to do this manually is through the terminal in Visual Studio Code. You can either type a command in there directly and press Enter, or feed it a PowerShell file (.ps1) and run it, as shown below. That's a great way to quickly adjust things or test some functionality. But when you want to automate these scripts, you need a machine to run them on. This is where the cloud comes in handy. We can use a machine just at the moment we need it, with no need to handle the hardware ourselves. What is even cooler is that Microsoft created a cloud service called Azure Functions that lets us focus on the scripting. Microsoft handles the compute for us when we need it. What a wonderful world.
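
As a minimal example, save the line below as hello.ps1 and run it from the terminal with ./hello.ps1 (your execution policy permitting):

# hello.ps1 - prints a greeting to the terminal
Write-Host "Hello from PowerShell!"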

Creating the Azure Function

Creating an Azure Function isn't as straightforward as creating a Power Automate flow, but I will guide you through the process. I will do my best to keep it as low-code and easy as possible. Just make sure you have an Azure subscription and Visual Studio Code installed with the Azure extension.

  1. Create a folder on your desktop /Projects/Challenge 036

  2. Open the Azure extension in Visual Studio Code and add your account

  3. Press Ctrl+K Ctrl+O and select the folder you just created

  4. Press Ctrl + Shift + P to access the command palette

  5. Search for Azure Functions: Create New Project...

  6. Select the folder you created

  7. Select PowerShell (look at the available options for inspiration)

  8. Select HTTP trigger (look at the available options for inspiration)

  9. Name the trigger GetUserConnections

  10. Select Function for the Authorization level

The extension will now populate some files that are required for your Azure Function. It should look like the image below.

As this part might be overwhelming, I will explain a bit about what it all means.

  • Everything in the .vscode folder is there for working on your Azure Function in Visual Studio Code.

  • The .funcignore file states which files to, yes, ignore when you push these files to your Azure subscription. You can view the file and see that it is a simple one. As the .vscode files are just for your local machine during development, this folder is already listed for you, so these files will not move to Azure.

  • The .gitignore file has the exact same functionality, but states which files should be excluded when working with a Git repo. When you are actively working on such functions, it is recommended to store the source files in a Git repo for version management and collaboration.

  • Then we get to the GetUserConnections folder. This folder contains the files that make up the function with the HTTP trigger that we will be using. You can see the run.ps1 file, which is where you will write your PowerShell script. The function.json file contains information about this particular function; it stores the httpTrigger and authLevel to tell Azure what type of function we are dealing with (a sample is shown after this list).

  • That leaves four more files. A Function App can run multiple functions; GetUserConnections is just one of them, and these files apply to the app as a whole. The requirements.psd1 file states which PowerShell modules should automatically be installed. Think of it as a brand new computer and the list of modules that must be installed for your function to operate. The profile.ps1 file is also there because we are working with cloud computing. As mentioned, some compute needs to happen: when the function is triggered, a cloud machine is provided to you for a short period of time. This file runs at every cold start, which means every time such a machine is provided to you, before any function runs. Think of it as a restart of your machine and the applications that should be opened. Besides installing modules, you also need to import them; you can manage that import centrally here and just focus on the script itself in the run.ps1 file. The host.json file describes the Function App as a whole (like a settings file), and local.settings.json is again just for your local machine.
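
For reference, this is roughly what the generated function.json looks like for a PowerShell HTTP trigger; the exact contents may differ slightly per template version:

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    }
  ]
}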

I hope this clarifies what is actually visible on the screen and gives you the confidence to proceed.

Now type func start in the terminal. If you encounter some errors, read them and install what is needed.

You should get a screen like this. You can see the localhost URL. Copy that to a browser and hit Enter. You should get the response in your browser as shown below.

What is happening now is that your machine is acting as a local server in your network. In your run.ps1 file you can see what is actually going on: the name is extracted from the request query (in a URL, the query parameters start after the ? and are separated by &), and with some logic the body is defined and returned.
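
For reference, the template run.ps1 looks roughly like this (yours may differ slightly depending on the Core Tools version):

using namespace System.Net

# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)

# Read the name from the query string, falling back to the request body
$name = $Request.Query.Name
if (-not $name) {
    $name = $Request.Body.Name
}

# Build a personalized response when a name was provided
$body = "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
if ($name) {
    $body = "Hello, $name. This HTTP triggered function executed successfully."
}

# Return the response to the caller
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = $body
})

Now add the name parameter to the URL as shown below.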

Pretty cool, right? This is now like a local API. Now that it works, you can deploy it to Azure. This is easily done with the Azure extension. Go to the Workspace section, select the Function App icon, select Deploy to Azure..., and follow the instructions.

It might take a minute, but then it is alive in the cloud.

Now open the function and select Get function URL. You will see three options. The endpoints are identical, but the authentication token differs. As you can see, it is provided as a query parameter named code. In the beginning we set the authorization level to Function, which is why there is a third option: the function key. This is like an API key for your function. The other two work for all the functions in your Function App. You should store this key in a Key Vault, but that is out of scope for now; if you want to learn more about that, visit Power Platform Challenge 028. If you want even more security, you might want to put Azure API Management in front of it, but that is also out of scope. I just don't want you to do insecure stuff.

Copy the URL with the function key, add the name parameter (with an &), and paste it in a browser.
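
The resulting URL follows this pattern; the app name and key are placeholders:

https://<your-function-app>.azurewebsites.net/api/GetUserConnections?code=<function-key>&name=Miguel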

Your PowerShell script is now providing a response while running in the cloud. You are doing pretty well already. Do you notice that the first time you enter the URL it takes longer than when you request it again? That is because of the profile.ps1 file kicking in at the cold start. With the Consumption plan, the machine stays idle for a while (between 5 and 20 minutes), and when no requests are received, it wipes itself and becomes available for other tasks. Other plans can keep it running, but that obviously comes at a cost.

WARNING

Now, let's update the function so that it can get something of interest. To be fully honest, I wanted to get the connections of a particular user; you can tell from the naming of the function. The issue with putting this in an Azure Function is that the required module for that cmdlet, Microsoft.PowerApps.Administration.PowerShell, is based on .NET Framework instead of .NET Core. I tried many different things, but after some investigation I found out that Azure Functions only supports .NET Core. That's why I mentioned it at the beginning of this blog and decided not to update the images. Just to be transparent and share the limitations. I still need that functionality for work, so I will be looking into Azure Automation runbooks and managing my own VM. I might share that in a future post.

AND CONTINUE

That doesn't mean that it isn't of value. I was just unlucky to opt for a legacy module. But there are lots of modern modules, and as it is great to learn about Azure Functions, we just have to switch modules. So instead of getting Power Platform connections, we will get all the resource groups within a subscription. This uses the Az.Resources module. I came up with the following PowerShell script.

using namespace System.Net

# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)

# Import the necessary Az modules. These modules should be specified in your requirements.psd1 file.
Import-Module Az.Accounts
Import-Module Az.Resources

# Connect to Azure using the managed identity assigned to the Function App.
Connect-AzAccount -Identity

# Retrieve all resource groups in the subscription.
$resourceGroups = Get-AzResourceGroup

# Prepare the output as JSON so that the Function returns it to the caller.
$response = $resourceGroups | ConvertTo-Json -Depth 4

# Send the response back to the caller
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
  StatusCode = [HttpStatusCode]::OK
  Body = $response
  Headers = @{ "Content-Type" = "application/json" }
})

In the bottom section we bind the Get-AzResourceGroup cmdlet output to a variable, convert it to JSON, and put it in the response body. Thanks to LLMs we don't have to know this by heart, and as you can see, many lines remain identical or at least familiar from the template files the extension created for us.

The upper section is something I think might be of interest. I will start with the Connect-AzAccount cmdlet. This cmdlet connects to your Azure tenant and provides credentials. But instead of a username/password (impossible for automation purposes due to MFA enforcement), or an application ID/client secret (which requires secret management), we only provide an identity. I will show you how you can set it up.

First you need to deploy your function to Azure; you know how that works. In your Function App, you can find the section for Identity. Toggle the system-assigned identity on.

Next, you need to give it a role assignment. As we only want to read all the resource groups, set the Scope to Subscription and the Role to Reader. Save it, and you are done. That's all it takes. How simple is that! This is even considered more secure than the client secret option. Have you also noticed the other options at the scope level? Just amazing.
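
If you prefer the command line, the same setup can be sketched with the Azure CLI; the app, resource group, and subscription below are placeholders:

# Enable the system-assigned managed identity on the Function App (placeholder names)
$principalId = az functionapp identity assign `
    --name "my-function-app" `
    --resource-group "my-resource-group" `
    --query principalId --output tsv

# Grant that identity the Reader role on the subscription (replace <subscription-id>)
az role assignment create `
    --assignee $principalId `
    --role "Reader" `
    --scope "/subscriptions/<subscription-id>"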

So now the authentication is managed. The last thing I want to highlight is the Import-Module section. As you can see, we only import the modules; we don't install them. The comment already tells you where I've put that: in the requirements.psd1 file. I updated this file to the snippet below, and boom, we are done!

# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
#
@{

    # Modules required by this Function App; matching versions are installed automatically.
    'Az.Accounts'  = '2.*'
    'Az.Resources' = '3.*'
}

Updating it should now be a piece of cake. You can test your Function from within Azure. Calling it from Power Automate is now also a piece of cake. We know the URI. We could add the function key to the URI, but it is neater to put it in a header; you can use the x-functions-key header for that. Make sure to use secure inputs, and move the key to a Key Vault.
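
To sanity check the call outside Power Automate, you can sketch the same request in PowerShell; the app name and key are placeholders:

# Call the deployed function with the key in a header instead of the URL
$headers = @{ "x-functions-key" = "<function-key>" }
Invoke-RestMethod -Uri "https://my-function-app.azurewebsites.net/api/GetUserConnections" -Headers $headers -Method Get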

That's it. You've created an Azure Function and triggered it from Power Automate.

Conclusion

Creating Azure Functions is easier than it sounds. The Visual Studio Code extension really simplified the whole experience for me. But the biggest takeaway obviously is the hard split between old and new modules. I've learned it the hard way, and those experiences are helpful.

Additional Information

You could create a simple Canvas App with multiple buttons for quick access to admin stuff. Or map them to your Stream Deck for the ultimate admin cockpit (please share pictures if you do).

Key Takeaways

👉🏻 Use Visual Studio Code Extensions

👉🏻 Managed Identity is easy peasy

👉🏻 ONLY .NET Core
