For quite some time I have had Power Automate Process Mining on the list of things I wanted to look into. I finally came up with a scenario that can benefit many Power Platform developers. This Challenge is exactly that. We will look into the capabilities of Power Automate Process Mining, and I will show you how all of us can benefit from this tool.
Challenge Objectives
🎯 Learn about Power Automate Process Mining
🎯 Understand what is required as an input
🎯 Use it on Dataverse and SharePoint solutions
Introduction
The technology behind Power Automate Process Mining has quite a history. It started in 2013 as the product of a start-up called Minit. They released the first version in 2015, and three years later they received seed investment from a VC. During that time Microsoft started working on its own tools, named Process Insights and Task Mining. In 2022 Microsoft acquired Minit, and Power Automate Process Mining became generally available in April 2023.
So the technology isn't really new, but it is still somewhat new in the Power Platform space. If you aren't familiar with it: it is a tool that you can feed event log files, and it will give you all sorts of insights into the actual process. It shows you process maps of frequency and performance, and can do more complex things like root cause analysis, comparing processes, etc.
The following links are great resources to learn the basics. I recommend following these, as I will not focus on learning the basics in this Challenge.
Although the instructions / labs / demos are great, getting started with your own process might be challenging. How on earth do you get event logs for your process? The documentation has a small section on this, but getting the logs out of the systems is not that straightforward in my opinion. So that is the focus of this challenge.
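For context, the event log that a process mining tool expects is just a flat table in which every row is one event: a case ID, an activity name, and a timestamp. A minimal sketch of such a log (all column names and values here are illustrative, not from any real system):

```python
import csv
import io

# Minimal event log: one row per event, identified by a case ID,
# an activity name, and a timestamp.
events = [
    {"CaseID": "1001", "Activity": "Submitted", "EventStart": "2024-01-02T09:00:00"},
    {"CaseID": "1001", "Activity": "Approved",  "EventStart": "2024-01-03T14:30:00"},
    {"CaseID": "1002", "Activity": "Submitted", "EventStart": "2024-01-02T10:15:00"},
    {"CaseID": "1002", "Activity": "Rejected",  "EventStart": "2024-01-04T08:45:00"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["CaseID", "Activity", "EventStart"])
writer.writeheader()
writer.writerows(events)
event_log_csv = buffer.getvalue()
print(event_log_csv)
```

The rest of this challenge is about getting Dataverse and SharePoint to produce exactly this kind of table.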
Let's start with Dataverse solutions.
Use a Dataverse table as input
Auditing is a standard feature in Dataverse, but there are a few things to know to make it work. There are three types of logs, and three levels at which auditing is configured.
| Types | Levels |
| --- | --- |
| Audit logs | Environment |
| Access logs | Table |
| Read logs | Column |
To actually start the audit process, you must have this feature enabled. You need to start at the environment level, which you can do in the Power Platform Admin Center (see breadcrumb).
Environment Level
At the environment level you can see the different logging types. To me it is a bit confusing that all three types of logging are listed under Auditing, as audit logs are only one type of log. That's why I am pointing this out. If you want to start recording changes on records, you need to select the checkbox Start Auditing. Once selected, you have the option to set the retention policy of these logs; you can adjust this depending on your requirements. Note that enabling this will affect your Dataverse log capacity.
The Log access option keeps track of who logged into your environment and when, which might be interesting from a security perspective.
The read logs keep track of who opened which record and when, or even when a record is shown in a view. Quite intense for my taste.
To be fully clear, only the first option is required. The others are optional and are just briefly explained for context purposes.
Table Level
Even when we have enabled Start Auditing, Dataverse must know which tables it should keep track of. By default, tables are NOT audited. So even when you enable auditing at the environment level, nothing will happen. You can enable it per table in the properties section, or when creating a table.
If you haven't turned on auditing on the environment level, you will receive a warning.
For the ALM lovers: I normally keep auditing off for dev environments, and enable it for TST/ACC/UAT and PRD. The test environment could have a lower retention policy, as it is mainly used for debugging purposes.
Column Level
You can also specify whether a column must be audited. By default, this is enabled. So if you have enabled auditing at the environment level and specified which tables should be audited, you are normally done. I have never turned a column off yet, but it is good to be aware of this feature, as it might come in handy with more sensitive data.
Arranging these settings is mandatory before you will get any data out. If you want to keep track of these things, it is therefore good to think about these insights while you are developing your solution, and whether you need to make some adjustments.
If you want to get this audit data into Power Automate Process Mining, you might think it is just a matter of selecting the Dataverse option. Well, that doesn't work, as that connector is not supported (yet). You need to use the OData connector and enter the environment URL (which can be found in the Power Platform Admin Center) followed by /api/data/v9.2. For me, the link looks like the snippet below.
https://orge3e5e136.crm4.dynamics.com/api/data/v9.2
You can then search for the audits table, select Next, and then filter on the objecttypecode (the table) you want to get the items from. That's it. For Dataverse, the main part of making it work is actually in setting up the auditing itself.
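The endpoint and filter can also be assembled by hand, which is handy for testing the query in a browser first. A small sketch (the environment URL is the one from my example above; the table logical name `incident` is a placeholder for whichever table you audit):

```python
# Environment URL from the Power Platform Admin Center; replace with your own.
environment_url = "https://orge3e5e136.crm4.dynamics.com"
odata_root = f"{environment_url}/api/data/v9.2"

# The audit records live in the "audits" entity set; objecttypecode holds the
# logical name of the audited table ("incident" is just an example here).
table_logical_name = "incident"
audits_query = f"{odata_root}/audits?$filter=objecttypecode eq '{table_logical_name}'"
print(audits_query)
```

This is the same filter the OData connector applies for you when you pick the table in the wizard; spelling it out just makes it easier to sanity-check the data before wiring it into Process Mining.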
Let's see what is required from a SharePoint perspective.
Use a SharePoint list as input
Many Power Platform developers start with SharePoint as a data source. I am not going into whether you should do this or not, I am just stating this as a fact. If you are keeping track of a business process, you will probably use a choice field to record the stage the process is currently in. That is the scenario for this part.
In Dataverse the audit logs are saved in a separate table and must be arranged per environment and table. A SharePoint list, however, has version history on by default. This sounds like we should be able to use all the versions as audit logs. I started by investigating Purview, which has an audit capability. I did get some output from this, but only from my personal updates. Purview also requires permissions that many Power Platform developers might not have. That's why I tried a different approach.
Then I came across this Stack Overflow thread. It describes how you can get all the versions of a SharePoint list in an XML file. That sounds promising.
Go to the SharePoint list you want to use as input for Power Automate Process Mining. In the ribbon, select Export to Excel.
This will create a query file. For those who are not familiar with this: it is a file that is connected to your SharePoint list, so you can work with the data in Excel. You can refresh this file whenever you need to. To be clear, it only gets data out of SharePoint into your query file (so it is not bi-directional), which means you can query the list at any moment.
The next step is to open the query file. Not in Excel, but in Visual Studio Code (or any other IDE). For the table I just created, the file will look like the snippet below.
WEB
1
https://powerbouwer.sharepoint.com/teams/Challenge032/_vti_bin/owssvr.dll?XMLDATA=1&List=da318a54-b37b-4a2c-882a-52ac85bbac78&View=21104B6B-7BBB-4B6F-AACD-4CF00FF1DCF4&RowLimit=0&RootFolder=
Selection=da318a54-b37b-4a2c-882a-52ac85bbac78-21104B6B-7BBB-4B6F-AACD-4CF00FF1DCF4
EditWebPage=
Formatting=None
PreFormattedTextToColumns=True
ConsecutiveDelimitersAsOne=True
SingleBlockTextImport=False
DisableDateRecognition=False
DisableRedirections=False
SharePointApplication=https://powerbouwer.sharepoint.com/teams/Challenge032/_vti_bin
SharePointListView=21104B6B-7BBB-4B6F-AACD-4CF00FF1DCF4
SharePointListName=da318a54-b37b-4a2c-882a-52ac85bbac78
RootFolder=
On the third line a URL is stored. We need to adjust this URL. The first step is to remove all URL parameters after List (from &View onwards). The second step is to add the IncludeVersions=TRUE URL parameter. Somehow this parameter must be entered BEFORE the List parameter. For those who are not familiar with URL parameters: the first parameter comes after a question mark (?), and if there is more than one, they are separated with ampersands (&). As a reference, the end result for my list is shown below.
ORIGINAL URL:
https://powerbouwer.sharepoint.com/teams/Challenge032/_vti_bin/owssvr.dll?XMLDATA=1&List=da318a54-b37b-4a2c-882a-52ac85bbac78&View=21104B6B-7BBB-4B6F-AACD-4CF00FF1DCF4&RowLimit=0&RootFolder=
ADJUSTED URL:
https://powerbouwer.sharepoint.com/teams/Challenge032/_vti_bin/owssvr.dll?XMLDATA=1&IncludeVersions=TRUE&List=da318a54-b37b-4a2c-882a-52ac85bbac78
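The same URL surgery can be scripted with the standard library, which is less error-prone if you need to do it for several lists. A sketch using the GUIDs from my example (swap in your own query file URL):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# URL copied from the third line of the query file (GUIDs from my example list).
original = (
    "https://powerbouwer.sharepoint.com/teams/Challenge032/_vti_bin/owssvr.dll"
    "?XMLDATA=1&List=da318a54-b37b-4a2c-882a-52ac85bbac78"
    "&View=21104B6B-7BBB-4B6F-AACD-4CF00FF1DCF4&RowLimit=0&RootFolder="
)

parts = urlsplit(original)
params = dict(parse_qsl(parts.query, keep_blank_values=True))

# Keep XMLDATA, drop View/RowLimit/RootFolder, and slot IncludeVersions=TRUE
# in BEFORE the List parameter, as required.
adjusted_query = urlencode(
    [("XMLDATA", params["XMLDATA"]),
     ("IncludeVersions", "TRUE"),
     ("List", params["List"])]
)
adjusted = urlunsplit(parts._replace(query=adjusted_query))
print(adjusted)
```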
You can now copy the URL and paste it in your browser. It should download a file named owssvr.xml. I recommend doing this with a list that has already existed for a while.
Now open Excel and open the XML file. When prompted, select As an XML table.
When the table opens, the schema and the data table are combined. This might be done more easily, but this is how I managed to make it work. The schema data is in the first columns and rows; the actual data follows. You can remove all rows where the id column shows RowsetSchema, and then remove all columns before the first column that starts with ows_. This is the prefix for all the actual SharePoint list columns.
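If you prefer to skip the manual cleanup in Excel, the same extraction can be sketched in a few lines of Python. The sample XML below only mimics the shape of the owssvr.dll output (rows as z:row elements with ows_-prefixed attributes); a real file will contain more columns and the full schema section:

```python
import xml.etree.ElementTree as ET

# Tiny sample mimicking the owssvr.dll XMLDATA output; a real owssvr.xml
# would be read from disk instead and contain many more attributes.
sample = """<?xml version="1.0"?>
<xml xmlns:z="#RowsetSchema" xmlns:rs="urn:schemas-microsoft-com:rowset">
  <rs:data>
    <z:row ows_ID="1" ows_Stage="Submitted" ows_Modified="2024-01-02 09:00:00"/>
    <z:row ows_ID="1" ows_Stage="Approved" ows_Modified="2024-01-03 14:30:00"/>
  </rs:data>
</xml>
"""

root = ET.fromstring(sample)
rows = []
for elem in root.iter():
    if elem.tag.endswith("row"):
        # Keep only the attributes carrying actual list data (ows_ prefix),
        # which mirrors deleting the schema columns in Excel.
        rows.append({k: v for k, v in elem.attrib.items() if k.startswith("ows_")})
print(rows)
```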
You now have all the SharePoint list items with all their versions. You can save this file and use it as an input file for Power Automate Process Mining. You can use the following mapping for the required fields.
| SharePoint output file columns | Power Automate Process Mining columns |
| --- | --- |
| ows_ID | Case ID |
| ows_<CHOICECOLUMN> | Activity |
| ows_Modified | Event Start |
That's all you need.
Additional Information
There is a lot of functionality that I am not familiar with yet in Power Automate Process Mining. The basic maps that are generated are exactly what I was looking for. You can see lead time, how many items went through the process, etc. I also like the fact that it generates a process flow based on the data. This should be in line with how you envisioned the process. If not, there is probably room for improvement in administering the process.
Whether you keep it as simple as I do right now, or use all the more advanced capabilities, I do think it is good to look into this and give it a try. Just like you iteratively improve your solution, we should look at how to iteratively improve the business process we create the solution for. It can even be a good indicator for prioritizing backlog items.
Key Takeaways
👉🏻 Power Automate Process Mining can be of value for every Power Platform developer
👉🏻 You should think about your auditing strategy before go-live
👉🏻 With some data wrangling we can even use a SharePoint list as input