How to STOP Personal Power Platform Pipelines
After my last blog post, where I said that Personal Power Platform Pipelines have potential but that in their current form I cannot recommend them for any sort of development, I mentioned there is a way to stop them. In this post, I will walk through my method for preventing Personal Power Platform Pipelines from being created.
Easy, let's just go to the tenant settings and... umm.
OK, let's check the PowerShell scripts... hmm, still nothing.
Now that we have those jokes out of the way, let's talk about how Pipelines work. With Pipelines, you associate an environment with a host environment/application and then orchestrate your pipelines from there. If your environment is registered in one host environment, it cannot be added to another host environment until it is deleted from the first. The documentation says as much.
So we have a workaround, and for small organisations with just a few environments it is a quick fix. Create a Custom Host environment by installing the pipelines application, then add the environments through the UI as "Target" environments. It's important to make them target environments, as Personal Pipelines still seem to work if they are registered as developer environments.
However, what if you have a large or enterprise organisation with dozens, hundreds or perhaps thousands of environments, and you want to stop Personal Pipelines? We can follow the same process, creating a Custom Host environment by installing the pipelines application, but then use Power Automate to automate the rest.
There are several ways you can do this. I'm going to cover a couple of them, but as always with the Power Platform, there are loads of different ways to achieve the same result.
First, let's create a recurring flow that runs once a day. We use a schedule because we cannot extend the "Platform" host environment and have a flow triggered from inside it.
Next, we need to list the environments. You could take them from your CoE if you have the CoE Starter Kit (or your own CoE) installed and configured, or you could use the Power Platform Admin Connector to list environments. In this instance, let's use the Power Platform Admin Connector, though, as we'll see by the end of the article, a CoE would be a lot better. If you do have a CoE installed, you can also trigger this flow from it instead of using a scheduled flow.
Now, to make sure we don't create any duplicates, we list the Deployment Environments from our Custom Host environment. This way we can compare the environments in the tenant against what is already in the host and only add new ones.
I like to use FetchXML in my List rows actions; here is the query:
<fetch version="1.0" mapping="logical" no-lock="false" distinct="true">
  <entity name="deploymentenvironment">
    <attribute name="statecode"/>
    <attribute name="name"/>
    <attribute name="createdon"/>
    <attribute name="environmentid"/>
    <attribute name="environmenttype"/>
    <attribute name="validationstatus"/>
    <attribute name="deploymentenvironmentid"/>
    <filter type="and">
      <condition attribute="statecode" operator="eq" value="0"/>
    </filter>
    <order attribute="name" descending="false"/>
  </entity>
</fetch>
Next, we might want to create an exclusion list. You may already have several Custom Hosts set up, and we don't want errors caused by trying to add their environments to another Custom Host; the purpose here is only to stop Personal Pipelines. If you have a CoE, I recommend adding a field to it to flag excluded environments, then using a List rows action from that environment with a filter and you are good to go. If you don't have a CoE, as in our case, we can create a Compose action and add an array of environments to exclude.
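As an illustration, the Compose body is just a JSON array of environment IDs; the GUIDs below are placeholders, not real environments:

```json
[
  "00000000-0000-0000-0000-000000000001",
  "00000000-0000-0000-0000-000000000002"
]
```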
Next, we need to shape the data from the Custom Host environment into an array of just the environment IDs. That way, we can combine the two arrays, as they are the same type and contain only IDs, rather than messing around with various properties in JSON.
For this, use a Select action. The "From" will be the body/value of the List rows from the Custom Host, and for the "Map" we will use a little trick: switch the input from key-value mode using the toggle, which lets you define the JSON structure yourself, and add the Environment Id into the field as dynamic content.
This automatically creates a flat array of just the values, no key-value pairs required. It is a really handy trick.
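Outside Power Automate, the Select step is doing the equivalent of this Python sketch (the row shape and IDs are made up for illustration):

```python
# Rows returned by the List rows action (simplified shape)
rows = [
    {"name": "Dev - Sales", "environmentid": "env-aaa"},
    {"name": "Test - Sales", "environmentid": "env-bbb"},
]

# A Select "Map" with a single dynamic-content value (no key)
# collapses each row to just that value, giving a flat array of IDs.
host_environment_ids = [row["environmentid"] for row in rows]

print(host_environment_ids)  # ['env-aaa', 'env-bbb']
```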
Next, for ease, we can use a Compose action to combine our exclusion list with the list we just created, using the union() function. Union takes two arrays, removes any duplicates, and returns a new array. This gives us the environments already in the Custom Host plus any exclusions in one simple, neat array.
union(body('Select_Pipeline_Host_Environment_to_Shape'), outputs('Compose_Exclusion_list'))
Then we filter one list by the other: we take all the environments in the tenant and keep only those whose IDs do not appear in the union array. In other words, for every environment, we check our full exclusion list (the environments already in the Custom Host plus the manual exclusions) and output only the environments that are not in it, giving us just the new ones.
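Put together, the union-then-filter logic behaves like this Python sketch (all IDs and names here are hypothetical):

```python
# IDs already attached to the Custom Host (output of the Select action)
host_ids = ["env-aaa", "env-bbb"]

# Manually maintained exclusion list (the Compose action)
excluded_ids = ["env-ccc", "env-aaa"]  # overlaps are fine

# union() combines both arrays and removes duplicates, keeping order
known_ids = list(dict.fromkeys(host_ids + excluded_ids))

# All environments in the tenant (from the admin connector)
all_environments = [
    {"name": "Dev - Sales", "id": "env-aaa"},
    {"name": "Test - Sales", "id": "env-bbb"},
    {"name": "Prod - Sales", "id": "env-ddd"},
]

# The Filter array action: keep only environments not already known,
# i.e. the new ones that need a Deployment Environment row
new_environments = [e for e in all_environments if e["id"] not in known_ids]

print(new_environments)  # [{'name': 'Prod - Sales', 'id': 'env-ddd'}]
```

The condition step that follows is then just the equivalent of checking `len(new_environments) > 0`.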
Next, we have a condition to check whether there is anything to do. I like using the length() function to see if an array contains data or not.
length(outputs('Filter_existing_env_by_list_env')['body'])
A quick Parse JSON step on the output of the filter makes the dynamic content easier to get at.
Use the outputs of the Parse JSON in an Add a new row Dataverse action, filling in the environment ID and the display name, and setting the environment type to "Target". Setting the environment type to "Target" is important, as this is what stops others from creating a Pipeline for the environment.
The fields above are "Body name" and "Body displayName". The only thing to mention here is that if you generate the Parse JSON schema from a sample, you will end up with three "Body displayName" fields, so if you are lazy like me, just make sure you use the one inside the properties object, referenced with this code:
items('Apply_to_each')?['properties']?['displayName']
An Apply to each will wrap around this step, which is fine, as it loops through every new record that needs to be created.
And this is the finished flow.
As mentioned, if you have a CoE and can extend it, I would. You can then also build a mechanism to release environments from your Custom Host by deleting the environment record, excluding it in the CoE, and allowing your teams to add it to their own Custom Host environments. Also, this does not account for Dataverse for Teams; if you are using that, you will need to filter those environments out too.
Alternatively, this is a great way to setup Custom Hosts and orchestrate all pipelines from a single environment.
I hope this helps organisations looking to prevent personal pipelines and starts the conversation about an organisation-wide ALM strategy. If the flow above is a little complicated, I'm working on providing it as a sample on my GitHub; let me know if you think it would be useful.
Ciao for now!
MCJ