Automate Dataverse solution deployment with Azure Pipelines

In previous Azure DevOps related posts we looked at:

✅ How to configure the connection between an Azure DevOps project for Pipelines usage and your Power Platform environments

✅ How to build a pipeline to commit your solution to a repository

In this post, we’re going to look at the next step: a new pipeline. This pipeline is pretty simple, and allows you to export a solution from one environment, import it into another environment such as test or production as managed, and then publish your customisations!

To be able to follow this post, it’s key that you’ve configured the service connection between your DevOps environment and Power Platform environment. If you haven’t done that, check out this post. You also need to have the Power Platform Build Tools installed in DevOps.

Creating your pipeline

So, first we need to go to our Azure DevOps project and open ‘Pipelines’. We need to create a new pipeline, and then we can select ‘use the classic editor to create a pipeline without YAML’.

Now we need to tell Azure Pipelines where our source/repository is. In my case I’m going to configure my pipeline with a GitHub repo, but you can use an Azure Repo or another type to suit your needs. Using an Azure Repo might be the easiest option if you don’t know how to make the connection with another type of repository. Once you’ve configured your repository, select continue.

Now we need to select a template or start with an empty job to build out our pipeline. In this case, we’re going to select to use an ‘empty job’.

Now we can start adding our tasks to agent job 1 to build out our pipeline.

We want to add the following tasks to our agent job:

✅ Power Platform Tool Installer

✅ Power Platform Export Solution

✅ Power Platform Import Solution

✅ Power Platform Publish Customizations

Now we need to go to the variables tab of our pipeline edit window and add a new ‘pipeline variable’. We’re going to give this variable the name ‘SolutionName’ and set the value to the name of our solution in our Dataverse environment. Now we can head back to our tasks tab and start to configure our added tasks.
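By the way, if you prefer defining pipelines in YAML rather than the classic editor, the same variable can be declared in your pipeline definition like this (the solution name here is just a hypothetical placeholder, so swap in your own):

```yaml
variables:
  # The unique (schema) name of the solution in your Dataverse environment.
  # 'MyDataverseSolution' is a placeholder - replace it with your solution's name.
  SolutionName: 'MyDataverseSolution'
```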

Power Platform Tool Installer

We don’t need to do any configuration when it comes to this step, we simply need to add it and move onto the next! An easy start!
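For anyone following along in YAML instead of the classic editor, this task is a one-liner with no inputs to configure. A minimal sketch, assuming version 2 of the Power Platform Build Tools tasks:

```yaml
steps:
  # Installs the Power Platform tooling that the later tasks depend on.
  - task: PowerPlatformToolInstaller@2
```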

Power Platform Export Solution

This is the step where we’re going to fetch our solution from our Power Platform environment, which is most likely a development environment, so that we can import it into our target environment. For this step we’re going to use the ‘Service Principal/client secret’ authentication type. If you haven’t followed the steps in the first post on Azure DevOps linked at the top of this post, you’ll need to do that first so you have a service connection to use.

Once we’ve selected the option to use a service connection, we need to select the service connection we have configured to make a connection to our Power Platform environment.

For the solution name field we are going to use the variable we created, so simply fill in the value as $(SolutionName).

For the solution output file we’re going to use the following value: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip

We need to check the box to export as a managed solution if we’re importing this solution into a test or production environment. If you’re just moving the solution to another development environment or into source control, you can leave this unchecked to export as unmanaged.
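Put together, those settings map to roughly the following YAML. This is a sketch rather than a definitive definition: the service connection name is a hypothetical placeholder, and exact input names can vary between versions of the Build Tools tasks.

```yaml
  # Export the solution from the development environment.
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'          # Service Principal/client secret
      PowerPlatformSPN: 'Dev Environment Connection'  # hypothetical service connection name
      SolutionName: '$(SolutionName)'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
      Managed: true  # leave false if moving to another dev environment or source control
```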

Power Platform Import Solution

This step is fairly similar to the last, except we’re now importing the solution we’ve just exported into our target environment. We need to select the service connection of the environment we want to import our solution to, and use the following value in our solution input file field.

$(Build.ArtifactStagingDirectory)\$(SolutionName).zip
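The equivalent YAML sketch, again with a hypothetical service connection name:

```yaml
  # Import the exported .zip into the target (test/production) environment.
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'Target Environment Connection'  # hypothetical service connection name
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
```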

Power Platform Publish Customizations

This step is a simpler one, like the first. We simply need to select the service connection of the environment we’re importing our solution to. This step will publish all the solution XML changes in the environment.
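And the YAML sketch for this final task, with the same hypothetical target connection:

```yaml
  # Publish all customisations in the target environment after the import.
  - task: PowerPlatformPublishCustomizations@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'Target Environment Connection'  # hypothetical service connection name
```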

There you have it! Save your pipeline and test it out. That’s how you create a pipeline to automate the deployment of a Dataverse solution from one environment to another!

I hope this helps.

Written by
Lewis Baybutt
Microsoft Business Applications MVP • Power Platform Consultant • Blogger • Community Contributor • #CommunityRocks • #SharingIsCaring