Extracting Microsoft Dataverse Solutions Using YAML Revisited (Azure DevOps)

A couple of years ago, I blogged about how to use YAML build pipelines to extract solutions from the Microsoft Dataverse platform. If you aren’t already fully embracing (perhaps with anger) YAML over “classic” pipelines, this is something I’d urge you to start thinking about. YAML pipelines are where Microsoft will be making all future investments from a feature standpoint and, much like our “classic” experience within the Power Platform, we can expect the same end destination for both experiences from an availability standpoint. 😉 Fortunately, Microsoft provides us with a mechanism to support the migration journey. And, it probably goes without saying at this juncture, but you should not be using “classic” pipelines for any new work targeting your DevOps environment.

With all that out of the way, let’s turn to the topic of today’s post: revisiting the approach outlined in that earlier article, which proposed using a YAML pipeline structure similar to the one below:

name: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r)

trigger: none
schedules:
- cron: "0 20 * * *"
  displayName: Daily Build
  branches:
    include:
    - MyArea/MyBranch
  always: true

jobs:
- job: ExtractMySolution
  pool:
    vmImage: 'windows-latest'
  steps:
  - task: PowerPlatformToolInstaller@0
    inputs:
      DefaultVersion: true
  - task: PowerPlatformSetSolutionVersion@0
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'My Environment'
      SolutionName: 'MySolution'
      SolutionVersionNumber: '1.0.0.$(Build.BuildID)'
  - task: PowerPlatformExportSolution@0
    inputs:
      authenticationType: 'PowerPlatformEnvironment'
      PowerPlatformEnvironment: 'My Environment'
      SolutionName: 'MySolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
      AsyncOperation: true
      MaxAsyncWaitTime: '60'
  - task: PowerPlatformUnpackSolution@0
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\MySolution.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)\JJG.MyProject\MySolution'
  - task: CmdLine@2
    inputs:
      script: |
        echo commit all changes
        git config user.email "[email protected]"
        git config user.name "Automatic Build"
        git checkout MyArea/MyBranch
        git add --all
        git commit -m "Latest solution changes."
        echo push code to new repo
        git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin MyArea/MyBranch
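
As an aside, the git push at the end only works because $(System.AccessToken) is passed as an HTTP header on that one command. An alternative worth being aware of (a minimal sketch on my part, rather than something the pipeline above relies on) is to add an explicit checkout step with persistCredentials enabled, which leaves the build token in the local Git configuration so that any later git command in the job can push without the extra header:

steps:
- checkout: self
  # Keeps the pipeline's OAuth token in the local Git config, so later
  # git push commands authenticate without an explicit extraheader.
  persistCredentials: true
- task: CmdLine@2
  inputs:
    script: |
      git checkout MyArea/MyBranch
      git push origin MyArea/MyBranch

Whichever route you take, the project’s build service account will also need Contribute permission on the repository for the push to succeed.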

While the pipeline above will get the job done, it has one glaring issue: what happens if we work with multiple solutions as part of our Dataverse project? Using the example above, we would have to define a pipeline per solution to achieve our end goal. Not exactly efficient, and something that adds to our overheads from an overall management standpoint. Surely it would be far better to do all of this as part of a single pipeline instead?

Thankfully, it turns out that this scenario is entirely possible via YAML pipelines by simply leveraging two specific areas of functionality:

  • Parameters: As their name implies, parameters allow us to dynamically modify the flow of our pipelines based on values we define within them. We have access to several different data types for our parameters, including an object type that we can use to store a list of items. So, for our current purposes, we could use this to hold the list of solutions we want to work with, like so:
parameters:
  - name: solutions
    type: object
    default:
      - DemoSolution
      - AnotherDemoSolution
  • Each Keyword: Although YAML isn’t typically the territory for complex, iterative logic, the each keyword does give us the ability to execute one or more tasks for every value in a list. So, in this case, we can iterate through each of the values within our solutions parameter object using a snippet similar to the one below (the sketch after this list shows roughly what it expands into):
  steps:
  - ${{ each solution in parameters.solutions }}:
    # Do stuff here. Reference the current list item by using ${{ solution }}
    - task: CmdLine@2
      inputs:
        script: |
          echo Processing ${{ solution }}...
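
It’s worth remembering that each expressions are processed at compile time, before the pipeline runs. Given the two default values in the solutions parameter shown earlier, the snippet above would expand into something roughly like the following (a sketch of the compiled output, not anything you’d author yourself):

  steps:
  - task: CmdLine@2
    inputs:
      script: |
        echo Processing DemoSolution...
  - task: CmdLine@2
    inputs:
      script: |
        echo Processing AnotherDemoSolution...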

With knowledge of both of these features, we can then adjust the full YAML file from earlier as follows:

name: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r)

trigger: none
schedules:
- cron: "0 20 * * *"
  displayName: Daily Build
  branches:
    include:
    - MyArea/MyBranch
  always: true

parameters:
  - name: solutions
    type: object
    default:
      - DemoSolution
      - AnotherDemoSolution

jobs:
- job: ExtractMySolution
  pool:
    vmImage: 'windows-latest'
  steps:
  - task: PowerPlatformToolInstaller@0
    inputs:
      DefaultVersion: true
  - ${{ each solution in parameters.solutions }}:
    - task: PowerPlatformSetSolutionVersion@0
      inputs:
        authenticationType: 'PowerPlatformEnvironment'
        PowerPlatformEnvironment: 'My Environment'
        SolutionName: '${{ solution }}'
        SolutionVersionNumber: '1.0.0.$(Build.BuildID)'
    - task: PowerPlatformExportSolution@0
      inputs:
        authenticationType: 'PowerPlatformEnvironment'
        PowerPlatformEnvironment: 'My Environment'
        SolutionName: '${{ solution }}'
        SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\${{ solution }}.zip'
        AsyncOperation: true
        MaxAsyncWaitTime: '60'
    - task: PowerPlatformUnpackSolution@0
      inputs:
        SolutionInputFile: '$(Build.ArtifactStagingDirectory)\${{ solution }}.zip'
        SolutionTargetFolder: '$(Build.SourcesDirectory)\JJG.MyProject\${{ solution }}'
  - task: CmdLine@2
    inputs:
      script: |
        echo commit all changes
        git config user.email "[email protected]"
        git config user.name "Automatic Build"
        git checkout MyArea/MyBranch
        git add --all
        git commit -m "Latest solution changes."
        echo push code to new repo
        git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin MyArea/MyBranch

When your pipeline executes, it will then generate (in this example) six tasks from the loop, three for each of the solution values you feed through. You can adjust the list defined in the YAML to add other solutions as needed or, better still, override the list when triggering the pipeline manually.
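
For example, since solutions is declared as a runtime parameter, it can be edited in the Run pipeline panel before queueing a run. A one-off extract that also picks up a (hypothetical) third solution might use an overridden list along these lines:

solutions:
- DemoSolution
- AnotherDemoSolution
- MyThirdSolution # hypothetical extra solution, included just for this run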

Overall (and if you don’t mind me saying so), I think this alternate approach is better. We’ll rarely be working with just a single solution within Microsoft Dataverse and, if we are doing frequent extracts of our solutions into Git, it’s (arguably) far better to do this in one fell swoop instead of maintaining several pipelines. Despite some of the difficulties that YAML pipelines can present from an authoring standpoint, it’s hard to deny that they can be quite a powerful tool to use…provided we know what they can do first, of course. 🙂
