Perform ExecuteMultiple in batches and changesets from TypeScript using dataverse-ify version 2
My free tutorial course on writing Dataverse web resources has been available for over a year now, and it has had over 1000 people enrol! The course uses version 1 of dataverse-ify, and over that time I've been working on version 2, which is currently available in beta.
What is Dataverse-ify?
Dataverse-ify aims to simplify calling the Dataverse WebAPI from TypeScript inside model-driven apps, Single Page Applications (SPAs) and integration tests running inside VSCode/Node. It uses a small amount of metadata generated by dataverse-gen, together with a set of early-bound types, to make it easy to interact with Dataverse tables and columns using an API similar to the IOrganizationService you may already know from C#.
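To give a flavour of that style, here is a minimal sketch of a simple early-bound update. It is not taken from the post - the import paths, the dev1_name column and the assumption that the version 2 client exposes update in the same way version 1 did are all for illustration only:
// Minimal sketch - import paths and the dev1_name column are assumptions
import { setMetadataCache, XrmContextDataverseClient } from "dataverse-ify";
import { metadataCache } from "../dataverse-gen/metadata";
import { dev1_Project, dev1_projectMetadata } from "../dataverse-gen/entities/dev1_Project";

export async function renameProject(projectId: string, newName: string): Promise<void> {
  // Tell dataverse-ify about the generated metadata before making calls
  setMetadataCache(metadataCache);
  const serviceClient = new XrmContextDataverseClient(Xrm.WebApi);

  // Early bound update - similar in feel to IOrganizationService.Update in C#
  const project = {
    logicalName: dev1_projectMetadata.logicalName,
    dev1_projectid: projectId,
    dev1_name: newName, // assumption: a text column on dev1_project
  } as dev1_Project;
  await serviceClient.update(project);
}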
You can use the new version by adding @2 to the end of the package names. For example:
npx dataverse-auth@2
npx dataverse-gen@2
npm install dataverse-ify@2
Soon, I will be removing the beta tag and publishing it so that it installs by default. There are a few breaking changes detailed in the Upgrading readme, but I will be publishing more samples, including a Single Page Application that uses dataverse-ify even where Xrm.WebApi is not available.
I wanted to give you a peek at one of the features that I am really excited about in version 2 - support for ExecuteMultiple with batches and change sets. A batch allows you to send multiple requests in a single HTTP request, and a change set allows you to send multiple requests that are executed as a single transaction - if one request fails, they all fail. This can give your client-side code a performance boost and makes it easier to keep related updates consistent. Custom API requests can even be wrapped up in executeMultiple!
Imagine that you have a Command Bar button that calls a JavaScript function from a grid. The function needs to update a column on all of the selected records and then wait for a triggered flow to run, as indicated by the updated column being reset. These updates can be wrapped up in a single ExecuteMultiple batch rather than being sent as lots of individual Update requests.
Create the top-level function
When a command bar calls a JavaScript function, it can return a Promise if there is asynchronous work being performed. In our case, we don't want the model-driven app to wait until our flows have run, so we can use Promise.resolve on an internal function to 'fire and forget' the long-running task:
static async CreateProjectReportTrigger(entityIds: string[]): Promise<void> {
  // Fire and forget the internal command so it does not cause a ribbon action timeout
  Promise.resolve(ProjectRibbon.CreateProjectReportTriggerInternal(entityIds));
}
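Because the internal function (shown below) handles its own errors, this fire-and-forget pattern is safe here. If the inner promise could reject, a variation like the following (a sketch, not from the post) attaches a catch so the rejection does not go unhandled:
// Sketch only: fire and forget, but surface unexpected failures in the console
static CreateProjectReportTrigger(entityIds: string[]): Promise<void> {
  ProjectRibbon.CreateProjectReportTriggerInternal(entityIds).catch((e) =>
    console.error("CreateProjectReportTrigger failed", e),
  );
  return Promise.resolve();
}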
Create the Internal function and initialize the metadata cache
Inside the internal function, we first set the metadata that was created using dataverse-gen - this provides dataverse-ify with some of the information it needs to work out the data types of columns that are not present in the WebApi responses. We also create a random value to update the column that will trigger the flow:
setMetadataCache(metadataCache);
const requestCount = entityIds.length;
const trigger = "trigger" + Math.random().toString();
Make the update using executeMultiple (this is not C# remember, it's TypeScript!)
This is where the magic happens - we can create an array of UpdateRequest objects using the entityIds provided to the function from the Command Bar:
// Trigger the flow for each selected project (using a batch)
const updates = entityIds.map((id) => {
  return {
    logicalName: "Update",
    target: {
      logicalName: dev1_projectMetadata.logicalName,
      dev1_projectid: id,
      dev1_reportgenerationstatus: trigger,
    } as dev1_Project,
  } as UpdateRequest;
});

const serviceClient = new XrmContextDataverseClient(Xrm.WebApi);
await serviceClient.executeMultiple(updates);
You can see that the updates array is simply passed into executeMultiple, which then bundles the requests up inside a $batch request. If you want to, you can run the updates inside a transaction by simply wrapping the batch inside an array:
await serviceClient.executeMultiple([updates]);
This array could actually contain multiple change sets, each of which would run independently inside its own transaction.
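For example, splitting the updates built above into two change sets might look like this (a sketch only - each inner array becomes its own transaction within the single $batch):
// Sketch: two change sets in one $batch, each committed or rolled back independently
const firstChangeSet = updates.slice(0, Math.ceil(updates.length / 2));
const secondChangeSet = updates.slice(Math.ceil(updates.length / 2));
await serviceClient.executeMultiple([firstChangeSet, secondChangeSet]);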
So the resulting function would be:
static async CreateProjectReportTriggerInternal(entityIds: string[]): Promise<void> {
  // Update a column on the selected records, to trigger a flow
  try {
    setMetadataCache(metadataCache);
    const requestCount = entityIds.length;
    const trigger = "trigger" + Math.random().toString();

    // Trigger the flow for each selected project (using a batch)
    const updates = entityIds.map((id) => {
      return {
        logicalName: "Update",
        target: {
          logicalName: dev1_projectMetadata.logicalName,
          dev1_projectid: id,
          dev1_reportgenerationstatus: trigger,
        } as dev1_Project,
      } as UpdateRequest;
    });

    const serviceClient = new XrmContextDataverseClient(Xrm.WebApi);
    await serviceClient.executeMultiple(updates);

    // Monitor the result
    const query = `<fetch aggregate="true">
      <entity name="dev1_project">
        <attribute name="dev1_projectid" alias="count_items" aggregate="countcolumn" />
        <filter>
          <condition attribute="dev1_reportgenerationstatus" operator="eq" value="${trigger}" />
        </filter>
      </entity>
    </fetch>`;

    let complete = false;
    do {
      const inProgressQuery = await serviceClient.retrieveMultiple(query, { returnRawEntities: true });
      complete = inProgressQuery.entities.length === 0;
      if (!complete) {
        const inProgressCount = inProgressQuery.entities[0]["count_items"] as number;
        complete = inProgressCount === 0;
        // Report status
        Xrm.Utility.showProgressIndicator(`Generating Reports ${requestCount - inProgressCount}/${requestCount}`);
        await ProjectRibbon.sleepTimeout(2000);
      }
    } while (!complete);
    Xrm.Utility.closeProgressIndicator();
  } catch (e) {
    Xrm.Utility.closeProgressIndicator();
    Xrm.Navigation.openErrorDialog({ message: "Could not generate reports", details: JSON.stringify(e) });
  }
}

static sleepTimeout(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
This code adds polling for the number of records that have yet to have the flow run and reset the dev1_reportgenerationstatus column, so the function knows when all the reports are complete, and it reports an error if anything fails.
The batch request would look similar to:
--batch_1665710705198
Content-Type: application/http
Content-Transfer-Encoding: binary

PATCH /api/data/v9.0/dev1_projects(2361e495-1419-ed11-b83e-000d3a2ae2ee) HTTP/1.1
Content-Type: application/json

{"dev1_reportgenerationstatus":"trigger0.39324146578062336","@odata.type":"Microsoft.Dynamics.CRM.dev1_project"}

--batch_1665710705198
Content-Type: application/http

PATCH /api/data/v9.0/dev1_projects(e8184b63-1823-ed11-b83d-000d3a39d9b6) HTTP/1.1
Content-Type: application/json

{"dev1_reportgenerationstatus":"trigger0.39324146578062336","@odata.type":"Microsoft.Dynamics.CRM.dev1_project"}

--batch_1665710705198--
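For completeness, the ids passed into the ribbon function are simply the GUIDs of the selected records - the same ids that appear in the batch above. Invoking it manually, for example from a test, would look something like this (hypothetical usage):
// Hypothetical manual invocation - the command bar normally supplies these ids
await ProjectRibbon.CreateProjectReportTrigger([
  "2361e495-1419-ed11-b83e-000d3a2ae2ee",
  "e8184b63-1823-ed11-b83d-000d3a39d9b6",
]);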
The code can obviously be improved by adding a timeout and better error reporting - but this shows the general idea of using executeMultiple with dataverse-ify version 2.
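For example, a timeout on the polling loop inside CreateProjectReportTriggerInternal might look something like this - a sketch only, omitting the progress reporting shown above, and the five-minute limit is an assumption:
// Sketch: stop polling after a maximum wait instead of looping forever
const timeoutMs = 5 * 60 * 1000; // assumption: allow the flows up to 5 minutes
const startTime = Date.now();
let complete = false;
do {
  if (Date.now() - startTime > timeoutMs) {
    // The surrounding try/catch will close the progress indicator and show the error dialog
    throw new Error(`Timed out waiting for ${requestCount} reports to be generated`);
  }
  const inProgressQuery = await serviceClient.retrieveMultiple(query, { returnRawEntities: true });
  complete =
    inProgressQuery.entities.length === 0 ||
    (inProgressQuery.entities[0]["count_items"] as number) === 0;
  if (!complete) {
    await ProjectRibbon.sleepTimeout(2000);
  }
} while (!complete);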
There are lots of other improvements in version 2 - so if you've used version 1, please do give version 2 a go whilst it's in beta and report any issues on GitHub.
In my next post on version 2, I'll show you how to call a Custom API using a batch and change set. If you want a peek before then, take a look at the tests for version 2 - they give lots of examples of its use.
@ScottDurow