
Create an ADF event trigger that runs an ADF pipeline in response to Azure Storage events.

The Storage Event Trigger in Azure Data Factory is a building block for an event-driven ETL/ELT architecture (EDA). Data Factory's native integration with Azure Event Grid lets you trigger processing pipelines based on certain events. Currently, storage event triggers support Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts, and the Blob Created and Blob Deleted events.

 

Event-driven architecture (EDA) is a common data integration pattern that involves the production, detection, consumption of, and reaction to events. Data integration scenarios often require customers to trigger pipelines based on events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account. Data Factory and Synapse pipelines natively integrate with Azure Event Grid, which lets you trigger pipelines on such events.

 

This blog demonstrates how to use an ADF storage event trigger to run an ADF pipeline in response to Azure Storage events.

 

Prerequisites:

  • An ADLS Gen2 storage account or GPv2 Blob Storage Account
    Create a storage account - Azure Storage | Microsoft Docs
  • The integration described in this article depends on Azure Event Grid. Make sure that your subscription is registered with the Event Grid resource provider. For more info, see Resource providers and types. You must be able to do the Microsoft.EventGrid/eventSubscriptions/* action. This action is part of the EventGrid EventSubscription Contributor built-in role.

    To do so, the resource provider ‘Microsoft.EventGrid’ needs to be registered in the subscription, as shown in the screenshot below:

[Screenshot: the Microsoft.EventGrid resource provider registered in the subscription]

 

  • If the blob storage account resides behind a private endpoint and blocks public network access, you need to configure network rules to allow communication from blob storage to Azure Event Grid. You can either grant storage access to trusted Azure services, such as Event Grid, following the Storage documentation, or configure private endpoints for Event Grid that map to your VNet address space, following the Event Grid documentation.
     
  • The Storage Event Trigger currently supports only Azure Data Lake Storage Gen2 and General-purpose version 2 storage accounts.

 

  • To create a new Storage Event Trigger or modify an existing one, the Azure account used to log in to the service and publish the storage event trigger must have the appropriate role-based access control (Azure RBAC) permissions on the storage account.

  • The service principal for the Azure Data Factory does not need special permissions on either the storage account or Event Grid.

 

Demo:                                    

 

Step 1:

Create an ADF resource in the Azure portal. If you are new to ADF, refer to this link on how to create one:
Create an Azure data factory using the Azure Data Factory UI - Azure Data Factory | Microsoft Docs

 

Step 2:

Once the Data Factory is created, open Azure Data Factory Studio from the Overview section:

 

[Screenshot: launching Azure Data Factory Studio from the Overview blade]

 

Step 3:

Once you land in ADF Studio, create a linked service for the storage account as shown in the screenshots below:

[Screenshot: creating a new linked service]

 

After clicking ‘+New’, first select the data store. If you are using a GPv2 Blob Storage account, choose ‘Azure Blob Storage’; if you are working with an ADLS Gen2 account, choose ‘Azure Data Lake Storage Gen2’. I’ve used Gen2 in this demo.

[Screenshot: selecting Azure Data Lake Storage Gen2 as the data store]

After selecting the data store, fill in the required details:

 

[Screenshot: linked service configuration details]

 

Once the Test Connection succeeds, click ‘Create’. This creates the storage account linked service.
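Under the hood, ADF stores the linked service as JSON (visible via the code view in ADF Studio). A minimal sketch for an ADLS Gen2 linked service authenticated with an account key is shown below; the linked service name, URL, and key are illustrative placeholders:

```json
{
    "name": "ADLSGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<storage-account-name>.dfs.core.windows.net",
            "accountKey": {
                "type": "SecureString",
                "value": "<account-key>"
            }
        }
    }
}
```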

 

Step 4:

Creating Input and Output Datasets

In this demo, we will create a simple ADF pipeline that copies an ‘emp.txt’ file from one folder, ‘input’, to another folder, ‘output’, within a container. Hence, we need input and output datasets in ADF that map to the blobs in the input and output folders. So let’s create InputDataset and OutputDataset:

 

Go to ‘Author’ in the ADF portal and click ‘New Dataset’ as per the screenshots below:

 

[Screenshots: creating the InputDataset — New Dataset, data store selection, format selection, and dataset properties]

 

Then click ‘OK’.

Similarly, you can create OutputDataset as below:

[Screenshot: OutputDataset configuration]
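For reference, the InputDataset produced by these steps is stored as JSON roughly like the sketch below, assuming a delimited-text file and an illustrative container name ‘demo’ (the linked service name matches the earlier sketch). OutputDataset is identical except that folderPath points to ‘output’ and fileName is left empty:

```json
{
    "name": "InputDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "ADLSGen2LinkedService",
            "type": "LinkedServiceReference"
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "demo",
                "folderPath": "input",
                "fileName": "emp.txt"
            }
        }
    }
}
```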

 

Step 5:

Create the ADF pipeline to copy data from the ‘input’ to the ‘output’ folder as per the screenshots below:

 

[Screenshot: creating a new pipeline]

 

Give the pipeline a name and drag the ‘Copy Data’ activity onto the designer surface. Name the activity:

 

[Screenshot: Copy Data activity on the pipeline canvas]

 

Select Source and Sink as below:

[Screenshots: selecting the Source and Sink datasets]
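Behind the designer, the resulting pipeline JSON looks roughly like the sketch below; the pipeline and activity names are illustrative, and the delimited-text source/sink types assume the dataset format chosen above:

```json
{
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyEmpFile",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "InputDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "OutputDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "DelimitedTextSource" },
                    "sink": { "type": "DelimitedTextSink" }
                }
            }
        ]
    }
}
```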

 

Now ‘Validate’ the pipeline and ‘Debug’ it to check whether it works as expected.

 

[Screenshot: pipeline validation and debug run]

Step 6:

Once the pipeline is validated, let’s create a BlobCreated event trigger as per the screenshot below:

 

[Screenshot: adding a trigger to the pipeline]

 

Choose Trigger --> New:

 

[Screenshot: new storage event trigger configuration]

 

After clicking ‘Continue’, you will get a ‘Data Preview’. This shows the blobs that match the event trigger filters, so you can verify whether the filters are correct. Click ‘Continue’ and you will see the ‘Parameters’ section. This is useful when you want to pass parameters to the pipeline. Skip it, as we are not using parameters in this demo, and click ‘Ok’.
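Once published, the storage event trigger itself is stored as JSON along the lines of the sketch below; the trigger name, container/path filters, and subscription placeholders are illustrative. Note that blobPathBeginsWith follows the ‘/<container>/blobs/<folder>/’ convention:

```json
{
    "name": "BlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/demo/blobs/input/",
            "blobPathEndsWith": ".txt",
            "ignoreEmptyBlobs": true,
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>",
            "events": ["Microsoft.Storage.BlobCreated"]
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyPipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```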

 

Now we have all the components in place, and the next step is to ‘Publish’ all the changes.

[Screenshot: publishing all changes]

 

Once publishing is complete, let’s test the trigger.

 

Upload the file ‘emp.txt’ to the input folder; this should fire the BlobCreated event and thereby fire the ADF trigger.

 

[Screenshot: emp.txt uploaded to the input folder]

 

File copied to output folder:

 

[Screenshot: the file copied to the output folder]

 

ADF Trigger run:

 

[Screenshot: the ADF trigger run]

 

Pipeline run:

 

[Screenshot: the pipeline run]

 

As we see from the result screenshots above, the BlobCreated trigger works as expected and runs the attached ADF pipeline.

 

Similarly, a trigger for the BlobDeleted event can be created.
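In the trigger JSON sketched earlier, only the events array changes; a single trigger can also listen for both event types at once:

```json
"events": ["Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted"]
```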

 

Reference link:

Create event-based triggers - Azure Data Factory & Azure Synapse | Microsoft Docs

 

Hope this helps!

 
