Use the Python SDK to send and receive events with Schema Registry in Azure Event Hub
This blog complements another blog related to Azure Event Hub Schema Registry. As we know, a Confluent (Kafka) schema registry cannot be migrated into the Azure schema registry directly; we need to create and manage the schema registry in Azure Event Hub separately. The good news is that Azure Event Hub provides multiple client SDKs we can use to serialize and deserialize payloads containing schema registry identifiers and Avro-encoded data. In this post, I'd like to share how to use the Python SDK to send and receive events with the schema registry in Azure Event Hub.
Prerequisites:
1.Create a schema registry group in the Event Hub portal.
You can refer to the official guidance to create a schema registry group.
2.Install the required Python packages with the pip tool
a.pip install azure-schemaregistry-avroencoder
The main package we will use below.
b.pip install azure-identity
Authentication is required to access the schema registry group and register schemas. Hence, we need to use the TokenCredential protocol with an AAD credential.
c.pip install aiohttp
We need to install an async transport to use the async API.
3.To implement the TokenCredential authentication flow mentioned above, DefaultAzureCredential tries the following credential types, if enabled, in order:
- EnvironmentCredential
- ManagedIdentityCredential
- SharedTokenCacheCredential
- VisualStudioCredential
- VisualStudioCodeCredential
- AzureCliCredential
- AzurePowerShellCredential
- InteractiveBrowserCredential
In the test demo below, EnvironmentCredential is used, so we need to register an AAD application to get the tenant ID, client ID, and client secret, as sketched below.
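As a minimal sketch (assuming the AAD application registered above has its details exported as environment variables), the credential setup looks like this:

from azure.identity import DefaultAzureCredential

# EnvironmentCredential reads the registered AAD application's details from
# these environment variables, so export them before running the demo:
#   AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET
# DefaultAzureCredential walks the chain listed above and, with those
# variables set, resolves to EnvironmentCredential.
token_credential = DefaultAzureCredential()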
4.If we want to pass EventData as the message type while encoding, we also need to make sure that azure-eventhub>=5.9.0 is installed so that we can use the azure.eventhub.EventData class.
Test Demo:
1.Client initialization
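As a minimal initialization sketch (the fully qualified namespace, event hub name, and schema group name below are placeholders to replace with your own values):

from azure.identity import DefaultAzureCredential
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.encoder.avroencoder import AvroEncoder
from azure.eventhub import EventHubProducerClient

# Placeholder values - replace with your own namespace, event hub and schema group.
fully_qualified_namespace = "<your-namespace>.servicebus.windows.net"
eventhub_name = "<your-eventhub>"
group_name = "<your-schema-group>"

# Resolves to EnvironmentCredential when the AZURE_* variables are set.
token_credential = DefaultAzureCredential()

# Client that talks to the schema registry group in the Event Hub namespace.
schema_registry_client = SchemaRegistryClient(
    fully_qualified_namespace=fully_qualified_namespace,
    credential=token_credential,
)

# Encoder that serializes/deserializes Avro payloads against the schema group.
# auto_register=True registers new schemas passed to encode (see the next step).
avro_encoder = AvroEncoder(
    client=schema_registry_client,
    group_name=group_name,
    auto_register=True,
)

# Producer used in the next step to send the encoded events.
eventhub_producer = EventHubProducerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    credential=token_credential,
)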
2.Send event data with encoded content
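A minimal sketch of the send step, reusing the avro_encoder and eventhub_producer from the initialization sketch above; the User schema and dictionary content are only illustrative, not the exact payload of the original demo:

import json

from azure.eventhub import EventData

# Illustrative Avro schema; the schema name shown in the portal is built from
# the namespace and name fields below (example.avro.User).
definition = json.dumps({
    "type": "record",
    "namespace": "example.avro",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "favorite_number", "type": ["int", "null"]},
        {"name": "favorite_color", "type": ["string", "null"]},
    ],
})

# Dictionary content that matches the schema definition above.
dict_content = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"}

with eventhub_producer:
    # Encode the dict into an EventData whose body is Avro binary and whose
    # content type carries the schema ID from the registry.
    event_data = avro_encoder.encode(
        dict_content, schema=definition, message_type=EventData
    )
    event_data_batch = eventhub_producer.create_batch()
    event_data_batch.add(event_data)
    eventhub_producer.send_batch(event_data_batch)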
If we set the auto_register parameter to true, the encoder registers new schemas passed to encode. We can check the created schema in the Event Hub portal; the schema name is composed of the namespace and name properties in the definition we set.
If we inspect the created event data, we can see the event body is encoded as binary. In fact, the avro_encoder.encode method uses the underlying BinaryEncoder to encode the message data.
Once the schema is registered, we can't add an extra property that is not contained in the registered schema definition.
For example, if we set dictionary content that carries an extra property, the encode call raises an exception. If we miss a required property, the content won't pass through either and encode raises an exception, as roughly illustrated below.
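Using the hypothetical User schema from the send sketch (the exact exception type and message depend on the encoder version), the two failing cases could look like this:

from azure.eventhub import EventData

# Extra property that is not part of the schema definition.
invalid_extra = {
    "name": "Bob",
    "favorite_number": 7,
    "favorite_color": "red",
    "favorite_food": "noodles",  # not declared in the schema
}

# Missing required property: "name" has no default in the schema.
invalid_missing = {"favorite_number": 7, "favorite_color": "red"}

for bad_content in (invalid_extra, invalid_missing):
    try:
        avro_encoder.encode(bad_content, schema=definition, message_type=EventData)
    except Exception as error:  # exact exception type depends on the encoder version
        print(f"encode failed: {error}")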
3.Receive event data with decoded content
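A minimal sketch of the receive side, reusing the placeholder names, credential, and avro_encoder from the sketches above:

from azure.eventhub import EventHubConsumerClient

# Consumer reading from the default consumer group of the event hub.
eventhub_consumer = EventHubConsumerClient(
    fully_qualified_namespace=fully_qualified_namespace,
    eventhub_name=eventhub_name,
    consumer_group="$Default",
    credential=token_credential,
)

def on_event(partition_context, event):
    # Decode the Avro binary body back into a dict; the encoder looks up the
    # schema by the schema ID carried on the event's content type.
    decoded_content = avro_encoder.decode(event)
    print(f"Received from partition {partition_context.partition_id}: {decoded_content}")

with eventhub_consumer:
    # Read from the beginning of each partition and process incoming events
    # until interrupted.
    eventhub_consumer.receive(on_event=on_event, starting_position="-1")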
We can also get sync and async sample code snippets from this link.