Automate your API deployments with Azure API Management and Bicep modules

I think most practitioners can agree that making the jump from ad-hoc code deployments to continuous integration and continuous delivery (CI/CD) can seem complex. There is so much to learn - Azure Pipelines, Azure Resource Manager (ARM) templates, the Azure CLI, and a whole new system to set up, secure, and monitor for deployments. When I learned the process, it was daunting to absorb and apply multiple new technologies at the same time.
We've provided various architectures and tools for working with Azure API Management. However, the best tool for deploying a single API is one that is readily available to you - Bicep modules. Bicep is a language that compiles to ARM templates, so it can deploy anything that ARM can deploy, but it is easier to read and maintain, largely because of its declarative nature and the authoring support in Visual Studio Code.

A Bicep module is a Bicep file that describes a part of your infrastructure in a reusable way. You can start with a library of Bicep modules in a git repository and reference them directly, or you can publish them to a private registry for sharing among multiple projects. Microsoft also maintains a public registry of modules. If you are starting from ARM templates, you can easily decompile an ARM template into Bicep, which aids in conversion.
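Both the decompile and publish workflows are available through the Azure CLI. A quick sketch (the file names and registry name below are placeholders, not values from this article):

```shell
# Decompile an existing ARM template into a Bicep file as a starting point
az bicep decompile --file azuredeploy.json

# Publish a module to a private registry (an Azure Container Registry) for sharing
az bicep publish --file api.bicep \
  --target 'br:myregistry.azurecr.io/bicep/modules/api:v1'
```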
Let's take an example: I want to create an API in Azure API Management driven by an OpenAPI specification file. To do this, I have to create an API resource, a policy resource, and potentially the named values and policy fragments that the API policy depends on. I've got an example of this for the petstore API:
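The original sample is not reproduced here, but a minimal sketch of such a definition might look like this (the service name, file paths, and API version are assumptions):

```bicep
// Assumed name of an existing API Management instance
param serviceName string = 'my-apim-service'

resource apimService 'Microsoft.ApiManagement/service@2022-08-01' existing = {
  name: serviceName
}

// The API itself, imported from an OpenAPI specification on disk
resource petstoreApi 'Microsoft.ApiManagement/service/apis@2022-08-01' = {
  name: 'petstore'
  parent: apimService
  properties: {
    displayName: 'Petstore API'
    path: 'petstore'
    protocols: [ 'https' ]
    format: 'openapi+json'
    value: loadTextContent('./petstore.json')
  }
}

// The policy document attached to the API
resource petstorePolicy 'Microsoft.ApiManagement/service/apis/policies@2022-08-01' = {
  name: 'policy'
  parent: petstoreApi
  properties: {
    format: 'rawxml'
    value: loadTextContent('./petstore-policy.xml')
  }
}
```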
A lot of this Bicep code is repeated from API to API, which makes it an ideal candidate for modularization. Let's look at an alternative using Bicep modules, starting with what I might want to write in Bicep:
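A sketch of the consuming side, under the assumption that the module lives at `./modules/api.bicep` and exposes the parameter names shown (both are illustrative, not from the original sample):

```bicep
param serviceName string = 'my-apim-service'  // assumed APIM instance name

// All the per-API detail is reduced to a handful of parameters
module petstoreApi './modules/api.bicep' = {
  name: 'petstore-api'
  params: {
    serviceName: serviceName
    apiName: 'petstore'
    displayName: 'Petstore API'
    path: 'petstore'
    openApiSpec: loadTextContent('./petstore.json')
    policy: loadTextContent('./petstore-policy.xml')
  }
}
```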
Instead of declaring resources directly, I reference a module that defines the API. The module takes parameters that describe the API, and the OpenAPI specification and policy document are loaded from text files on disk. Now, take a look at the Bicep module itself:
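Again, a hedged reconstruction rather than the original file - a module along these lines would match the description in the text (parameter names are assumptions):

```bicep
// modules/api.bicep - deploys one API and its policy into an existing APIM instance
param serviceName string
param apiName string
param displayName string
param path string
param openApiSpec string
param policy string = ''

// Infer the specification format: JSON documents start with an open brace
var format = startsWith(trim(openApiSpec), '{') ? 'openapi+json' : 'openapi'

resource apimService 'Microsoft.ApiManagement/service@2022-08-01' existing = {
  name: serviceName
}

resource api 'Microsoft.ApiManagement/service/apis@2022-08-01' = {
  name: apiName
  parent: apimService
  properties: {
    displayName: displayName
    path: path
    protocols: [ 'https' ]
    format: format
    value: openApiSpec
  }
}

// The policy resource is only created when a policy document is supplied
resource apiPolicy 'Microsoft.ApiManagement/service/apis/policies@2022-08-01' = if (!empty(policy)) {
  name: 'policy'
  parent: api
  properties: {
    format: 'rawxml'
    value: policy
  }
}
```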
You can (hopefully) see that the Bicep module looks remarkably similar to the original API definition. Indeed, I copied the code into the module and used that as my starting point. I then generalized the module with parameters for values I wanted to pass in and variables for values I didn't need to pass in because they can be inferred. An example of an inferred value is the content format: when you define an API based on OpenAPI, you need to select between two formats - JSON and YAML. Since all JSON-based specifications start with an open brace, I used that as the distinguisher between the two.
Obviously, this simple module does not support many of the features of Azure API Management. However, it can easily be expanded to support them. I have a version that supports dependent named values and policy fragments, for example, so those are deployed alongside my API.
Yes, Bicep is one more thing to learn on the way to adopting safe deployment practices with CI/CD pipelines. However, it is a technology that makes your life simpler in the long run by focusing your attention on what matters about the API. Leaving the implementation details to a module lets you concentrate on the things that change between APIs and drives consistency across deployments.
My final point is that Bicep helps you describe cloud applications that rely on multiple Azure services. A typical single-page web application written in React with a web API backend might use Static Web Apps, API Management, App Service or Functions, Key Vault, Azure Active Directory, and a database like Cosmos DB. You can define this system in a single Bicep file, but it will be large. It's better to put each service in its own module and bring them together in a top-level Bicep file. This allows you to reuse your service definitions easily, the smaller files are more readable and maintainable, and the modular code allows you to deploy a part of the infrastructure or the whole infrastructure at once.
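The composition described above can be sketched as a top-level file that does nothing but wire modules together (the module paths and parameters here are assumptions for illustration):

```bicep
// main.bicep - composes per-service modules into one deployable system
param location string = resourceGroup().location

module apim './modules/apim.bicep' = {
  name: 'apim'
  params: { location: location }
}

module webApp './modules/staticwebapp.bicep' = {
  name: 'webapp'
  params: { location: location }
}

module database './modules/cosmosdb.bicep' = {
  name: 'database'
  params: { location: location }
}
```

Because each module owns one service, you can deploy `main.bicep` for the whole system or deploy a single module's file on its own when only that service changes.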