Build a conversational SMS bot with Azure Communication Services and Azure OpenAI
It is no secret that large language models (LLMs) like ChatGPT have been all the rage in the last couple of months. These conversational models offer seamless and intuitive interfaces for users to interact with, enabling them to easily ask questions or carry out tasks. The Azure Communication Services team strives to provide developers with the tools to integrate these conversational entities with communication channels, and delight end users with exceptional support. In this blog we will show you how to use the Azure Communication Services SMS capability and Azure OpenAI Service's GPT-3 model to light up a personalized end-user interaction scenario.
In today's blog, we are building an Obi-Wan Kenobi conversational bot powered by Azure OpenAI and Azure Communication Services SMS. See the preview:
To follow along, you will need:
- An Azure account with an active subscription. Create an account for free.
- An active Communication Services resource and connection string. Create a Communication Services resource.
- An SMS-enabled telephone number. Get a phone number.
- Enable the Event Grid resource provider on your subscription. See instructions.
- Create an Azure OpenAI resource. See instructions.
- Deploy an Azure OpenAI model. See instructions.
You can find the finished code for this blog on GitHub.
This application will leverage Azure Event Grid to listen for incoming text messages to your Azure Communication Services number, and an Azure Function to process the event and respond with an Azure OpenAI-generated response.
We will start by configuring an Azure Function to receive Azure Event Grid events. To create the Azure Function, you can follow the instructions to set it up directly in Visual Studio Code. (Ensure that the Azure Function is of type EventGridTrigger.) In this blog, we will jump ahead and show the configured Azure Function.
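The original post shows the configured function as a screenshot; as a stand-in, here is a minimal sketch of the event-handling logic. The payload fields follow the Microsoft.Communication.SMSReceived event schema ("from", "to", "message"); the function name and sample values are assumptions.

```python
# Minimal sketch of handling the "data" payload of a
# Microsoft.Communication.SMSReceived Event Grid event.

def parse_sms_event(event_data: dict) -> tuple[str, str]:
    """Return (sender number, message text) from the event payload."""
    return event_data["from"], event_data["message"]

# Example payload, shaped like the "data" section of an SMSReceived event:
sample_event = {
    "messageId": "00000000-0000-0000-0000-000000000000",
    "from": "+14255550123",
    "to": "+18335550124",
    "message": "Hello there!",
    "receivedTimestamp": "2023-04-05T18:41:00Z",
}

sender, incoming_message = parse_sms_event(sample_event)
```

In the deployed EventGridTrigger function, the payload comes from the trigger binding rather than a hand-built dictionary.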
Next, we will add a call to Azure OpenAI to ask our model to generate a response. We will use REST APIs to POST a request with our prompt. For the prompt, we will use a combination of the message sent by the user and a pre-designed text. In this example, we want the GPT-3 model to act like Obi Wan Kenobi. We added some sample quotes for the model to draw inspiration from. These quotes help guide the model’s response and provide a more intuitive and conversational flow for the users.
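The REST call above can be sketched as follows. The resource endpoint, deployment name, and API key are placeholders you must replace, and the api-version shown is one that supported the completions API at the time of writing.

```python
import json

# Placeholders — substitute your own resource, deployment, and key.
ENDPOINT = "https://YOUR-RESOURCE.openai.azure.com"
DEPLOYMENT = "YOUR-DEPLOYMENT-NAME"
API_KEY = "YOUR-AZURE-OPENAI-KEY"

def build_completion_request(prompt: str) -> tuple[str, bytes, dict]:
    """Build the URL, body, and headers for a completions POST request."""
    url = (
        f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}/completions"
        "?api-version=2022-12-01"
    )
    body = json.dumps({"prompt": prompt, "max_tokens": 100}).encode("utf-8")
    headers = {"Content-Type": "application/json", "api-key": API_KEY}
    return url, body, headers

url, body, headers = build_completion_request("Hello there!")
# The request can then be sent with urllib.request, for example:
#   req = urllib.request.Request(url, data=body, headers=headers)
#   with urllib.request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["text"].strip()
```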
We concatenate the prompt with the user’s message.
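A sketch of that concatenation is below. The persona text and sample quotes are illustrative stand-ins, not the post's exact prompt.

```python
# Pre-designed persona text with sample quotes for the model to draw on.
# The exact wording here is an illustrative assumption.
PERSONA_PROMPT = (
    "You are Obi-Wan Kenobi, a wise Jedi Master. Reply in his voice.\n"
    "Sample quotes:\n"
    '- "The Force will be with you. Always."\n'
    '- "In my experience, there is no such thing as luck."\n'
)

def build_prompt(user_message: str) -> str:
    """Concatenate the pre-designed text with the user's message."""
    return f"{PERSONA_PROMPT}\nUser: {user_message}\nObi-Wan:"

prompt = build_prompt("Hello there!")
```

The trailing "Obi-Wan:" cue nudges the completion model to answer in character.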
Finally, we will configure our SMS client to respond with the new response generated by Azure OpenAI. You will need your Azure Communication Services connection string to initialize the SMS client. You can either paste the connection string directly in the code or place it inside the local.settings.json file in your Azure Function directory, under Values.
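A local.settings.json along these lines would work; the setting name and runtime value are assumptions you should adapt to your project.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "ACS_CONNECTION_STRING": "endpoint=https://YOUR-RESOURCE.communication.azure.com/;accesskey=YOUR-KEY"
  }
}
```

Settings under Values are exposed to the function as environment variables when running locally.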
Then we will modify the function itself to add our SMS client logic.
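The reply path can be sketched as below. In the deployed function, `sms_client` would be an `azure.communication.sms.SmsClient` created with `SmsClient.from_connection_string(connection_string)`; here a stub with the same `send(...)` shape keeps the sketch self-contained.

```python
# Sketch of the reply path: send the generated reply back to the sender.

def reply_to_sender(sms_client, acs_number: str, sender: str, reply: str):
    """Send `reply` from our ACS number back to whoever texted us."""
    return sms_client.send(from_=acs_number, to=sender, message=reply)

class _StubSmsClient:
    """Stands in for SmsClient so the sketch runs without a live resource."""
    def send(self, from_, to, message):
        return {"from": from_, "to": to, "message": message}

result = reply_to_sender(
    _StubSmsClient(),
    acs_number="+18335550124",   # your ACS phone number
    sender="+14255550123",       # the incoming sender
    reply="Hello there! The Force will be with you.",
)
```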
To run the function locally, simply press F5 in Visual Studio Code. We will use ngrok to connect our locally running Azure Function to Azure Event Grid. You will need to download ngrok for your environment. Once the function is running, we will configure ngrok.
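The configuration is a single command; the port below assumes the Functions host default of 7071.

```shell
# Expose the locally running Functions host (default port 7071):
ngrok http 7071
# Copy the https "Forwarding" URL that ngrok prints.
```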
Copy the forwarding link that ngrok provides for your running function.
Finally, we configure SMS events through Event Grid in your Azure Communication Services resource. We will do this using the Azure CLI. You will need the Azure Communication Services resource ID found in the Azure Portal. (The resource ID will look something like: /subscriptions/<<AZURE SUBSCRIPTION ID>>/resourceGroups/<<RESOURCE GROUP NAME>>/providers/Microsoft.Communication/CommunicationServices/<<RESOURCE NAME>>)
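A sketch of the Azure CLI call is below. The subscription name, ngrok hostname, and functionName value are placeholders for your own; the endpoint path is the Functions host's Event Grid webhook for locally running EventGridTrigger functions.

```shell
# Subscribe the function to SMSReceived events from the ACS resource.
az eventgrid event-subscription create \
  --name "SmsIncomingToFunction" \
  --source-resource-id "/subscriptions/<AZURE SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP NAME>/providers/Microsoft.Communication/CommunicationServices/<RESOURCE NAME>" \
  --endpoint "https://<YOUR-NGROK-ID>.ngrok.io/runtime/webhooks/EventGrid?functionName=<YOUR-FUNCTION-NAME>" \
  --included-event-types "Microsoft.Communication.SMSReceived"
```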
Now that everything is hooked up, test the flow by sending an SMS to the phone number in the Azure Communication Services resource.