Build a conversational SMS bot with Azure Communication Services and Azure OpenAI
It is no secret that large language models (LLMs) like ChatGPT have been all the rage in the last couple of months. These conversational models offer seamless and intuitive interfaces, enabling users to easily ask questions or carry out tasks. The Azure Communication Services team strives to provide developers with the tools to integrate these conversational entities with communication channels and delight end users with exceptional support. In this blog, we will show you how to use the Azure Communication Services SMS capability and the Azure OpenAI Service GPT-3 model to build a personalized end-user interaction scenario.
In today's blog, we are building an Obi-Wan Kenobi conversational bot powered by Azure OpenAI and Azure Communication Services SMS. See the preview:
To follow along, you will need:
- An Azure account with an active subscription. Create an account for free.
- An active Communication Services resource and connection string. Create a Communication Services resource.
- An SMS-enabled telephone number. Get a phone number.
- Enable the Event Grid resource provider on your subscription. See instructions.
- Create an Azure OpenAI resource. See instructions.
- Deploy an Azure OpenAI model. See instructions.
You can find the finished code for this blog on GitHub.
This application will leverage Azure Event Grid to listen for incoming text messages to your Azure Communication Services number, and an Azure Function to process the event and respond with an Azure OpenAI-generated reply.
We will start by configuring an Azure Function to receive Azure Event Grid events. To create the Azure Function, you can follow the instructions to set it up directly in Visual Studio Code. (Ensure the Azure Function is of type EventGridTrigger.) In this blog, we will jump ahead and show the configured Azure Function.
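The finished code linked above is the reference; as a minimal sketch (assuming the Python programming model v2 and a function name of sms_bot, both chosen here for illustration), an Event Grid–triggered function looks roughly like this:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Fires whenever Event Grid delivers an event to this function; for our
# subscription that will be Microsoft.Communication.SMSReceived events.
@app.event_grid_trigger(arg_name="event")
def sms_bot(event: func.EventGridEvent):
    payload = event.get_json()  # SMSReceived data: from, to, message, ...
    logging.info("SMS from %s: %s", payload["from"], payload["message"])
```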
Next, we will add a call to Azure OpenAI to ask our model to generate a response. We will use REST APIs to POST a request with our prompt. For the prompt, we will use a combination of the message sent by the user and a pre-designed text. In this example, we want the GPT-3 model to act like Obi Wan Kenobi. We added some sample quotes for the model to draw inspiration from. These quotes help guide the model’s response and provide a more intuitive and conversational flow for the users.
We concatenate the prompt with the user’s message.
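As a rough sketch of that call (the app setting names and the prompt text below are our own illustrative choices; the request shape is the Azure OpenAI completions REST API for GPT-3 deployments):

```python
import os
import requests  # add to requirements.txt for the Function app

# Assumed app settings; point these at your Azure OpenAI resource and deployment.
AOAI_ENDPOINT = os.environ["AZURE_OPENAI_ENDPOINT"]      # e.g. https://<resource>.openai.azure.com
AOAI_KEY = os.environ["AZURE_OPENAI_KEY"]
AOAI_DEPLOYMENT = os.environ["AZURE_OPENAI_DEPLOYMENT"]  # your GPT-3 model deployment name

# Pre-designed text that steers the model toward Obi-Wan Kenobi's voice,
# with a couple of sample quotes for it to draw inspiration from.
BASE_PROMPT = (
    'You are Obi-Wan Kenobi, a wise Jedi Master. Answer in his voice.\n'
    'Sample quotes: "The Force will be with you, always." '
    '"In my experience, there is no such thing as luck."\n'
    'User: '
)

def generate_reply(user_message: str) -> str:
    """POST the combined prompt to the Azure OpenAI completions endpoint."""
    url = (
        f"{AOAI_ENDPOINT}/openai/deployments/{AOAI_DEPLOYMENT}/completions"
        "?api-version=2022-12-01"
    )
    body = {
        # Concatenate the pre-designed prompt with the user's message.
        "prompt": BASE_PROMPT + user_message + "\nObi-Wan:",
        "max_tokens": 100,
        "temperature": 0.7,
    }
    response = requests.post(url, headers={"api-key": AOAI_KEY}, json=body)
    response.raise_for_status()
    return response.json()["choices"][0]["text"].strip()
```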
Finally, we will configure our SMS client to respond with the reply generated by Azure OpenAI. You will need your Azure Communication Services connection string to initialize the SMS client. You can either paste the connection string directly into the code or place it inside the local.settings.json file in your Azure Function directory under Values, as shown in the sketch below.
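For example (the setting name COMMUNICATION_SERVICES_CONNECTION_STRING is our own; any key you add under Values in local.settings.json is exposed to the function as an environment variable):

```python
import os
from azure.communication.sms import SmsClient

# Read the connection string from app settings rather than hard-coding it.
connection_string = os.environ["COMMUNICATION_SERVICES_CONNECTION_STRING"]
sms_client = SmsClient.from_connection_string(connection_string)
```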
Then we will modify the function itself to add our SMS client logic.
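Putting it together, a sketch of the modified function (reusing the generate_reply helper and sms_client from the snippets above) might look like this:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

@app.event_grid_trigger(arg_name="event")
def sms_bot(event: func.EventGridEvent):
    # The SMSReceived event carries the sender, the ACS number, and the text.
    payload = event.get_json()
    user_number = payload["from"]
    acs_number = payload["to"]
    user_message = payload["message"]

    # Ask Azure OpenAI for an Obi-Wan style reply (generate_reply defined above).
    reply = generate_reply(user_message)

    # Text the generated reply back to the sender from our ACS number.
    sms_client.send(from_=acs_number, to=user_number, message=reply)
    logging.info("Replied to %s", user_number)
```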
To run the function locally, simply press F5 in Visual Studio Code. We will use ngrok to hook our locally running Azure Function up to Azure Event Grid. You will need to download ngrok for your environment. Once the function is running, we will configure ngrok.
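Assuming the Functions host is listening on its default local port of 7071, starting the tunnel looks like this:

```bash
ngrok http 7071
```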
Copy the forwarding URL that ngrok prints; it is the public address of your locally running function.
Finally, we subscribe to SMS events through Event Grid on your Azure Communication Services resource. We will do this using the Azure CLI. You will need the Azure Communication Services resource ID, which you can find in the Azure Portal. (The resource ID will look something like: /subscriptions/<<AZURE SUBSCRIPTION ID>>/resourceGroups/<<RESOURCE GROUP NAME>>/providers/Microsoft.Communication/CommunicationServices/<<RESOURCE NAME>>)
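A sketch of the subscription command (the subscription name is arbitrary, the functionName query parameter must match your function name, here the sms_bot from the sketch above, and <NGROK URL> is the forwarding URL you copied):

```bash
az eventgrid event-subscription create \
  --name "ObiWanSmsSubscription" \
  --source-resource-id "<ACS RESOURCE ID>" \
  --endpoint-type webhook \
  --endpoint "https://<NGROK URL>/runtime/webhooks/EventGrid?functionName=sms_bot" \
  --included-event-types Microsoft.Communication.SMSReceived
```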
Now that everything is hooked up, test the flow by sending an SMS to the phone number in the Azure Communication Services resource.