Azure Communication Services June 2024 Feature Updates
Data Retention with Chat Threads
Developers can now create chat threads with a retention policy of between 30 and 90 days. This feature is in public preview.
The policy is optional: developers can continue to create chat threads with infinite retention (as always) or set a retention policy of between 30 and 90 days. If a thread needs to be kept for longer than 90 days, you can extend the retention period by using the update chat thread properties API. The policy is aimed at organizations that need to move chat data into their archives for historical purposes, or delete it within a given period, as part of their data management practices.
Existing chat threads are not affected by the policy.
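As an illustrative sketch, the snippet below creates a thread with a 60-day retention policy using the JavaScript Chat SDK. The retentionPolicy option shape (kind and deleteThreadAfterDays) follows the preview REST API, so treat the exact property names as assumptions and check your preview SDK version; the endpoint and token values are placeholders.

```typescript
import { ChatClient } from "@azure/communication-chat";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";

// Placeholders: supply your Communication Services endpoint and a user access token.
const endpoint = "https://<your-resource>.communication.azure.com";
const credential = new AzureCommunicationTokenCredential("<user-access-token>");
const chatClient = new ChatClient(endpoint, credential);

async function createThreadWithRetention(): Promise<void> {
  // Create a thread that is deleted 60 days after creation.
  // Omitting retentionPolicy keeps the existing behavior: infinite retention.
  const result = await chatClient.createChatThread(
    { topic: "Support conversation" },
    {
      // Preview-only option; property names mirror the preview REST API.
      retentionPolicy: { kind: "threadCreationDate", deleteThreadAfterDays: 60 },
    }
  );
  console.log(`Created thread ${result.chatThread?.id}`);
}

createThreadWithRetention().catch(console.error);
```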
For more information, see:
PowerPoint Live
Now generally available, PPT Live gives both the presenter and the audience an inclusive and engaging experience, combining the best parts of presenting in PowerPoint with the connection and collaboration of a Microsoft Teams meeting.
Meeting participants can now view PowerPoint Live sessions initiated by a Teams client using the Azure Communication Services Web UI Library. Participants can follow along with a presentation and view presenter annotations. Developers can use this capability through our composites, including CallComposite and CallWithChatComposite, as well as through components such as VideoGallery.
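There is no separate PPT Live API to call: viewing works automatically once a user joins the Teams meeting through one of the composites. A minimal sketch with the React UI Library is shown below; the identity, user access token, and Teams meeting link are placeholders.

```tsx
import React, { useMemo } from "react";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";
import { CallComposite, useAzureCommunicationCallAdapter } from "@azure/communication-react";

// Placeholders: supply a real ACS identity, user access token, and Teams meeting link.
const userId = { communicationUserId: "<acs-user-id>" };
const displayName = "Meeting participant";
const meetingLink = "<teams-meeting-join-url>";

function PptLiveViewer(): JSX.Element {
  const credential = useMemo(
    () => new AzureCommunicationTokenCredential("<user-access-token>"),
    []
  );

  // The adapter joins the Teams meeting; when a Teams presenter starts PowerPoint Live,
  // CallComposite renders the slides and presenter annotations for participants.
  const adapter = useAzureCommunicationCallAdapter({
    userId,
    displayName,
    credential,
    locator: { meetingLink },
  });

  return adapter ? <CallComposite adapter={adapter} /> : <div>Joining meeting…</div>;
}

export default PptLiveViewer;
```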
For more information, see Introducing PowerPoint Live in Microsoft Teams and Present from PowerPoint Live in Microsoft Teams.
Live Reactions
We have new APIs that let end users react with emojis during a meeting: like, love, applause, laugh, and surprise.
Now generally available, the updated UI Library composites and components include call reactions. The UI Library supports the following live call reactions: like 👍, love ❤️, applause 👏, laugh 😂, and surprise 😮.
Call reactions are associated with the participant who sends them and are visible to all participant types (in-tenant, guest, federated, anonymous). Call reactions are supported in all call types, such as Rooms, group calls, and meetings (scheduled, private, channel), of all sizes (small, large, extra-large).
Adding this feature encourages greater engagement within calls, as people can now react in real time without needing to speak or interrupt.
- Live call reactions added to CallComposite and CallWithChatComposite on web.
- Call reactions added at the component level.
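For developers building directly on the Calling SDK rather than the UI Library, the sketch below sends and receives reactions in a Teams meeting. It assumes the web Calling SDK's reaction feature (Features.Reaction); the event payload and reaction type names are taken from the documentation and may differ slightly across SDK versions, and the token and meeting link are placeholders.

```typescript
import { CallClient, Features } from "@azure/communication-calling";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";

async function reactInMeeting(): Promise<void> {
  // Placeholders: supply a user access token and a Teams meeting join link.
  const callClient = new CallClient();
  const callAgent = await callClient.createCallAgent(
    new AzureCommunicationTokenCredential("<user-access-token>"),
    { displayName: "Meeting participant" }
  );
  const call = callAgent.join({ meetingLink: "<teams-meeting-join-url>" });

  // Reaction feature of the call (name assumed from the web Calling SDK docs).
  const reactionFeature = call.feature(Features.Reaction);

  // Log reactions sent by other participants.
  reactionFeature.on("reaction", (event) => {
    console.log("Received reaction:", event.reactionMessage?.reactionType);
  });

  // Send a reaction; supported values include "like", "heart", "applause", "laugh", and "surprised".
  await reactionFeature.sendReaction({ reactionType: "like" });
}

reactInMeeting().catch(console.error);
```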
For more information, see Reactions.
Closed Captions
Promote accessibility by displaying text of the spoken audio in video calls. Already available for app-to-Teams calls, this general availability release adds support for closed captions in all app-to-app calls.
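A minimal sketch for enabling captions on an existing app-to-app call with the web Calling SDK is shown below. The feature, event, and property names follow the captions documentation (Features.Captions, CaptionsReceived, startCaptions), but verify them against your SDK version; treat this as an outline rather than a definitive implementation.

```typescript
import { Call, Features } from "@azure/communication-calling";

// Enable live captions on an existing ACS app-to-app call.
async function enableCaptions(call: Call): Promise<void> {
  const captionsFeature = call.feature(Features.Captions);
  const captions = captionsFeature.captions;

  // App-to-app calls use the ACS captions flavor ("Captions") rather than
  // Teams interop captions ("TeamsCaptions").
  if (captions.kind === "Captions") {
    // Print each caption line as it arrives.
    captions.on("CaptionsReceived", (info) => {
      console.log(`${info.speaker.displayName ?? "Speaker"}: ${info.spokenText}`);
    });

    // Start captions, specifying the spoken language of the call.
    await captions.startCaptions({ spokenLanguage: "en-us" });
  }
}
```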
For more information, see Closed Captions overview.
You can also learn more about Azure Communication Services interoperability with Teams.
Copilot for Call Diagnostics
AI can help app developers across every step of the development lifecycle: designing, building, and operating. Developers with Microsoft Copilot for Azure (public preview) can use Copilot within Call Diagnostics to understand and resolve a variety of calling issues. For example, developers can ask Copilot questions such as:
- How do I run network diagnostics in Azure Communication Services VoIP calls?
- How can I optimize my calls for poor network conditions?
- What are the common causes of poor media streams in Azure Communication calls?
- The video on my call didn’t work, how do I fix the subcode 41048?
Developers can use Call Diagnostics to understand call quality and reliability across the organization to deliver a great customer calling experience. Many issues can affect the quality of your calls, such as poor internet connectivity, software compatibility issues, and technical difficulties with devices.
Getting to the root cause of these issues can alleviate potentially frustrating situations for all call participants, whether they're a patient checking in for a doctor's call or a student taking a lesson with their teacher. Call Diagnostics enables developers to drill down into the data, identify root causes, and find a solution. You can use the built-in visualizations in the Azure portal or connect the underlying usage and quality data to your own systems.
For more information, see Call Diagnostics.
You can learn more about these updates and about Azure Communication Services, our communications platform as a service (CPaaS), in the overview.