
Bring your own storage to Azure Maps


When working with geospatial data in Azure Maps for geofencing or indoor maps, you previously had to upload your data directly to our backend services, where Azure Maps would store it, process it, and make it available. If you already use Azure and have data in an existing storage account, you no longer need to upload that data again and create a duplicate in Azure Maps. With the new Azure Maps bring your own storage capability, you can securely register your existing data and give Azure Maps access to it. You keep full control over your data: where it is stored and how it is encrypted. This is especially useful when you have compliance requirements that your data stay in a specific geographic location, such as Europe.

 

Give Azure Maps permission to access your data

The Azure Maps Data Registration service was created to give you more control over your data. The Azure Maps Data Registry currently connects only to data stored in Azure Blob Storage, but we will expand this capability to other data sources later. There are three basic steps in the Azure Maps data registration process:

 

  1. Create or identify the Azure Storage account that will hold the data. 
  2. Create an Azure Maps datastore and add a link to your Azure Storage account.
  3. Assign roles and permissions to the Azure Maps datastore (by using a managed identity) to access your data.
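The registration in step 3 ultimately happens through a REST PUT against the Data Registry API. The following Python sketch only assembles the request URL and JSON body; the geography, data registry ID, subscription key, and payload field names are placeholders based on the public Data Registry reference, so verify them against the current API documentation before use:

```python
import json

# Placeholder values -- substitute your own account details.
GEOGRAPHY = "us"                    # Azure Maps geography, e.g. "us" or "eu"
UDID = "my-data-id"                 # your chosen data registry ID
SUBSCRIPTION_KEY = "<your-azure-maps-key>"

def build_registry_request(datastore_id: str, blob_url: str,
                           msi_client_id: str,
                           data_format: str = "geojson"):
    """Build the PUT URL and JSON body that register an existing blob
    with Azure Maps instead of uploading a duplicate copy."""
    url = (f"https://{GEOGRAPHY}.atlas.microsoft.com/dataRegistries/{UDID}"
           f"?api-version=2023-06-01"           # assumed GA api-version
           f"&subscription-key={SUBSCRIPTION_KEY}")
    body = {
        "kind": "AzureBlob",
        "azureBlob": {
            "dataFormat": data_format,          # e.g. "geojson" or "zip"
            "msiClientId": msi_client_id,       # user-assigned managed identity
            "linkedResource": datastore_id,     # datastore linked to the Maps account
            "blobUrl": blob_url,                # the existing blob to register
        },
    }
    return url, json.dumps(body)
```

Sending the result with an HTTP client (for example `requests.put(url, data=body, headers={"Content-Type": "application/json"})`) would start the registration; Azure Maps then validates the blob before making it available.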

 

[Diagram: the three data registration steps]

 

Azure Maps only has access to data you have registered using the Azure Maps Data Registry APIs. Access is controlled through Azure role-based access control (RBAC) using a managed identity (system-assigned or user-assigned). When you register your data, Azure Maps runs verification steps and creates a hash of the content to confirm the data has not been modified. Whenever you update your data, call the same API to re-register it and generate a new matching hash.
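The hash check can be illustrated with a small local sketch. This shows only the principle, assuming a SHA-256 digest; the post does not state which algorithm Azure Maps actually uses:

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Digest of the blob contents; any change to the data changes the hash."""
    return hashlib.sha256(data).hexdigest()

original = b'{"type": "FeatureCollection", "features": []}'
registered_hash = content_hash(original)          # stored at registration time

# Later: verify the blob still matches what was registered.
assert content_hash(original) == registered_hash  # unchanged data passes

modified = original + b" "                        # even a one-byte change...
assert content_hash(modified) != registered_hash  # ...fails verification
```

This is why re-registering after an update matters: the stored hash must be regenerated to match the new content.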

 

Azure Maps Registration Service Architecture

In the following architecture diagram, you see two resource groups: one for your Azure Maps account and one for your application and data. Before Azure Maps can access your storage account, you need to add a datastore to Azure Maps that references that storage account. You must also assign access rights to a security principal (a managed identity) with the Contributor and Storage Blob Data Reader roles. The last step is registering the data inside the storage account with Azure Maps. The Azure Maps Data Registry APIs are documented in the Azure Maps REST API reference.
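The role assignments described above can be scripted with the Azure CLI. This sketch only assembles the `az role assignment create` command strings; the principal ID and scope shown are placeholders, and you would run the printed commands in a shell after substituting real values:

```python
def role_assignment_commands(principal_id: str, storage_account_scope: str):
    """Assemble the Azure CLI calls that grant the Maps managed identity
    access to the linked storage account, using the built-in roles
    named in the architecture description."""
    roles = ["Contributor", "Storage Blob Data Reader"]
    return [
        f'az role assignment create --assignee "{principal_id}" '
        f'--role "{role}" --scope "{storage_account_scope}"'
        for role in roles
    ]

# Placeholder principal ID and storage account resource ID.
for cmd in role_assignment_commands(
        "00000000-0000-0000-0000-000000000000",
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
        "Microsoft.Storage/storageAccounts/<account>"):
    print(cmd)
```

Scoping the assignments to the single storage account (rather than the subscription) keeps the managed identity's reach limited to the data you actually registered.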

 

[Architecture diagram: an Azure Maps account with a datastore linked to a storage account in a separate resource group]

 

When Azure Maps processes your data, for example to create an indoor map with the Azure Maps Creator services, the outcome (a tileset) is still stored in the Azure Maps account and is not written back to your storage account. The Azure Storage and Azure Maps accounts can only be linked when they are within the same geographic boundary, which guarantees that your data does not leave your selected geography.

 

So why use the Azure Maps Data Registration service?

The key reasons are owning, locating, and controlling access to the data that matters to you, your product, and your customers. We created this capability based on customer requests to bring their own data rather than duplicate it in Azure Maps. Now you can register your data directly with Azure Maps.

 

Did you know you can use Azure Storage Explorer to transfer data to and from a storage account?
