
How to migrate all Azure Storage Queue data between two different Storage Accounts with Python

Background

This article describes how to migrate all Azure Storage Queue data between two different storage accounts.

 

For this, we will use the Azure Storage SDK for Python to copy all queues (and their messages) from one storage account to another. This approach keeps the data in the source queues: messages are received without being deleted, so they are only hidden for the duration of the visibility timeout, and new queues with the respective messages are created in the destination storage account.

 

This script was developed and tested with the following versions, but it is also expected to work with earlier versions:

  • Python 3.11.7
  • azure-identity (version: 1.15.0)
  • azure-storage-queue (version: 12.9.0)
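
If you want to confirm which versions are installed in your environment, here is a minimal sketch that uses only Python's standard library:

from importlib.metadata import version

# Print the installed version of each SDK package used by this script
for package in ("azure-identity", "azure-storage-queue"):
    print(package, version(package))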


Approach

 

In this section, you can find sample code to copy all queue data between two storage accounts using the Azure Storage SDK for Python.

 

This Python sample code is based on the Azure Storage SDK for Python. For more details, please review our documentation: Quickstart: Azure Queue Storage client library for Python.

 

Prerequisites

 

Install Python and use any Python IDE of your choice.

  • On the Python side, we will use the following packages:
    • azure-identity (more information here: azure-identity · PyPI). To install, please run:
      pip install azure-identity
    • azure-storage-queue (more information here: azure-storage-queue · PyPI). To install, please run:
      pip install azure-storage-queue
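
Rather than hard-coding the storage connection strings into the script below, you may prefer to read them from environment variables. A minimal sketch follows; the variable names are assumptions chosen for illustration, not anything required by the SDK:

import os

# Hypothetical environment variable names; set them before running the script
source_connection_string = os.environ["SOURCE_STORAGE_CONNECTION_STRING"]
target_connection_string = os.environ["TARGET_STORAGE_CONNECTION_STRING"]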

 

Please see below the sample code to copy all queue data between two Azure Storage accounts using the storage connection strings.

 

from azure.storage.queue import QueueServiceClient

try:
    # Connection strings for the source and target storage accounts
    source_connection_string = "XXXX"
    target_connection_string = "XXXX"

    # Create a QueueServiceClient for both the source and target accounts
    source_client = QueueServiceClient.from_connection_string(source_connection_string)
    target_client = QueueServiceClient.from_connection_string(target_connection_string)

    # List all queues from the source account
    for queue in source_client.list_queues():
        print(queue.name)

        # Create the same queue in the target account
        target_queue_client = target_client.create_queue(queue.name)

        # Read the messages from the source queue (without deleting them)
        for message in source_client.get_queue_client(queue.name).receive_messages():
            # Add the message to the target queue
            target_queue_client.send_message(message.content)

    print("Data migration completed successfully!")
except Exception as ex:
    print("Exception:")
    print(ex)
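
The azure-identity package listed in the prerequisites is only needed if you prefer to authenticate with Microsoft Entra ID instead of connection strings. As a minimal sketch, assuming the identity running the script has the "Storage Queue Data Contributor" role on both accounts (the account URLs below are placeholders), the two clients could be created like this, with the rest of the sample unchanged:

from azure.identity import DefaultAzureCredential
from azure.storage.queue import QueueServiceClient

# Assumption: DefaultAzureCredential resolves to an identity with the
# "Storage Queue Data Contributor" role on both storage accounts
credential = DefaultAzureCredential()

source_client = QueueServiceClient(
    account_url="https://<source-account>.queue.core.windows.net",
    credential=credential,
)
target_client = QueueServiceClient(
    account_url="https://<target-account>.queue.core.windows.net",
    credential=credential,
)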

 

 

After executing this sample code, you should find all the queues from the source storage account, along with their messages, in the destination storage account.
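
To verify the result, here is a small sketch that lists the queues now present in the target account and prints the approximate number of messages in each, reusing the target_client from the sample above:

# Print each target queue name with its approximate message count
for queue in target_client.list_queues():
    properties = target_client.get_queue_client(queue.name).get_queue_properties()
    print(queue.name, properties.approximate_message_count)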

 

Disclaimer:

  • These steps are provided for the purpose of illustration only. 
  • These steps and any related information are provided "as is" without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.
  • We grant You a nonexclusive, royalty-free right to use and modify the steps and to reproduce and distribute the steps, provided that You agree:
    • to not use Our name, logo, or trademarks to market Your software product in which the steps are embedded;
    • to include a valid copyright notice on Your software product in which the steps are embedded; and
    • to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of steps.

 

 
