
How to copy all Azure Storage Tables data between two different Storage Accounts with Python


Background

This article describes how to copy all Azure Storage Tables data between two different storage accounts.

 

For this, we will use the Azure Storage SDK for Python to copy all tables (and their data) from one Azure Storage Account to another. This approach keeps the data in the source tables and creates new tables with the respective data in the destination Storage Account.

 

This script was developed and tested with the following versions, but it is expected to work with previous versions as well (a snippet to check your installed versions follows the list):

  • Python 3.11.7
  • azure-data-tables (version: 12.5.0)
  • azure-core (version: 1.30.1)
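
If you want to confirm which versions are installed in your environment, the short check below can help. This is a minimal sketch that uses Python's standard importlib.metadata module; the package names match the ones listed above.

from importlib.metadata import version, PackageNotFoundError

# Print the installed version of each package used in this article
for package in ("azure-data-tables", "azure-core"):
    try:
        print(f"{package}: {version(package)}")
    except PackageNotFoundError:
        print(f"{package}: not installed")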

 

Approach

 

In this section, you can find sample code to copy all table data between two Storage Accounts using the Azure Storage SDK for Python.

 

This Python sample code is based on the Azure Storage SDK for Python. Please review the documentation here: Azure Tables client library for Python | Microsoft Learn.
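
For reference, all the interaction with the Table service in this article goes through the TableServiceClient class. The minimal sketch below only creates a client from a connection string and lists the tables in the account; the connection string value is a placeholder that you need to replace with your own.

from azure.data.tables import TableServiceClient

# Placeholder - replace with the connection string of your Storage Account
connection_string = "X"

table_service = TableServiceClient.from_connection_string(conn_str=connection_string)

# List the tables in the account to confirm the client is working
for table in table_service.list_tables():
    print(table.name)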

 

Prerequisites

 

Download or use any Python IDE of your choice.

  • On the Python side, we will use the following packages:
    • azure-data-tables (more information here: azure-data-tables · PyPI). To install, run:
      pip install azure-data-tables
    • azure-core (more information here: azure-core · PyPI). To install, run:
      pip install azure-core

 

Please see below the sample code to copy all table data between two Azure Storage Accounts using the storage connection strings.

 

Special note: Only tables that do not exist with the same name in the destination Storage Account will be copied.

 

 

from azure.data.tables import TableServiceClient
from azure.core.exceptions import ResourceExistsError

source_connection_string = "X"
destination_connection_string = "X"

# Create a TableServiceClient for both source and destination accounts
source_table_service = TableServiceClient.from_connection_string(conn_str=source_connection_string)
destination_table_service = TableServiceClient.from_connection_string(conn_str=destination_connection_string)

for table in source_table_service.list_tables():
    source_table_client = source_table_service.get_table_client(table_name=table.name)
    destination_table_client = destination_table_service.get_table_client(table_name=table.name)
    try:
        # Create the destination table (raises ResourceExistsError if it already exists)
        destination_table_client.create_table()

        # Fetch entities from the source table
        entities = source_table_client.list_entities()

        # Insert entities into the destination table
        for entity in entities:
            destination_table_client.create_entity(entity=entity)

        print(f"Table '{table.name}' copied")
    except ResourceExistsError:
        print(f"Table '{table.name}' already exists.")
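
As noted in the special note above, a table that already exists in the destination Storage Account is skipped entirely, because create_table raises ResourceExistsError before any entities are copied. If you also want to copy entities into tables that already exist, a variation along the lines of the sketch below could be used. This is an assumption-based alternative, not part of the original sample: it relies on create_table_if_not_exists and upsert_entity from the same azure-data-tables package, and it overwrites destination entities that have the same PartitionKey and RowKey.

from azure.data.tables import TableServiceClient, UpdateMode

source_connection_string = "X"
destination_connection_string = "X"

source_table_service = TableServiceClient.from_connection_string(conn_str=source_connection_string)
destination_table_service = TableServiceClient.from_connection_string(conn_str=destination_connection_string)

for table in source_table_service.list_tables():
    source_table_client = source_table_service.get_table_client(table_name=table.name)

    # Create the destination table only if it is missing; reuse it either way
    destination_table_client = destination_table_service.create_table_if_not_exists(table_name=table.name)

    # Upsert so existing entities are replaced instead of raising an error
    for entity in source_table_client.list_entities():
        destination_table_client.upsert_entity(entity=entity, mode=UpdateMode.REPLACE)

    print(f"Table '{table.name}' copied")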

 

 

After executing this sample code, it is expected that you will find all the tables from the source Storage Account in the destination Storage Account, as well as the data from those tables.
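
If you want to verify the result programmatically, a quick comparison of the table names on both sides can help. The sketch below reuses the same placeholder connection strings as the sample above.

from azure.data.tables import TableServiceClient

source_connection_string = "X"
destination_connection_string = "X"

source_table_service = TableServiceClient.from_connection_string(conn_str=source_connection_string)
destination_table_service = TableServiceClient.from_connection_string(conn_str=destination_connection_string)

# Collect the table names on both sides and report any that were not copied
source_tables = {table.name for table in source_table_service.list_tables()}
destination_tables = {table.name for table in destination_table_service.list_tables()}

missing = source_tables - destination_tables
print(f"Source tables: {len(source_tables)}, destination tables: {len(destination_tables)}")
print(f"Tables missing in the destination: {missing or 'none'}")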

 

Disclaimer:

  • These steps are provided for the purpose of illustration only. 
  • These steps and any related information are provided "as is" without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.
  • We grant You a nonexclusive, royalty-free right to use and modify the Steps and to reproduce and distribute the steps, provided that You agree:
    • to not use Our name, logo, or trademarks to market Your software product in which the steps are embedded;
    • to include a valid copyright notice on Your software product in which the steps are embedded; and
    • to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of steps.
