Tutorial: Blob backup and restore using Azure Backup via Azure CLI

Credits: Kartik Pullabhota (Sr. PM for Automation, HANA, and Database backup using Azure Backup) for SME input, and Swathi Dhanwada (Customer Engineer, Tech Community) for testing.

 

Prerequisites 

If you don't already have an Azure subscription, create a free account before you begin. 

 

Azure CLI: 

  • Launch Cloud Shell from the top navigation of the Azure portal.

[Screenshot: Cloud Shell icon in the Azure portal top navigation]

  • Select a subscription to create a storage account and Microsoft Azure Files share.
  • Select Create storage.
  • After creation, check that the environment drop-down on the left-hand side of the shell window says Bash.

Note: Support for Azure Blobs backup and restore via CLI is in preview and is available as an extension for Azure CLI version 2.15.0 and later. The extension is automatically installed when you run the az dataprotection commands. Learn more about extensions.

 

Create resource group: 

  • To create a resource group from the Bash session within Cloud Shell, run the following:

RGNAME='<your-resource-group-name>'
LOCATION='<your-location>'

az group create --name $RGNAME --location $LOCATION

  • To retrieve the properties of the newly created resource group, run the following:

az group show --name $RGNAME

Create storage account 

  • Create a general-purpose storage account with the az storage account create command. The general-purpose storage account can be used for all four services: blobs, files, tables, and queues. 
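A minimal invocation of that command, reusing the resource-group and location variables from the earlier step; the account name, SKU, and kind are placeholder assumptions (storage account names must be globally unique, 3–24 lowercase letters and digits):

```shell
# Create a general-purpose v2 storage account with locally redundant storage
az storage account create \
  --name <storage-account> \
  --resource-group $RGNAME \
  --location $LOCATION \
  --sku Standard_LRS \
  --kind StorageV2
```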

Create storage container

az storage container create \
  --account-name <storage-account> \
  --name <container> \
  --auth-mode login

 

Create a sample file (blob) and upload to container  

  • To upload a blob to the storage container, you need the Storage Blob Data Contributor role. The following example uses your Azure AD account to authorize the operation. Before you upload the blob, assign the Storage Blob Data Contributor role to yourself. Even if you are the account owner, you need explicit permissions to perform data operations against the storage account. For more information about assigning Azure roles, see Assign an Azure role for access to blob data.

az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee "<object-id>" \
  --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

 

Note: To retrieve the object ID of the signed-in user, run the following command:

 

az ad signed-in-user show --query objectId -o tsv 

 

  • To create or open a file in Bash, run the vi command with the file name. For instance:

vi helloworld

  • When the file opens, press i (or the Insert key) to enter insert mode and type some content, for instance Hello world. Then press Esc, type :x, and press Enter to save and exit.
  • To upload a blob to the container you created in the last step, use the az storage blob upload command.  

az storage blob upload \
  --account-name <storage-account> \
  --container-name <container> \
  --name helloworld \
  --file helloworld \
  --auth-mode login

 

  • To verify that the blob was uploaded, run the following command:

az storage blob list \
  --account-name <storage-account> \
  --container-name <container> \
  --output table \
  --auth-mode login

 

Create backup vault 

az dataprotection backup-vault create -g <rgname> --vault-name <backupvaultname> -l westus --type SystemAssigned --storage-settings datastore-type="VaultStore" type="LocallyRedundant" 
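To confirm the vault was provisioned and that its system-assigned identity exists (which the permission step later relies on), a quick check; the identity.principalId query path is an assumption based on the standard ARM identity block in the response:

```shell
# Show the vault and print the principal ID of its system-assigned identity
az dataprotection backup-vault show \
  -g <rgname> \
  --vault-name <backupvaultname> \
  --query identity.principalId -o tsv
```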

 

Create backup policy for Azure Blobs

  • Create a backup policy that defines when a backup job runs and how long the recovery points are stored.

az dataprotection backup-policy get-default-policy-template --datasource-type AzureBlob > BlobPolicy.json

az dataprotection backup-policy create -g <rgname> --vault-name <backupvaultname> -n mypolicy --policy BlobPolicy.json

 

Grant required permissions to the backup vault

  • Operational backup also protects the storage account (that contains the blobs to be protected) from accidental deletions by applying a Backup-owned delete lock. This requires the backup vault to have certain permissions on the storage accounts that need to be protected. For convenience, these minimum permissions are consolidated into the Storage Account Backup Contributor role.
  • This role assignment can be performed via the Azure portal, PowerShell, or the CLI.
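A CLI sketch of that role assignment, granting Storage Account Backup Contributor on the target storage account to the vault's system-assigned identity; the nested show call and the scope string are assumptions to be replaced with your own values:

```shell
# Grant the vault's managed identity the minimum permissions for blob backup
az role assignment create \
  --role "Storage Account Backup Contributor" \
  --assignee-object-id "$(az dataprotection backup-vault show -g <rgname> --vault-name <backupvaultname> --query identity.principalId -o tsv)" \
  --assignee-principal-type ServicePrincipal \
  --scope "/subscriptions/<subscription>/resourceGroups/<rgname>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```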

Configure backup for Azure Blobs

az dataprotection backup-instance initialize --datasource-type AzureBlob -l southeastasia --policy-id "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/<rgname>/providers/Microsoft.DataProtection/backupVaults/<backupvaultname>/backupPolicies/BlobBackup-Policy" --datasource-id "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx/resourcegroups/blobrg/providers/Microsoft.Storage/storageAccounts/CLITestSA" > backup_instance.json

az dataprotection backup-instance create -g <rgname> --vault-name <backupvaultname> --backup-instance backup_instance.json
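Optionally, the configuration can be checked before the instance is created; a sketch assuming the az dataprotection backup-instance validate-for-backup subcommand and the backup_instance.json produced above:

```shell
# Validate that the vault can protect the datasource before committing
az dataprotection backup-instance validate-for-backup \
  -g <rgname> \
  --vault-name <backupvaultname> \
  --backup-instance backup_instance.json
```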

 

Restore Azure Blobs within a storage account

  • To identify the backup instance to restore from, list all instances in the vault, then retrieve the details of the relevant one:

az dataprotection backup-instance list --resource-group <rgname> --vault-name <backupvaultname>

az dataprotection backup-instance show --resource-group <rgname> --vault-name <backupvaultname> --name <backup-instance-name obtained from previous step>
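A sketch for capturing the instance name into a shell variable instead of copying it by hand; the [0] index is an assumption that the vault holds a single backup instance:

```shell
# Grab the name of the first backup instance in the vault
INSTANCE_NAME=$(az dataprotection backup-instance list \
  --resource-group <rgname> --vault-name <backupvaultname> \
  --query "[0].name" -o tsv)
echo $INSTANCE_NAME
```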

 

Initialize restore operation

  • Because operational backup for blobs is continuous, there are no distinct restore points. Instead, you fetch the valid time range within which the blobs can be restored to any point in time. To check for valid time ranges within the last 30 days, use the az dataprotection restorable-time-range find command, as shown below, with the backup instance identified in the earlier step.

az dataprotection restorable-time-range find --start-time 2021-05-30T00:00:00 --end-time 2021-05-31T00:00:00 --source-data-store-type OperationalStore -g <rgname> --vault-name <backupvaultname> --backup-instances <backup instance id retrieved from previous step> 

  • Restoring all the blobs to a point-in-time  

az dataprotection backup-instance restore initialize-for-data-recovery --datasource-type AzureBlob --restore-location southeastasia --source-datastore OperationalStore --target-resource-id "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx/resourcegroups/<rgname>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>" --point-in-time 2021-06-02T18:53:44.4465407Z > restore.json 

  • Restoring selected containers 

az dataprotection backup-instance restore initialize-for-item-recovery --datasource-type AzureBlob --restore-location southeastasia --source-datastore OperationalStore --backup-instance-id "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx/resourceGroups/<rgname>/providers/Microsoft.DataProtection/backupVaults/<backupvaultname>/backupInstances/<backup instance id retrieved from previous step>" --point-in-time 2021-06-02T18:53:44.4465407Z --container-list container1 container2 > restore.json

  • Restoring containers using a prefix match 

az dataprotection backup-instance restore initialize-for-item-recovery --datasource-type AzureBlob --restore-location southeastasia --source-datastore OperationalStore --backup-instance-id "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx/resourceGroups/<rgname>/providers/Microsoft.DataProtection/backupVaults/<backupvaultname>/backupInstances/<backup instance id retrieved from previous step>" --point-in-time 2021-06-02T18:53:44.4465407Z --from-prefix-pattern container1/text1 container2/text4 --to-prefix-pattern container1/text4 container2/text41 > restore.json

 

Note: The point-in-time value is interpreted as UTC.

 

Trigger the restore 

The following command triggers the restore operation:

az dataprotection backup-instance restore trigger -g <rgname> --vault-name <backupvaultname> --backup-instances <backup instance id retrieved from previous step> --restore-request-object restore.json 

 

Track jobs

az dataprotection job list-from-resourcegraph --datasource-type AzureBlob --operation Restore 
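The output can be narrowed further; a sketch assuming the --status filter of az dataprotection job list-from-resourcegraph:

```shell
# List only restore jobs for blobs that have completed
az dataprotection job list-from-resourcegraph \
  --datasource-type AzureBlob \
  --operation Restore \
  --status Completed
```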

 
