
Utilizing Azure Storage and Runbooks for scheduled automated backups of Azure SQL Databases


In this article, we provide detailed steps to create a scheduled Azure SQL Database backup to a storage account using automation. This is a useful technique for maintaining regular backups of your database and storing them in a secure and accessible location. You will get an actual backup of the Azure SQL Database stored in a storage account in .bacpac format, which you can restore or migrate as needed. The automation process involves creating an Automation account that triggers a PowerShell script through a runbook to run the backup command and save the output to a blob container.

 

Prerequisites

  1. Azure Storage account: The storage account is needed to host the database backups. You are required to set up a container within this account to store the backups.

 

  2. Azure SQL Database and Server: This is the database you will back up, along with its hosting server. Ensure that you grant Azure services permission to access the server where this database is hosted by selecting "Allow Azure services and resources to access this server" within the networking section, as illustrated in the following documentation: Network Access Controls - Azure SQL Database & Azure Synapse Analytics | Microsoft Learn

 

  3. Azure Automation account and PowerShell Workflow runbook: These components are used to set up automatic backups and their scheduling. Make sure to use PowerShell version 7.2 or above when creating the runbook. To learn more about setting up a PowerShell Workflow runbook in Azure Automation, please check the following documentation: Tutorial - Create a PowerShell Workflow runbook in Azure Automation | Microsoft Learn
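As a sketch of the first prerequisite, the backup container can also be created with Az PowerShell instead of through the portal. The account and container names below are placeholders; this assumes the Az.Storage module and an already-authenticated session:

```powershell
# Assumes Connect-AzAccount has already been run; names are hypothetical placeholders
$ctx = New-AzStorageContext -StorageAccountName "mybackupstorage" -UseConnectedAccount

# Create a private container to hold the .bacpac backups
New-AzStorageContainer -Name "sqlbackups" -Permission Off -Context $ctx
```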

Setup

Once all the necessary prerequisites are in place, you should navigate to the runbook and click on "Edit." Then choose the "Edit in portal" option as illustrated below:

 

[Screenshot: the runbook's "Edit in portal" option (hqaffesha_0-1722506660955.png)]

 

 The editor interface for the runbook will launch, displaying the runbook you created earlier. Enter the following code into the editor area:

 

 

 

# Connect to Azure with the system-assigned managed identity
Connect-AzAccount -Identity

# Set and store context
$AzureContext = Set-AzContext -SubscriptionId "*****"

# Resource group name
$resourceGroup = "*****"

# Storage account name that will hold the backups
$storageAccountName = "*****"

# Storage account access key
$storageKey = "*****"

# Container name that will hold the backups
$containerName = "*****"

# Storage blob URI with a datetime suffix
$storageUri = "https://$storageAccountName.blob.core.windows.net/$containerName/db-$(Get-Date -UFormat "%Y-%m-%d_%H-%M-%S").bacpac"

# Storage access key type
$storageKeyType = "StorageAccessKey"

# SQL server name
$server_name = "*****"

# Database name to be exported
$SQL_db = "*****"

# SQL auth username
$SQL_username = "*****"

# SQL auth password
$SQL_secure_secret = ConvertTo-SecureString -String "*****" -AsPlainText -Force

# Run the export job with the required parameters
New-AzSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $server_name -DatabaseName $SQL_db -StorageKeyType $storageKeyType -StorageKey $storageKey -StorageUri $storageUri -AdministratorLogin $SQL_username -AdministratorLoginPassword $SQL_secure_secret

 

Replace all the ***** placeholders with the appropriate values for your environment, then press the "Save" button at the top left corner. The parameters cover the subscription and resource group, as well as the SQL database, server, and storage account details. Each parameter is explained in a comment in the code snippet above.

 

If you have additional security requirements, you can store the secrets used in the script in an Azure Key Vault and retrieve them as needed within the PowerShell script. This approach ensures that sensitive information is securely managed and reduces the risk of exposure.
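As a sketch of that Key Vault approach, the hard-coded secrets in the script could be replaced with lookups like the following. The vault and secret names are hypothetical; this assumes the Az.KeyVault module and a managed identity granted "get" permission on secrets:

```powershell
# Authenticate with the Automation account's managed identity
Connect-AzAccount -Identity

# Retrieve the SQL password directly as a SecureString
$SQL_secure_secret = (Get-AzKeyVaultSecret -VaultName "my-backup-kv" -Name "sql-admin-password").SecretValue

# The storage key can be fetched as plain text for the -StorageKey parameter
$storageKey = Get-AzKeyVaultSecret -VaultName "my-backup-kv" -Name "storage-access-key" -AsPlainText
```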

 

Select the test button on the toolbar, which will display the test pane so you can try the script before scheduling it. Hit the start button and monitor the run; if it succeeds, you should observe an outcome like this:

 

[Screenshot: successful test run output (hqaffesha_0-1723372283914.png)]

 

 

 

You can monitor the progress of backups and access the export history through the SQL server in the "Import/Export history" section, as illustrated below:

 

[Screenshot: "Import/Export history" section on the SQL server (hqaffesha_2-1722506735341.png)]
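Rather than watching the portal, the runbook itself can poll the export operation. A sketch, assuming the same variables as the script above: New-AzSqlDatabaseExport returns an object whose OperationStatusLink can be passed to Get-AzSqlDatabaseImportExportStatus:

```powershell
# Capture the object returned by the export cmdlet (same parameters as in the script above)
$exportRequest = New-AzSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $server_name `
    -DatabaseName $SQL_db -StorageKeyType $storageKeyType -StorageKey $storageKey `
    -StorageUri $storageUri -AdministratorLogin $SQL_username -AdministratorLoginPassword $SQL_secure_secret

# Poll until the export leaves the "InProgress" state ("Succeeded" or "Failed")
do {
    Start-Sleep -Seconds 30
    $status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    Write-Output "Export status: $($status.Status)"
} while ($status.Status -eq "InProgress")
```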

 

 

After the backup finishes, it will appear in the storage account with the datetime suffix provided by the script:

 

[Screenshot: exported .bacpac files in the storage container with the datetime suffix (hqaffesha_3-1722506735343.png)]

 

 

After verifying that the script operates correctly, you can publish the runbook and link it to a schedule that matches your requirements, following this documentation:

Manage schedules in Azure Automation | Microsoft Learn

 

You can initiate the automation by pressing the start button, scheduling it to run automatically, or setting up a webhook to trigger the process. Keep in mind that only one export job can run at a time; if you attempt to run two jobs simultaneously, one will fail with an error stating that another export job is already in progress.

 

Recommendations

Establishing a routine backup schedule for a large database over an undefined timeframe can lead to substantial storage consumption and potentially significant costs. It's important to regularly monitor your storage account and remove unnecessary backups, or consider relocating them to more cost-effective storage tiers, such as cold or archive.

 

Consider implementing a storage lifecycle management policy to manage data within the storage account and reduce the cost of storing database backups. Lifecycle management lets you automatically delete blobs, or transition them to a less expensive tier such as cold or archive, based on their creation date. For additional details on storage lifecycle management and instructions for configuration, please consult the following documentation:

Configure a lifecycle management policy - Azure Blob Storage | Microsoft Learn
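For illustration, a lifecycle policy along those lines might look like the following JSON (the rule name and container prefix are hypothetical): tier backups to cool after 30 days, archive after 90, and delete after a year:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "age-out-bacpacs",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "sqlbackups/db-" ]
        }
      }
    }
  ]
}
```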

 

If you have soft delete enabled on the storage account, make sure that the retention period is set appropriately to avoid incurring additional charges for retaining soft-deleted data over an extended period.

 
