
Utilizing Azure Storage and Runbooks for scheduled automated backups of Azure SQL Databases

In this article, we provide detailed steps for creating a scheduled backup of an Azure SQL Database to a storage account using automation. This is a useful technique for maintaining regular backups of your database and storing them in a secure, accessible location. You will get an actual backup of the Azure SQL Database stored in a storage account in .bacpac format, which you can restore or migrate as needed. The automation process involves creating an Automation account that triggers a PowerShell script through a runbook to run the backup command and save the output to a blob container.

 

Prerequisites

  1. Azure Storage account: The storage account hosts the database backups. Set up a container within this account to store the backups.

 

  2. Azure SQL Database and Server: This is the database you will back up, along with its hosting server. Ensure that you grant Azure services permission to access the server where this database is hosted by selecting "Allow Azure services and resources to access this server" within the networking section, as illustrated in the following documentation: Network Access Controls - Azure SQL Database & Azure Synapse Analytics | Microsoft Learn

 

  3. Azure Automation account and PowerShell Workflow runbook: These components are used to set up automatic backups and their scheduling. Make sure to use PowerShell version 7.2 or above when creating the runbook. To learn more about setting up a PowerShell Workflow runbook in Azure Automation, please check the following documentation: Tutorial - Create a PowerShell Workflow runbook in Azure Automation | Microsoft Learn. A short PowerShell sketch for provisioning the first two prerequisites follows this list.
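If you prefer to provision these prerequisites from PowerShell rather than the portal, here is a minimal sketch. The resource names are placeholders, and it assumes the Az.Sql and Az.Storage modules are installed and you are already signed in:

# Allow Azure services and resources to access the SQL server
New-AzSqlServerFirewallRule -ResourceGroupName "*****" -ServerName "*****" -AllowAllAzureIPs

# Create the container that will hold the backups ("backups" is a placeholder name)
$ctx = New-AzStorageContext -StorageAccountName "*****" -StorageAccountKey "*****"
New-AzStorageContainer -Name "backups" -Context $ctx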

Setup

Once all the necessary prerequisites are in place, you should navigate to the runbook and click on "Edit." Then choose the "Edit in portal" option as illustrated below:

 

[Screenshot: the runbook's "Edit" button with the "Edit in portal" option]

 

The runbook editor will launch, displaying the runbook you created earlier. Enter the following code into the editor pane:

# Connect to Azure with the system-assigned managed identity
Connect-AzAccount -Identity

# Set and store the subscription context
$AzureContext = Set-AzContext -SubscriptionId "*****"

# Resource group name
$resourceGroup = "*****"

# Storage account name that will hold the backups
$storageAccountName = "*****"

# Access key for the storage account
$storageKey = "*****"

# Container name that will hold the backups
$containerName = "*****"

# Storage blob URI for the backup file, with a datetime suffix
$storageUri = "https://$storageAccountName.blob.core.windows.net/$containerName/db-$(Get-Date -UFormat "%Y-%m-%d_%H-%M-%S").bacpac"

# Storage access key type
$storageKeyType = "StorageAccessKey"

# SQL server name
$server_name = "*****"

# Database name to be exported
$SQL_db = "*****"

# SQL Auth username
$SQL_username = "*****"

# SQL Auth password
$SQL_secure_secret = ConvertTo-SecureString -String "*****" -AsPlainText -Force

# Run the export job with the required parameters
New-AzSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $server_name -DatabaseName $SQL_db -StorageKeyType $storageKeyType -StorageKey $storageKey -StorageUri $storageUri -AdministratorLogin $SQL_username -AdministratorLoginPassword $SQL_secure_secret

Ensure you replace every ***** with the appropriate value for your environment, then press the "Save" button located at the top left corner. The parameters cover the subscription and resource group, the SQL database and server, and the storage account; each one is explained by a comment in the code snippet above.

 

If you have additional security requirements, you can store the secrets used in the script in an Azure Key Vault and retrieve them as needed within the PowerShell script. This approach ensures that sensitive information is securely managed and reduces the risk of exposure.
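As a minimal sketch of that approach, assuming the automation account's managed identity has been granted permission to read secrets from a Key Vault (the vault and secret names below are placeholders):

# Retrieve the SQL password from Key Vault instead of hardcoding it in the runbook
$kvSecret = Get-AzKeyVaultSecret -VaultName "*****" -Name "sql-admin-password"
$SQL_secure_secret = $kvSecret.SecretValue  # already a SecureString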

 

Select the test button on the toolbar to open the test pane, where you can test the script prior to scheduling it. Hit the start button and monitor the run; if it succeeds, you should observe an outcome like this:

 

[Screenshot: a successful test run in the test pane]

 

 

 

You can monitor the progress of backups and access the export history through the SQL server in the "Import/Export history" section, as illustrated below:

 

[Screenshot: the "Import/Export history" section of the SQL server]
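You can also monitor the export from the runbook itself rather than the portal. A minimal sketch follows; it assumes you captured the return value of New-AzSqlDatabaseExport (for example, $exportRequest = New-AzSqlDatabaseExport ...), whose OperationStatusLink can then be polled:

# Poll the export operation until it leaves the "InProgress" state
$status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
while ($status.Status -eq "InProgress") {
    Start-Sleep -Seconds 30
    $status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
}
Write-Output "Export finished with status: $($status.Status)"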

 

 

After the backup finishes, it will appear in the storage account with the datetime suffix provided by the script:

 

[Screenshot: the exported .bacpac file in the storage container, named with the datetime suffix]

 

 

After verifying that the script operates correctly, you can publish the runbook and link it to a schedule that fits your requirements, as described in the following documentation:

Manage schedules in Azure Automation | Microsoft Learn
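If you would rather create and link the schedule from PowerShell than through the portal, here is a hedged sketch; the account, runbook, and schedule names are placeholders, and it assumes the Az.Automation module:

# Create a daily schedule starting one hour from now
$schedule = New-AzAutomationSchedule -ResourceGroupName "*****" -AutomationAccountName "*****" -Name "DailySqlExport" -StartTime (Get-Date).AddHours(1) -DayInterval 1 -TimeZone "UTC"

# Link the published runbook to the schedule
Register-AzAutomationScheduledRunbook -ResourceGroupName "*****" -AutomationAccountName "*****" -RunbookName "*****" -ScheduleName "DailySqlExport"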

 

You can initiate the automation by pressing the start button, schedule it to run automatically, or configure a webhook to trigger the process. Keep in mind that only one export job can run at a time; if you attempt to run two jobs simultaneously, one will fail with an error message stating that another export job is already in progress.

 

Recommendations

Establishing a routine backup schedule for a large database over an undefined timeframe can lead to substantial storage consumption and potentially significant costs. It's important to regularly monitor your storage account and remove unnecessary backups, or consider relocating them to more cost-effective storage tiers, such as cold or archive.

 

It may be worthwhile to implement a storage lifecycle management policy to manage data within the storage account and reduce the cost of storing database backups. Lifecycle management lets you create automated rules that delete blobs or transition them to a less expensive tier, such as cold or archive, based on their creation date. For additional details on storage lifecycle management and instructions for configuring it, please consult the following documentation:

Configure a lifecycle management policy - Azure Blob Storage | Microsoft Learn
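As an illustration, the following sketch builds such a policy in PowerShell; the rule name, container prefix, and day thresholds are placeholder assumptions, and it requires the Az.Storage module:

# Tier backups to cool after 30 days, archive after 90, and delete after 365
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction TierToCool -DaysAfterModificationGreaterThan 30
$action = Add-AzStorageAccountManagementPolicyAction -InputObject $action -BaseBlobAction TierToArchive -DaysAfterModificationGreaterThan 90
$action = Add-AzStorageAccountManagementPolicyAction -InputObject $action -BaseBlobAction Delete -DaysAfterModificationGreaterThan 365

# Apply the rule only to the backup blobs created by the runbook
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch "backups/db-" -BlobType blockBlob
$rule = New-AzStorageAccountManagementPolicyRule -Name "tier-and-expire-bacpacs" -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName "*****" -StorageAccountName "*****" -Rule $rule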

 

If you have soft delete enabled on the storage account, make sure that the retention period is set appropriately to avoid incurring additional charges for retaining soft-deleted data over an extended period.
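The retention period can also be adjusted from PowerShell. A minimal sketch, assuming the Az.Storage module and a placeholder seven-day window:

# Cap blob soft-delete retention at 7 days to limit charges for deleted backups
Enable-AzStorageBlobDeleteRetentionPolicy -ResourceGroupName "*****" -StorageAccountName "*****" -RetentionDays 7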

 
