In this article, we are going to provide detailed steps to create a scheduled Azure SQL Database backup to storage account using automation. This is a useful technique for maintaining regular backups of your database and storing them in a secure and accessible location. You will get an actual backup of Azure SQL Database stored in a storage account in .bacpac format, which you can restore or migrate as needed. The automation process involves creating an automation account that triggers a PowerShell script through a runbook to run the backup command and save the output to a blob container.
Prerequisites
- Azure Storage account: The storage account is needed to host the database backups. You are required to set up a container within this account to store the backups.
- Azure SQL Database and Server: This is the database you will back up, along with its hosting server. Ensure that you grant Azure services permission to access the server hosting this database by selecting "Allow Azure services and resources to access this server" in the networking section, as illustrated in the following documentation: Network Access Controls - Azure SQL Database & Azure Synapse Analytics | Microsoft Learn. Both this firewall setting and the backup container can also be scripted, as shown in the sketch after this list.
- Azure Automation account and PowerShell Workflow runbook: These components are used to set up automatic backups and their scheduling. Make sure to use PowerShell version 7.2 or above when creating the runbook. To learn more about setting up a PowerShell Workflow runbook in Azure Automation, please check the following documentation: Tutorial - Create a PowerShell Workflow runbook in Azure Automation | Microsoft Learn
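If you prefer to script these prerequisites, both the firewall setting and the backup container can be created with the Az PowerShell module. The following is a minimal sketch, assuming you are signed in to the right subscription and replace the ***** placeholders with your own resource names:
Code:
# Allow Azure services and resources to access the logical SQL server
New-AzSqlServerFirewallRule -ResourceGroupName "*****" -ServerName "*****" -AllowAllAzureIPs
# Create a container in the storage account to hold the .bacpac backups
$storageContext = (Get-AzStorageAccount -ResourceGroupName "*****" -Name "*****").Context
New-AzStorageContainer -Name "*****" -Context $storageContext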
Setup
Once all the necessary prerequisites are in place, you should navigate to the runbook and click on "Edit." Then choose the "Edit in portal" option as illustrated below:
The editor interface for the runbook will launch, displaying the runbook you created earlier. Enter the following code into the editor area:
Code:
# Connect to Azure with system-assigned managed identity
Connect-AzAccount -Identity
# set and store context
$AzureContext = Set-AzContext -SubscriptionId "*****"
# Resource group name
$resourceGroup = "*****"
# Storage account name that will have the backups
$storageAccountName = "*****"
# Access key for the storage account that will hold the backups
$storageKey = "*****"
# Container name that will have the backups
$containerName = "*****"
# Storage blob URI including a datetime suffix for a unique backup name
$storageUri = "https://*****.blob.core.windows.net/*****/db-$(Get-Date -UFormat "%Y-%m-%d_%H-%M-%S").bacpac"
# Storage access key type
$storageKeyType = "StorageAccessKey"
# SQL server name
$server_name = "*****"
# Database name to be exported
$SQL_db = "*****"
# SQL auth username
$SQL_username = "*****"
# SQL auth password
$SQL_secure_secret = ConvertTo-SecureString -String "*****" -AsPlainText -Force
# Run the Export job with the required parameters
New-AzSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $server_name -DatabaseName $SQL_db -StorageKeyType $storageKeyType -StorageKey $storageKey -StorageUri $storageUri -AdministratorLogin $SQL_username -AdministratorLoginPassword $SQL_secure_secret
Ensure you replace all the ***** placeholders with values appropriate to your environment, then press the "Save" button located at the top left corner. The parameters cover the subscription and resource group, as well as the SQL database, server, and storage account details; each parameter is explained by the comments in the code snippet above.
If you have additional security requirements, you can store the secrets used in the script in an Azure Key Vault and retrieve them as needed within the PowerShell script. This approach ensures that sensitive information is securely managed and reduces the risk of exposure.
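For example, assuming the automation account's managed identity has been granted permission to read secrets from a vault (the vault and secret names below are placeholders), the SQL password could be pulled at runtime instead of being hard-coded:
Code:
# Retrieve the SQL password from Key Vault; .SecretValue is a SecureString
$secret = Get-AzKeyVaultSecret -VaultName "*****" -Name "*****"
$SQL_secure_secret = $secret.SecretValue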
Select the test button on the toolbar to open the test pane, which lets you test the script before scheduling it. Hit the start button and monitor the run; if it succeeds, you should observe an outcome like this:
You can monitor the progress of backups and access the export history through the SQL server in the "Import/Export history" section, as illustrated below:
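You can also check the status programmatically. New-AzSqlDatabaseExport returns a request object whose OperationStatusLink can be fed to Get-AzSqlDatabaseImportExportStatus; a minimal sketch of that pattern:
Code:
# Capture the export request object instead of discarding it
$exportRequest = New-AzSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $server_name -DatabaseName $SQL_db -StorageKeyType $storageKeyType -StorageKey $storageKey -StorageUri $storageUri -AdministratorLogin $SQL_username -AdministratorLoginPassword $SQL_secure_secret
# Query the current state of the export operation
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink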
After the backup finishes, it will appear in the storage account with the datetime suffix provided by the script:
After verifying the script operates correctly, you may go ahead with publishing the runbook and link it to a schedule based on your requirements according to the following documentation:
Manage schedules in Azure Automation | Microsoft Learn
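If you prefer scripting this step as well, the same can be done with the Az.Automation cmdlets. A sketch, assuming a daily schedule and placeholder resource names:
Code:
# Publish the runbook so the schedule runs the saved version
Publish-AzAutomationRunbook -ResourceGroupName "*****" -AutomationAccountName "*****" -Name "*****"
# Create a schedule that fires once a day, starting tomorrow
New-AzAutomationSchedule -ResourceGroupName "*****" -AutomationAccountName "*****" -Name "DailySqlExport" -StartTime (Get-Date).AddDays(1) -DayInterval 1
# Link the runbook to the schedule
Register-AzAutomationScheduledRunbook -ResourceGroupName "*****" -AutomationAccountName "*****" -RunbookName "*****" -ScheduleName "DailySqlExport"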
You can initiate the automation by pressing the start button, schedule it to run automatically, or set up a webhook to trigger the process. Remember that you can only run one export job at a time: if you attempt to run two jobs simultaneously, one will fail with an error message stating that another export job is already in progress.
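One defensive option, if runs could ever overlap, is to make the runbook wait for the export to finish before exiting, so a schedule never starts a second job while the first is still in progress. A sketch, reusing the $exportRequest object shown earlier:
Code:
# Poll every 30 seconds until the export leaves the "InProgress" state
do {
    Start-Sleep -Seconds 30
    $exportStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    Write-Output "Export status: $($exportStatus.Status)"
} while ($exportStatus.Status -eq "InProgress")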
Recommendations
Establishing a routine backup schedule for a large database over an undefined timeframe can lead to substantial storage consumption and potentially significant costs. It's important to regularly monitor your storage account and remove unnecessary backups, or consider relocating them to more cost-effective storage tiers, such as cold or archive.
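As an illustration, a small cleanup script along these lines could move backups older than 30 days to the archive tier (the container name, account details, and threshold are placeholders; test carefully before running against real data):
Code:
# Build a storage context from the account name and key
$ctx = New-AzStorageContext -StorageAccountName "*****" -StorageAccountKey "*****"
# Archive .bacpac blobs that were last modified more than 30 days ago
Get-AzStorageBlob -Container "*****" -Context $ctx |
    Where-Object { $_.Name -like "*.bacpac" -and $_.LastModified -lt (Get-Date).AddDays(-30) } |
    ForEach-Object { $_.ICloudBlob.SetStandardBlobTier("Archive") }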
It might be beneficial to implement a storage lifecycle management policy to manage data within the storage account and decrease the cost of storing database backups. Lifecycle management can automatically delete blobs, or transition them to a less expensive tier such as cold or archive, based on their age. For additional details on storage lifecycle management and instructions for configuration, please consult the following documentation:
Configure a lifecycle management policy - Azure Blob Storage | Microsoft Learn
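Such a policy can be configured in the portal or scripted. For instance, a rule along the following lines (the prefix, rule name, and day thresholds are illustrative) would archive backups 30 days after they are written and delete them after 180, using the blobs' last-modified time as the age reference:
Code:
# Define the actions: archive after 30 days, delete after 180 days
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction TierToArchive -DaysAfterModificationGreaterThan 30
$action = Add-AzStorageAccountManagementPolicyAction -InputObject $action -BaseBlobAction Delete -DaysAfterModificationGreaterThan 180
# Scope the rule to the backup container
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch "*****/"
# Create the rule and apply the policy to the storage account
$rule = New-AzStorageAccountManagementPolicyRule -Name "backup-retention" -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName "*****" -StorageAccountName "*****" -Rule $rule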
If you have soft delete enabled on the storage account, make sure that the retention period is set appropriately to avoid incurring additional charges for retaining soft-deleted data over an extended period.
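The retention period can be adjusted with the Az.Storage cmdlets; for example (placeholder names, with 7 days chosen arbitrarily):
Code:
# Set blob soft delete retention to 7 days on the storage account
Enable-AzStorageBlobDeleteRetentionPolicy -ResourceGroupName "*****" -StorageAccountName "*****" -RetentionDays 7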