Cross Subscription Database Restore for SQL Managed Instance Database with TDE enabled using ADF

  • Thread starter: MUA
Our customers require daily refreshes of their production database to the non-production environment. The database, approximately 600GB in size, has Transparent Data Encryption (TDE) enabled in production. Disabling TDE before performing a copy-only backup is not an option, as it would take hours to disable and re-enable. To meet customer needs, we use a customer-managed key stored in Key Vault. Azure Data Factory is then utilized to schedule and execute the end-to-end database restore process.



Permissions required

  • To perform backup and restore operations, the SQL Managed Instance managed identity needs the Contributor and Storage Blob Data Contributor roles on the blob storage account.
  • To transfer backup files between the two storage accounts, the ADF managed identity needs the Storage Blob Data Contributor role on the blob storage account.
  • To carry out backup and restore operations, the ADF managed identity needs 'sysadmin' permissions on the SQL Managed Instance.
  • To enable Transparent Data Encryption (TDE) with a customer-managed key, ensure you have Contributor access to the resource group (RG) where the SQL Managed Instance resides.
  • Establish an Azure Key Vault access policy for the SQL Managed Instance's managed identity and for the user configuring TDE, granting them full key permissions.
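As a rough sketch, the storage-side role assignments above can be granted with the Azure CLI. All scopes and object IDs below are placeholders, not values from this article; run after `az login`:

```shell
# Hedged sketch: grant the roles listed above via the Azure CLI.
# Every ID below is a placeholder -- substitute your own values.
STORAGE_SCOPE="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<mystorageaccountname>"
SQLMI_IDENTITY="<object-id-of-the-sql-mi-managed-identity>"
ADF_IDENTITY="<object-id-of-the-adf-managed-identity>"

# SQL MI identity: Contributor + Storage Blob Data Contributor on the storage account
az role assignment create --assignee "$SQLMI_IDENTITY" --role "Contributor" --scope "$STORAGE_SCOPE"
az role assignment create --assignee "$SQLMI_IDENTITY" --role "Storage Blob Data Contributor" --scope "$STORAGE_SCOPE"

# ADF identity: Storage Blob Data Contributor (to copy backup files between accounts)
az role assignment create --assignee "$ADF_IDENTITY" --role "Storage Blob Data Contributor" --scope "$STORAGE_SCOPE"
```

Repeat the assignments against the scope of each storage account involved (source and target).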

Step 1

Create a TDE key in the non-production Azure Key Vault dev-kv-001 within the same subscription as the non-production SQL Managed Instance.

Name the key, select RSA with a 2048-bit key size, and leave the activation and expiration dates unset for this demonstration. Ensure the key is enabled, do not set a rotation policy, and click Create.
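For repeatable setups, the same key can be created with the Azure CLI. The key name below is a hypothetical example, not one used in this article:

```shell
# Hedged sketch: create the RSA 2048-bit TDE key in the non-prod vault.
# 'tde-byok-key' is a made-up name; pick your own. Run after 'az login'.
VAULT_NAME="dev-kv-001"
KEY_NAME="tde-byok-key"

# Enabled by default; no activation/expiry dates and no rotation policy are set.
az keyvault key create --vault-name "$VAULT_NAME" --name "$KEY_NAME" --kty RSA --size 2048
```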








Step 2

Establish an Azure Key Vault access policy for the SQL Managed Instance's managed identity and for the user configuring Transparent Data Encryption (TDE), granting them full key permissions.
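The same access policy can be set from the CLI. The object IDs are placeholders; the permission lists below are an assumption about what "full key permissions" should cover, with `get`/`wrapKey`/`unwrapKey` being the minimum the managed instance needs:

```shell
# Hedged sketch: grant Key Vault key permissions via access policies.
VAULT_NAME="dev-kv-001"
SQLMI_IDENTITY="<object-id-of-the-sql-mi-managed-identity>"
ADMIN_USER="<object-id-of-the-user-configuring-tde>"

# SQL MI identity: the minimum it needs to use the key for TDE
az keyvault set-policy --name "$VAULT_NAME" --object-id "$SQLMI_IDENTITY" \
  --key-permissions get list wrapKey unwrapKey

# Configuring user: broader rights, including backup/restore for the later steps
az keyvault set-policy --name "$VAULT_NAME" --object-id "$ADMIN_USER" \
  --key-permissions get list create import backup restore wrapKey unwrapKey
```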






Step 3

Back up the TDE key we just created in the non-prod Key Vault dev-kv-001.




Step 4

Create a new Key Vault, dev-kv-002, and restore the key into this newly created vault. Ensure the name matches the backed-up key's name and that the status is set to Enabled.
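Steps 3 and 4 can be sketched with the CLI as a backup/restore round trip. The key name and backup file name are hypothetical; the restore preserves the original key name and enabled status automatically:

```shell
# Hedged sketch: back up the key from dev-kv-001, create dev-kv-002,
# and restore the key into it. Names below are placeholders.
KEY_NAME="tde-byok-key"                 # must match the key created in Step 1
BACKUP_FILE="tde-byok-key.keybackup"    # arbitrary local file name

az keyvault key backup --vault-name "dev-kv-001" --name "$KEY_NAME" --file "$BACKUP_FILE"
az keyvault create --name "dev-kv-002" --resource-group "<rg>" --location "<region>"
az keyvault key restore --vault-name "dev-kv-002" --file "$BACKUP_FILE"
```

Key Vault only allows a key backup to be restored within the same Azure geography, which is why the vault is created in the same region before being moved.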



Step 5

Move the new dev-kv-002 Azure Key Vault from the development (non-prod) subscription to the production subscription. This process may take a few minutes as it validates the ability to move the Key vault.
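The move can also be done from the CLI with a cross-subscription resource move. The subscription ID and resource group below are placeholders:

```shell
# Hedged sketch: move dev-kv-002 into the production subscription.
PROD_SUB="<prod-subscription-id>"
PROD_RG="<prod-resource-group>"

# Look up the vault's resource ID, then move it across subscriptions.
KV_ID=$(az keyvault show --name "dev-kv-002" --query id -o tsv)
az resource move --ids "$KV_ID" \
  --destination-subscription-id "$PROD_SUB" \
  --destination-group "$PROD_RG"
```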






Step 6

Having successfully moved our Key Vault dev-kv-002 to the production subscription, we now back up the key again (following Step 3) for restoration into the actual production Key Vault, prod-kv-001.




Step 7

We are now prepared to link the keys we created with their respective Azure SQL Managed Instances in both the development and production environments. The objective is to maintain the same key in the backup, enabling us to refresh our production environment into development seamlessly.



We will perform these actions in both our production and development SQL Managed Instances. In the Azure portal, open each SQL Managed Instance blade and select Transparent Data Encryption under the Security section.



To enable a successful production refresh of our development environment, we need to switch from Service-managed keys to Customer-managed keys.
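The same switch can be scripted per environment with the CLI. The instance name, resource group, and key identifier URI below are placeholders; run once against each managed instance with its matching key:

```shell
# Hedged sketch: point a SQL Managed Instance at its customer-managed TDE key.
MI_NAME="<sql-managed-instance-name>"
MI_RG="<resource-group>"
KEY_ID="https://dev-kv-001.vault.azure.net/keys/tde-byok-key/<key-version>"

# Switch the instance from service-managed to customer-managed (AKV) key
az sql mi tde-key set --server-key-type AzureKeyVault --kid "$KEY_ID" \
  --managed-instance "$MI_NAME" --resource-group "$MI_RG"
```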






Step 8

Create a server-level credential. A credential is a record that contains the authentication information required to connect to a resource outside SQL Server; here it lets the managed instance authenticate to blob storage with its managed identity.



Code:
USE master;
GO
CREATE CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
    WITH IDENTITY = 'Managed Identity';
GO





Step 9

  • Create ADF linked services for both SQL Managed Instances and both storage accounts.
  • Create ADF datasets using those SQL Managed Instance and storage account linked services.

Step 10

If you're using a private endpoint, make sure to set up an ADF integration runtime and a managed private endpoint; follow Create Azure Data Factory Managed Private Links.



Step 11

Create an ADF pipeline to take a database backup from the source.

  • Split the backup into multiple files for a faster backup.
  • Use the script below to take a copy-only database backup.
  • Use a Script activity to execute the backup script via the source SQL MI linked service.



Code:
BACKUP DATABASE [@{pipeline().parameters.source_database_name}]
  TO URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_01.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_02.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_03.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_04.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_05.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_06.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_07.bak',
  URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_08.bak'
WITH COPY_ONLY, MAXTRANSFERSIZE = 4194304, COMPRESSION, STATS = 10








Allow a minute for the backup to transfer to blob storage, adjusting the duration to meet your specific needs.




Step 12

Create an ADF pipeline to copy the database backup files from the source storage account to the target storage account.

  • Use a Copy activity to copy the backup files from the source storage account to the target storage account.
  • Allow a minute for the backup files to transfer to blob storage, adjusting the duration to meet your specific needs.




Step 13

Create an ADF pipeline to restore the database backup from the designated storage account to the target SQL Managed Instance.

  • Use the script below to restore the database from the designated storage account.
  • Use a Script activity to execute the restore script via the target SQL MI linked service.



Code:
USE [master]

RESTORE DATABASE [@{pipeline().parameters.target_database_name}] FROM
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_01.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_02.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_03.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_04.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_05.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_06.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_07.bak',
URL = N'https://<storageaccountname>.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_08.bak'








Step 14

Set up an additional pipeline to remove orphaned database users, grant user access, or carry out any other post-restore tasks, using the appropriate activity.
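One common post-restore fix is remapping an orphaned database user to the instance-level login of the same name. The instance address, database, and user name below are hypothetical, and the same T-SQL could instead run from an ADF Script activity:

```shell
# Hedged sketch: remap a hypothetical orphaned user after the restore.
TARGET_MI="<target-mi-name>.<dns-zone>.database.windows.net"
TARGET_DB="<target_database_name>"

# -G uses Microsoft Entra authentication; [app_user] is a placeholder name.
sqlcmd -S "$TARGET_MI" -d "$TARGET_DB" -G -Q \
  "ALTER USER [app_user] WITH LOGIN = [app_user];"
```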



Step 15

Create an ADF orchestration pipeline to execute the backup, copy, restore, and post-restore pipelines created above in sequence.

  • Set up parameters for both source_database_name and target_database_name so the pipeline can run dynamically across different databases.

