Guest RuiCunha
Posted October 18, 2022

Introduction

When implementing CI/CD processes in the context of Azure Synapse Analytics, you will require different workflows, depending on whether you are automating the integration and delivery of workspace artifacts (pipelines, notebooks, etc.) or SQL pool objects (tables, stored procedures, etc.). When you want to propagate all the changes made to your SQL pool objects across multiple environments, you typically build a data-tier application (DAC) package, called a DACPAC, and use the Azure SQL Data Warehouse deployment task in Azure DevOps to deploy those changes.

In an ever-evolving cloud environment, an implementation that was once stable can start to generate unexpected errors that are quite challenging to troubleshoot. The goal of this article is to explain how you can take advantage of diagnostics log files when troubleshooting DACPAC deployments against a Synapse dedicated SQL pool. These logs also represent valuable input that you can share with Microsoft's engineering team when submitting a support request.

A bit of context

When using the Azure SQL Data Warehouse deployment task to deploy (or publish) a DACPAC file, you are interfacing with the Data-Tier Application Framework (DacFx) through a command-line utility called SqlPackage. This utility exposes some of the DacFx APIs (like Extract or Publish), allowing you to specify several actions along with action-specific parameters and properties. To deploy a DACPAC you use the Publish action, which invokes the Publish API.

When publishing a DACPAC, you get some verbose output in the Azure DevOps task logs. In some circumstances, you will find this information insufficient to troubleshoot the error you are facing. You can increase the logging verbosity by leveraging the diagnostic capabilities of the Publish action, adding parameters to the SqlPackage command to generate diagnostics logs.

Keep in mind that the goal of the Publish action is to incrementally update the database schema to match the schema of a source .dacpac file. If the SQL pool does not exist on the target server, the publish operation creates it; otherwise, the existing database is updated. In the context of Azure Synapse Analytics, the database will be automatically created only if you are using a formerly SQL DW pool. If you are using a Synapse workspace SQL pool, you will need to create your target SQL pool from the portal or programmatically before deploying the DACPAC.

Note: at the time of writing, only dedicated SQL pools are supported by SSDT (SqlPackage).

Generating the diagnostics log file

Logs are essential to troubleshooting. To help you troubleshoot any error resulting from DACPAC deployments, you can generate a diagnostics log file, which adds more verbosity to the logging. To achieve that, you need to add /DiagnosticsFile (or /df) as an additional argument for the Publish action in SqlPackage and specify the diagnostics file location (a minimal command-line sketch is shown below).

Getting this diagnostics file can be a bit more challenging if you run your code using a Microsoft-hosted agent. By contrast, when running on your own compute infrastructure (using a self-hosted agent), you just need file access permissions to the location specified in the DiagnosticsFile parameter ($(System.DefaultWorkingDirectory) in the YAML example below).
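As a point of reference, a minimal SqlPackage invocation with the diagnostics argument could look like the sketch below, run from a Windows command prompt on a self-hosted agent. The server, database, and credential values reuse the placeholder names from this article, and the output path is just an illustrative local folder:

REM Publish a DACPAC and write a diagnostics log (all values below are placeholders)
SqlPackage /Action:Publish ^
    /SourceFile:"Synapse_Project.dacpac" ^
    /TargetServerName:"myserver.sql.azuresynapse.net" ^
    /TargetDatabaseName:"dacpactargetpool" ^
    /TargetUser:"dacpacuser" ^
    /TargetPassword:"xxxxxxxxxxxx" ^
    /DiagnosticsFile:"C:\temp\output.log"

On a Microsoft-hosted agent, the same argument is passed through the deployment task's AdditionalArguments input, as in the YAML example that follows.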
Below is an example of using a Microsoft-hosted agent to deploy a DACPAC, where we deploy the DACPAC, generate the diagnostics file, and take advantage of the Publish Pipeline Artifacts task to publish (upload) the diagnostics file as a named artifact for the current pipeline run. Here's the YAML code:

pool:
  vmImage: windows-latest

steps:
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'mysubscription'
    AuthenticationType: 'server'
    ServerName: 'myserver.sql.azuresynapse.net'
    DatabaseName: 'dacpactargetpool'
    SqlUsername: 'sqladmin'
    SqlPassword: 'xxxxxxxxxxxx'
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: '$(System.DefaultWorkingDirectory)/Synapse_Project.dacpac'
    AdditionalArguments: '/DiagnosticsFile:$(System.DefaultWorkingDirectory)/output.log'
    IpDetectionMethod: 'AutoDetect'

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(System.DefaultWorkingDirectory)/output.log'
    artifact: 'Diagnostic File'
    publishLocation: 'pipeline'

After saving your code, you just need to run the pipeline. You can also enable the debug system variable (system.debug) to add more verbosity to your log file. When the pipeline run finishes, you will see one published artifact, and you can then download output.log from the run's published artifacts.

Note: published artifacts can be downloaded in other jobs, stages, or pipelines using the "Download pipeline artifact" task.

Tips when troubleshooting a failed DACPAC deployment

Check if you are running the latest version of SqlPackage

New SqlPackage builds ship improvements and bug fixes, so it's important to run your code using the latest build. You can use the diagnostics file to validate which SqlPackage build is being used to deploy your DACPAC; this information appears in the first few lines of the file.

Note: make sure you are running the latest SqlPackage build by downloading and installing it from here: Download and install SqlPackage - SQL Server | Microsoft Learn

Check the database permissions for the user that is deploying the DACPAC

Make sure that the user deploying the DACPAC was granted the necessary roles both at server level (master) and at database level (the target SQL pool). As the underlying architecture and security model in Azure Synapse Analytics may differ from traditional SQL Server, you may need to adapt the logic of granting permissions to your DACPAC deployment user to conform with the Azure Synapse Analytics security model.

Here are a couple of error messages, resulting from insufficient user permissions, that you can find in the diagnostics file:

Error message:
--*** An error occurred during deployment plan generation. Deployment cannot continue.
--Could not connect to servername.sql.azuresynapse.net mydatabase: The server principal "dacpacuser" is not able to access the database "master" under the current security context.
--Cannot open user default database. Login failed.

Error message:
--*** An error occurred during deployment plan generation. Deployment cannot continue.
--Failed to import target model mydatabase. Detailed message: The reverse engineering operation cannot continue because you do not have View Definition permission on the 'mydatabase' database.
--The reverse engineering operation cannot continue because you do not have View Definition permission on the 'mydatabase' database.
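Before creating or re-granting anything, it can help to confirm which roles the deployment login currently holds in master and in the target SQL pool. The query below is a quick sketch using standard catalog views; it assumes the deployment user is named dacpacuser, as in the errors above:

--List the database roles the deployment user is a member of.
--Run it once in master and once in the target dedicated SQL pool.
SELECT r.name AS role_name,
       m.name AS member_name
FROM sys.database_role_members AS drm
JOIN sys.database_principals AS r ON drm.role_principal_id = r.principal_id
JOIN sys.database_principals AS m ON drm.member_principal_id = m.principal_id
WHERE m.name = 'dacpacuser';

If the expected roles are missing, the grants in the next example address that.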
Here's an example of creating a SQL user and granting the necessary roles (both at master and SQL pool level) to deploy the DACPAC against a Synapse dedicated SQL pool:

--execute in master
CREATE LOGIN dacpacuser WITH PASSWORD = 'Password!123';
CREATE USER dacpacuser FOR LOGIN dacpacuser;
EXEC sp_addrolemember 'dbmanager', 'dacpacuser';

--execute in the SQL pool
CREATE USER dacpacuser FOR LOGIN dacpacuser;
EXEC sp_addrolemember 'db_owner', 'dacpacuser';

Reproduce the issue from the command line

You can also use SqlPackage from the command line to help you troubleshoot the deployment issue. First, make sure that you are running the latest SqlPackage build. From the root folder of your local drive, you can search for the executable file (sqlpackage.exe) and find the right location from which to run the command (at the time of writing, 160 is the latest build). To help you troubleshoot a failed DACPAC deployment from the command line, consider executing the following commands.

Deploying the DACPAC to a new database

SqlPackage /Action:Publish /SourceFile:"mydacpacname.dacpac" /TargetServerName:"mypool.sql.azuresynapse.net" /TargetDatabaseName:"newpool" /TargetUser:"dacpacuser" /TargetPassword:"mypass" /TargetTimeout:120 /d:True /df:"diag_file.txt" /dsp:"deploy_script.txt"

Important note: if newpool does not exist and you are using a Synapse workspace SQL pool, this operation will fail; you need to create "newpool" prior to executing this Publish. This is not necessary when deploying to a formerly SQL DW pool, where the publish operation is able to create the new database.

Once the first publish is complete, repeat the publish against the same database and see if the issue persists. If you are not getting any error when deploying your DACPAC to a new database, this may suggest that the issue lies in your target SQL pool.

Extracting the DACPAC from the target database

To verify whether there is any issue when SqlPackage is building the SQL model from your target database, you can use the Extract action to build a DACPAC from your target database.

Note: make sure your dedicated SQL pool is online, as the Extract will fail if the pool is paused.

SqlPackage /Action:Extract /TargetFile:mytargetdb.dacpac /p:ExtractAllTableData=false /p:VerifyExtraction=true /SourceServerName:"myserver.sql.azuresynapse.net" /SourceDatabaseName:"mytargetpool" /SourceUser:"dacpacuser" /SourcePassword:"mypass" /df:"diag_file.txt"

By using the Extract action, you can quickly identify whether there are any inconsistencies in your target SQL pool that are causing your DACPAC deployment to fail.

Note: it may also be worth applying the same logic to your source database, to rule out inconsistencies on that side, as the source DACPAC may host these inconsistencies as well.

Conclusion

In some scenarios, when a DACPAC deployment against a Synapse dedicated SQL pool fails, the error information available in the standard logs may be insufficient for quick troubleshooting and resolution. In these cases, we can take advantage of diagnostics files to generate more information about the error and make troubleshooting easier. Diagnostics files are also an essential tool for Microsoft engineers investigating the problem and will speed up the investigation when you submit a support request.

Continue reading...