Leveraging DotNet for SQL Builds via YAML


Introduction


Welcome back, and welcome to those who are new to this series of posts on SQL Databases. In the previous post we discussed how to import an existing database into your local environment. In this section we will talk specifically about building our database into a .dacpac file leveraging Azure DevOps Pipelines. This SQL .dacpac will then be used for deployments. Eventually we will get to leveraging YAML Pipeline Templates to achieve this with any database project.



Wait, Why?


As with anything, by starting with the question of why we can better understand the point of this exercise. The goal here is to produce a .dacpac which can be deployed to any appropriate Azure environment. To achieve this, one must write the automated process to take a .sqlproj and build it into a reusable .dacpac.



One important point often gets overlooked here: this .dacpac should be static! That means if we re-run our deployment process it should re-deploy the same .dacpac. This is an important concept to understand when discussing multi-stage deployments and potential rollbacks.
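
To make that concrete, here is a minimal sketch (all names here are placeholders, not taken from the pipeline later in this post) of how a later deployment stage could download an already-built .dacpac artifact instead of rebuilding it, so that a re-run deploys the exact same file:

Code:
- stage: deploy_dev
  dependsOn: build
  jobs:
  - job: deploy_database
    steps:
    # Pull down the .dacpac the build stage published; nothing is rebuilt
    # here, so every re-run of this stage deploys the same file.
    - task: DownloadPipelineArtifact@2
      displayName: 'Download built .dacpac artifact'
      inputs:
        artifact: my_database_Release
        path: $(Pipeline.Workspace)/my_database_Release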



PreReqs


We will carry over the same software requirements from the last post and add a few other details:

Version Control


I want to take a moment and call this out. Yes, it is a prerequisite; however, in my experience, this can be one of the biggest gaps when implementing any type of automated database deployment. Traditionally when we think of version control technologies like Git or SVN we assume they are reserved for application developers. Technologies and roles have evolved, and this is no longer the case.



Data Engineers who are leveraging Databricks, Data Factory, or SQL development should have the expectation that they are using a version control system. By doing so they can quickly collaborate, deploy code at any moment, and keep a history of every change that has been made, including why it was made.



Build Steps


When writing one of these I have found it can be helpful to write out the individual steps required for our build. In our case it will consist of:

  • Publishing a folder of SQL scripts for pipeline use (I used security as a placeholder, but these could be any pre/post scripts)
  • Get the appropriate version of the .NET SDK
  • Run a DotNetCore Build against the .sqlproj
  • Publish the .dacpac file for pipeline use

So that's four tasks! Notice that the first task has no dependency on the other three, which dictates that we should use two jobs to optimize our build process.
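
As a quick structural sketch (the job names match the sections below; the steps themselves are elided here), two jobs with no dependsOn between them are free to run in parallel when agents are available:

Code:
jobs:
- job: Publish_security             # publishes the pre/post deployment scripts
  # steps: ... (shown in the next section)
- job: build_publish_sql_sqlmoveme  # builds the .sqlproj and publishes the .dacpac
  # steps: ... (shown later in this post)
# Neither job declares dependsOn, so Azure DevOps can schedule them concurrently.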



Publish Scripts Folder


This job is really optional; however, in my experience when deploying SQL databases there is often a need to run a script on the server pre- or post-deployment. I am illustrating it here because it is what I would consider a common ask and one that is typically required.





Code:
  jobs:
  - job: Publish_security
    steps:
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact security '
      inputs:
        targetPath: security
        artifact: security
        properties: ''





Pretty basic, right? We effectively need to pass in the name of the source folder and what we'd like the artifact to be called. We will be leveraging the PublishPipelineArtifact@1 task.



.dacpac Build Job


This job is really the bulk of our operations. As such it is a little longer, but have no fear; we will walk through each step.





Code:
  - job: build_publish_sql_sqlmoveme
    steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK v3.1.x
      inputs:
        packageType: 'sdk'
        version: 3.1.x
        includePreviewVersions: true
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration Release /p:NetCoreBuild=true
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_dev_Release '
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/Release
        artifact: sqlmoveme_dev_Release
        properties: ''







Alright, this isn't too bad. These tasks are included in the same job as one cannot run without the other; i.e., we can't publish a file if it hasn't been built. So, let's dive into each of these tasks:



UseDotNet@2

This task is also considered optional; however, it allows us to control which version of the .NET Core SDK we will use to build our project. If none is specified, the build will use the latest version on the build agent. I'd suggest putting this in as it allows finer-grained control.
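
As an example of that finer-grained control, the wildcard 3.1.x could be tightened to one exact SDK version (the version number below is purely illustrative), and, if memory serves, UseDotNet@2 can also resolve the version from a global.json checked into the repository:

Code:
- task: UseDotNet@2
  displayName: Use pinned .NET SDK
  inputs:
    packageType: 'sdk'
    version: '3.1.426'            # pin one exact SDK version instead of 3.1.x
    includePreviewVersions: false

# Alternative: resolve the SDK version from a global.json in the repository
# - task: UseDotNet@2
#   inputs:
#     packageType: 'sdk'
#     useGlobalJson: true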



DotNetCoreCLI@2

This task will allow us to run a dotnet command on our build agent. This is important as it will take our .sqlproj and build it into a deployable .dacpac. As such we need to tell it a couple of things (these map to a plain dotnet build invocation, sketched after this list):

  • command: build (the command we want to run)
  • projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj (location of our .sqlproj to build)
  • arguments: --configuration Release /p:NetCoreBuild=true (any additional arguments required. Here we are specifying the 'Release' project configuration and flagging this as a 'NetCoreBuild'; that flag is now optional since it has become the default, though older documentation may still show it as required. You could also pass --output to control where the built artifacts land; without it, the build goes to the default bin/Release folder, which is what the publish task below relies on)
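
As promised above, here is roughly the command that task runs for us, written as a plain script step (a sketch only: the wildcard has to become a concrete project file name, assumed here to be sqlmoveme.sqlproj):

Code:
# Roughly equivalent to the DotNetCoreCLI@2 task above; note the task also
# resolves the *.sqlproj wildcard for us, which a raw script step does not.
- script: >
    dotnet build $(Build.SourcesDirectory)/src/sqlmoveme/sqlmoveme.sqlproj
    --configuration Release
    /p:NetCoreBuild=true
  displayName: dotnet build (script equivalent)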

If you are wondering what this $(Build.SourcesDirectory) value is, it is a built-in Azure DevOps variable. As defined by the MS documentation, it is "The local path on the agent where your source code files are downloaded." Put another way, the build agent, where your code is running, downloads the repository to a local directory; this variable represents that local directory on the build agent.
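
If you'd like to see exactly what it resolves to on your agent, a throwaway step like this one will print it in the pipeline logs:

Code:
# Prints the local path on the agent where the repository was downloaded
- script: echo $(Build.SourcesDirectory)
  displayName: Show Build.SourcesDirectory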



PublishPipelineArtifact@1
The last step in this job is to take the output from our DotNetCoreCLI@2 task, which generated our .dacpac files, and publish them as a pipeline artifact. We need to provide the location, relative to $(Build.SourcesDirectory), that our build outputted to as the source to publish. Note that I am including the project, environment, and release configuration name in the name of the artifact. This is to help with future scale, as I wouldn't rule out having to run multiple builds for different environments against various configurations.
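
Looking ahead to YAML Pipeline Templates, that naming convention lends itself to parameterization. Here is a rough sketch, using hypothetical parameter names rather than anything from this post's pipeline, of how one publish step could serve multiple projects, environments, and configurations:

Code:
# Hypothetical template parameters; not part of the pipeline shown in this post
parameters:
- name: projectName
  type: string
  default: sqlmoveme
- name: environment
  type: string
  default: dev
- name: buildConfiguration
  type: string
  default: Release

steps:
- task: PublishPipelineArtifact@1
  displayName: Publish Pipeline Artifact ${{ parameters.projectName }}_${{ parameters.environment }}_${{ parameters.buildConfiguration }}
  inputs:
    targetPath: $(Build.SourcesDirectory)/src/${{ parameters.projectName }}/bin/${{ parameters.buildConfiguration }}
    artifact: ${{ parameters.projectName }}_${{ parameters.environment }}_${{ parameters.buildConfiguration }}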



End Result


We have a pipeline that publishes the following artifacts:

[Screenshot: the published pipeline artifacts, security and sqlmoveme_dev_Release]

And here is the complete YAML Pipeline job definition:





Code:
trigger:
  none

pool:
  vmImage: 'windows-latest'
stages:
- stage: bicepaadentra_build
  jobs:
  - job: Publish_security
    steps:
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact security '
      inputs:
        targetPath: security
        artifact: security
        properties: ''
  - job: build_publish_sql_sqlmoveme
    steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK v3.1.x
      inputs:
        packageType: 'sdk'
        version: 3.1.x
        includePreviewVersions: true
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration Release /p:NetCoreBuild=true
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_dev_Release '
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/Release
        artifact: sqlmoveme_dev_Release
        properties: ''





All the source code for this can be found in a public repository.



Next Steps


Now that we have covered how to effectively build a .sqlproj into a .dacpac for deployments, our next step will be to deploy this .dacpac to our SQL Server! Feel free to subscribe to this series on SQL Databases; alternatively, if you like my posts, feel free to follow me.
