December 9th, 2020

Using an Azure Container Instance to convert a BAK to BACPAC for Import into Azure SQL Database

Davide Mauri
Principal Product Manager

Guest Post

This post was originally written by Jes Schultz (@grrl_geek), a Microsoft Software Engineer, who created a very interesting solution to a very common challenge, and we want to share it with you all. Thanks Jes!

What problem are we trying to solve?

Importing an existing SQL Server database into an Azure SQL Database is not a trivial task. You can only import a BACPAC file; you can't attach a database or restore a backup. In some cases, you may not have direct access to the source database to create a BACPAC, but you do have the database .mdf file or a .bak backup file available. In this solution, you will see how to take a .bak file that is in an Azure File Share, attach that File Share to an Azure Container Instance running SQL Server, restore the .bak, create the .bacpac, copy it to Azure Blob Storage, and then import it into an Azure SQL Database.

Architecture choices

This solution is based entirely on Azure PaaS services. The conversion could also be done by creating a SQL Server VM in Azure and running the same processes there, but the start-up time is longer and the maintenance overhead is unwanted. The Azure resources used for this solution are Azure Container Instances, Azure Container Registry, Azure Storage (a File Share and a Blob Container), Azure Key Vault, and Azure SQL Database; the Deploying Azure resources section below walks through creating them.

The code

The full code to build this solution is in the Using an Azure Container Instance to convert BAK to BACPAC for Import into Azure SQL Database repository.

Building the container

You will want to have Docker Desktop installed if you are running Windows or macOS. If you are running Windows, you'll also need the Windows Subsystem for Linux (WSL).
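
If you want to confirm the prerequisites are in place before building, a quick check from a PowerShell prompt is enough. The commands below only print version and status information (wsl --status requires a reasonably recent WSL release):

# Confirm Docker Desktop is installed and the daemon is reachable
docker version

# On Windows, confirm WSL is installed and check the default version
wsl --status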

The Docker image is built locally first. The Dockerfile will perform the following steps:

  • Pull the latest SQL Server image from the Microsoft Container Registry.
  • Install the SQL Server command-line tools, which are needed for sqlcmd.
  • Copy and unzip the sqlpackage tool.
  • Copy the SQL scripts that create the stored procedures used to restore the database.
  • Create a mount point folder that will point to the Azure File Share.
  • Set permissions for the SQL user to run the script.
FROM mcr.microsoft.com/mssql/server

# Switch to root user for access to apt-get install
USER root

# Install curl, gnupg, and unzip
RUN apt-get -y update
RUN apt-get install -y curl gnupg unzip

#Add the mssql-tools repository
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
RUN curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | tee /etc/apt/sources.list.d/msprod.list

#Install mssql-tools
RUN apt-get -y update
RUN apt-get install -y mssql-tools unixodbc-dev

#Copy sqlpackage utility and other required files to root of container
COPY . /
RUN chmod 777 *.sql
RUN chown -R mssql:root *.sql

RUN unzip /sqlpackage-linux-x64-en-US-15.0.4826.1.zip -d /sqlpackage
RUN chmod 777 -R /sqlpackage
RUN rm /sqlpackage-linux-x64-en-US-15.0.4826.1.zip

#Create mount point folders
RUN mkdir /mnt/external
RUN chown -R mssql:root /mnt

#Set permissions on script file
RUN chmod a+x ./create-bacpacs.sh

# Switch back to mssql user and run the entrypoint script
USER mssql
ENTRYPOINT /bin/bash ./entrypoint.sh

The entrypoint.sh script tells the container what to run on startup: it starts the SQL Server service in the background and then runs the create-bacpacs.sh script.

/opt/mssql/bin/sqlservr & ./create-bacpacs.sh

The create-bacpacs.sh script does the work inside the SQL Server instance. It will:

  • Wait 20 seconds to give the SQL Server service time to start.
  • Create the stored procedures.
  • Loop through the file share to find .bak files.
  • Run sqlcmd to restore the .bak and create a new database.
  • Run sqlpackage to export the database as a .bacpac.
#sleep for 20 seconds and let SQL Server start all the way
sleep 20

/opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -i "/create_procedure_restoreheaderonly.sql"
/opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -i "/create_procedure_restoredatabase.sql"

#loop through every .bak file in the mounted file share,
#restore each one as a database, then export it as a .bacpac
for f in /mnt/external/*.bak;
do
    s=${f##*/}                # file name with extension, e.g. MyDb.bak
    name="${s%.*}"            # database name, taken from the file name
    extension="${s#*.}"       # file extension (bak)
    echo "Restoring $f..."
    /opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -q "EXEC dbo.restoredatabase '/mnt/external/$name.$extension', '$name'"
    echo "Creating bacpac..."
    /sqlpackage/sqlpackage -a:"Export" -ssn:"localhost" -su:"sa" -sp:"$SA_PASSWORD" -sdn:"$name" -tf:"/mnt/external/$name.bacpac"
done

The next step is to create the Docker image locally. Make sure you are in the Docker directory to run this script.

# Create Docker image locally from Dockerfile

docker build -t mssql-bak-bacpac .

The last step to test locally is to create a container based on the image. Create a directory, C:/Docker/bak, and put a sample .bak file in it. The SQL Server sa password must be entered here; make sure this password is secured in a secret management store such as Azure Key Vault (a sketch of that follows the docker run command below). You will need to use the same sa password when you create containers based on the image in Azure later.

# An SA_PASSWORD must be entered here
docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=" `
-v C:/Docker/bak:/mnt/external `
--name bak-to-bacpac mssql-bak-bacpac
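
As a hedged illustration of the Key Vault suggestion above, here is one way to store and retrieve the sa password. It assumes the kvsqlcontainers vault created in the next section and uses a made-up secret name of sa-password:

# Store the sa password as a Key Vault secret (secret name is a placeholder)
$SecurePwd = Read-Host -Prompt "Enter the sa password" -AsSecureString
Set-AzKeyVaultSecret -VaultName 'kvsqlcontainers' -Name 'sa-password' -SecretValue $SecurePwd

# Retrieve it later when you need the plain-text value for the container
$SA_PASSWORD = (Get-AzKeyVaultSecret -VaultName 'kvsqlcontainers' -Name 'sa-password').SecretValueText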

Deploying Azure resources

The next step is to set up the Azure infrastructure. You will need the following resources; the GitHub repo contains scripts to create these, and a minimal sketch follows the list.

  • Resource Group
  • Key Vault
  • Storage account
  • Storage File Share and Blob Container
  • Container Registry
  • SQL Server (an Azure SQL Database logical server)
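
The scripts in the repo are the authoritative way to create these resources. The following is only a minimal sketch of the same deployment using Az PowerShell; the resource names match the ones used in the rest of this post, while the location and SKUs are arbitrary assumptions you can change:

# Minimal sketch, assuming the resource names used later in this post
$RGName = 'sqlcontainers'
$Location = 'eastus'

# Resource group and Key Vault
New-AzResourceGroup -Name $RGName -Location $Location
New-AzKeyVault -Name 'kvsqlcontainers' -ResourceGroupName $RGName -Location $Location

# Storage account, file share for the .bak/.bacpac files, and blob container for import
$Storage = New-AzStorageAccount -ResourceGroupName $RGName -Name 'customersqlbaks' -Location $Location -SkuName Standard_LRS
New-AzStorageShare -Name 'baks' -Context $Storage.Context
New-AzStorageContainer -Name 'baks' -Context $Storage.Context

# Container registry with the admin user enabled so the container group can pull the image
New-AzContainerRegistry -ResourceGroupName $RGName -Name 'acrsqlcontainers' -Location $Location -Sku Basic -EnableAdminUser

# Azure SQL Database logical server
$SqlCred = Get-Credential -Message 'SQL admin login and password'
New-AzSqlServer -ResourceGroupName $RGName -ServerName 'customerdbsfrombak' -Location $Location -SqlAdministratorCredentials $SqlCred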

Creating the container in Azure

With the infrastructure created, the local image can now be pushed to the Container Registry.

$RGName = 'sqlcontainers' 
$ACRName = 'acrsqlcontainers'
$ACRPath = 'sql/bak-bacpac:latest'

# Get the registry object and its admin credentials
$ACRNameObj = Get-AzContainerRegistry -ResourceGroupName $RGName -Name $ACRName
$ACRCred = Get-AzContainerRegistryCredential -Registry $ACRNameObj

# Call docker login, passing in password 
$ACRCred.Password | docker login $ACRNameObj.LoginServer --username $ACRCred.Username --password-stdin

# Tag image 
$ImagePath = $ACRNameObj.LoginServer + '/' + $ACRPath
docker tag mssql-bak-bacpac $ImagePath

# Push image to repository 
docker push $ImagePath

Now the image that is stored in the Container Registry can be used to create a new Container Instance. This cmdlet, when run, will create the container and execute the script inside it.

This is where you will need to enter the sa password you used in your Docker image.

$SA_PASSWORD = Read-Host -Prompt "Please enter the SA password:"
$RGName = 'sqlcontainers'
$KVName = 'kvsqlcontainers'
$ContainerGroupName = 'aci-sql-bak-bacpac'
$ACRName = 'acrsqlcontainers'
$ACRLoginServer = (Get-AzContainerRegistry -ResourceGroupName $RGName -Name $ACRName).LoginServer
$ACRUser = (Get-AzKeyVaultSecret -VaultName $KVName  -Name 'acr-pull-user').SecretValueText
$ACRPass = (Get-AzKeyVaultSecret -VaultName $KVName -Name 'acr-pull-pass').SecretValue
$ACRCred = New-Object System.Management.Automation.PSCredential ($ACRUser, $ACRPass)
$ACRPath = 'sql/bak-bacpac:latest'
$EnvVariables = @{ ACCEPT_EULA="Y"; SA_PASSWORD=$SA_PASSWORD; MSSQL_PID="Enterprise";}
$StorageAcctName = 'customersqlbaks'
$StorageAcctKey = (Get-AzStorageAccountKey -ResourceGroupName $RGName -Name $StorageAcctName)[0].Value | ConvertTo-SecureString -AsPlainText -Force
$StorageAcctCred = New-Object System.Management.Automation.PSCredential($StorageAcctName, $StorageAcctKey)
$StorageAcctFileShareName = 'baks'
$VolumeMountPath  = '/mnt/external'

$CGExists = Get-AzContainerGroup -ResourceGroupName $RGName -Name $ContainerGroupName -ErrorAction SilentlyContinue
if ($CGExists -eq $null)
    {
        New-AzContainerGroup `
            -Name $ContainerGroupName `
            -ResourceGroupName $RGName `
            -Image $ACRLoginServer/$ACRPath  `
            -RegistryServerDomain $ACRLoginServer `
            -RegistryCredential $ACRCred `
            -DnsNameLabel $ContainerGroupName `
            -IpAddressType Public `
            -EnvironmentVariable $EnvVariables `
            -AzureFileVolumeAccountCredential $StorageAcctCred `
            -AzureFileVolumeShareName $StorageAcctFileShareName `
            -AzureFileVolumeMountPath $VolumeMountPath `
            -OsType Linux `
            -Cpu 2 `
            -MemoryInGB 4 

        Write-Host "Container group ($ContainerGroupName) created."
    }
else 
    {
        Write-Host "Container group ($ContainerGroupName) exists."
    }
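
Once the container group is created, the entrypoint script runs automatically, so the restore and export need no further interaction. If you want to watch progress, one option is to pull the container logs; the snippet below assumes the Az.ContainerInstance module's Get-AzContainerInstanceLog cmdlet, whose exact parameters vary by module version:

# Show the container's console output (restore and sqlpackage progress);
# parameter names may differ depending on the Az.ContainerInstance version
Get-AzContainerInstanceLog -ResourceGroupName $RGName -ContainerGroupName $ContainerGroupName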

You can verify that your Azure File Share now contains a .bak and a .bacpac.
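
If you prefer to check from PowerShell rather than the portal, you can list the contents of the file share; this reuses the storage account and share names from the earlier snippets:

# List the contents of the 'baks' file share; a .bacpac should appear next to each .bak once the export finishes
$StorageContext = (Get-AzStorageAccount -ResourceGroupName 'sqlcontainers' -Name 'customersqlbaks').Context
Get-AzStorageFile -ShareName 'baks' -Context $StorageContext | Select-Object Name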

Importing the database

Once the .bacpac is in your file share, all that remains is to copy the file to the blob container, and import the database.

$RGName = 'sqlcontainers'
$KVName = 'kvsqlcontainers'
$StorageAcctName = 'customersqlbaks'
$StorageAcctKey = (Get-AzStorageAccountKey -ResourceGroupName $RGName -Name $StorageAcctName)[0].Value 
$StorageAcctFileShareName = 'baks'
$StorageContext = (Get-AzStorageAccount -ResourceGroupName $RGName -Name $StorageAcctName).Context
$StorageFileShareObj = Get-AzStorageFile -ShareName $StorageAcctFileShareName -Context $StorageContext
$Filtered = $StorageFileShareObj | Where-Object {$_.name -like '*.bacpac'}
$FileName = $Filtered.Name
$SASToken = New-AzStorageAccountSASToken -Service Blob,File,Table,Queue -ResourceType Service,Container,Object -Permission "racwdlup" -Context $StorageContext
$StorageUriFileShareSAS = "https://$StorageAcctName.file.core.windows.net/$StorageAcctFileShareName/$FileName$SASToken"
$StorageUriBlob = "https://$StorageAcctName.blob.core.windows.net/$StorageAcctFileShareName/$FileName"
$StorageUriBlobSAS = "https://$StorageAcctName.blob.core.windows.net/$StorageAcctFileShareName/$FileName$SASToken"
$SqlServerName = 'customerdbsfrombak'
$SqlAdminUser = (Get-AzSqlServer -ResourceGroupName $RGName -ServerName $SqlServerName).SqlAdministratorLogin
$SqlAdminPass =  (Get-AzKeyVaultSecret -VaultName $KVName -Name "$SqlServerName-admin").SecretValue 
$SQLDB = 'importedbak'
$sqlEdition = 'BusinessCritical'
$sqlSLO = 'BC_Gen5_2'

#Move file using azcopy 
azcopy login
azcopy copy $StorageUriFileShareSAS $StorageUriBlobSAS

Write-Output "Importing bacpac..."
$importRequest = New-AzSqlDatabaseImport `
    -DatabaseName $SQLDB `
    -Edition $sqlEdition `
    -ServiceObjectiveName $sqlSLO `
    -DatabaseMaxSizeBytes "$(10 * 1024 * 1024 * 1024)" `
    -ServerName $SqlServerName `
    -StorageKeyType 'StorageAccessKey' `
    -StorageKey $StorageAcctKey `
    -StorageUri $StorageUriBlob `
    -AdministratorLogin $SqlAdminUser `
    -AdministratorLoginPassword $SqlAdminPass `
    -ResourceGroupName $RGName 
do {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    Start-Sleep -s 10
} while ($importStatus.Status -eq "InProgress")

You can now go to your Azure SQL blade in the portal and see your database.
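
You can also confirm the import from PowerShell; reusing the variables from the import script above, Get-AzSqlDatabase shows the new database and its service objective:

# Confirm the imported database now exists on the logical server
Get-AzSqlDatabase -ResourceGroupName $RGName -ServerName $SqlServerName -DatabaseName $SQLDB |
    Select-Object DatabaseName, Status, Edition, CurrentServiceObjectiveName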

This solution helps solve the problem of not being able to restore a .bak file into an Azure SQL Database, but without the overhead of creating and maintaining a virtual machine. The code can be downloaded from Using an Azure Container Instance to convert BAK to BACPAC for Import into Azure SQL Database.


Photo by Enrique Hoyos from Pexels

Author

Davide Mauri
Principal Product Manager

I started as a developer and fell in love with data and databases in all their forms. I still have a passion for development (C# and Python). My focus has been on databases and performance tuning, for both transactional and analytical workloads. For 5 years I helped developers get the best out of SQL Server, then I moved to Business Intelligence and Data Warehousing for 10 years, and then to IoT and Big Data for a while. Now I'm back in the database space, as Product Manager for Azure SQL ...
