Using an Azure Container Example to convert a BAK to BACPAC for Import into Azure SQL Database
Davide
Guest Post
This post was originally written by Jes Schultz (@grrl_geek), Microsoft Software Engineer, who created a very interesting solution to a very common challenge, and we want to share it with you all. Thanks Jes!
What problem are we trying to solve
Importing an existing SQL Server database into an Azure SQL Database is not a trivial task. You can only import a BACPAC file; you can't attach a database or restore a backup. In some cases, you may not have direct access to the database to create a BACPAC, but you have the database .mdf or a .bak backup file available. In this solution, you will see how to take a .bak file that is in an Azure File Share, attach that File Share to an Azure Container Instance running SQL Server, restore the .bak, create the .bacpac, copy it to Azure Blob Storage, and then import it into an Azure SQL Database.
Architecture choices
This solution is based entirely on Azure PaaS services. The conversion of the file could be accomplished by creating a SQL Server VM in Azure and running the processes there, but the start-up time is longer and the maintenance is unwanted. The Azure resources used for this solution are:
- Storage – File Share
- Storage – Blob
- Key Vault
- Container Registry
- Container Instances
- SQL Database
The code
The full code to build this solution is in the Using an Azure Container Instance to convert BAK to BACPAC for Import into Azure SQL Database repository.
Building the container
You will want to have Docker Desktop installed if you are running Windows or Mac. If running Windows, you'll also need Windows Subsystem for Linux (WSL).
The Docker image is built locally first. The Dockerfile will perform the following steps:
- Pull the latest SQL Server image from the Microsoft repo.
- Install the SQL Server tools, which will be needed for sqlcmd.exe.
- Copy and unzip the sqlpackage.exe tool.
- Copy stored procedures to restore the database.
- Create a mount point folder that will point to the Azure File Share.
- Set permissions for the SQL user to run the script.
```dockerfile
FROM mcr.microsoft.com/mssql/server

# Switch to root user for access to apt-get install
USER root

# Install curl
RUN apt-get -y update
RUN apt-get install -y curl gnupg unzip

#Add the mssql-tools repository
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
RUN curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | tee /etc/apt/sources.list.d/msprod.list

#Install mssql-tools
RUN apt-get -y update
RUN apt-get install -y mssql-tools unixodbc-dev

#Copy sqlpackage utility and other required files to root of container
COPY . /
RUN chmod 777 *.sql
RUN chown -R mssql:root *.sql
RUN unzip /sqlpackage-linux-x64-en-US-15.0.4826.1.zip -d /sqlpackage
RUN chmod 777 -R /sqlpackage
RUN rm /sqlpackage-linux-x64-en-US-15.0.4826.1.zip

#Create mount point folders
RUN mkdir /mnt/external
RUN chown -R mssql:root /mnt

#Set permissions on script file
RUN chmod a+x ./create-bacpacs.sh

# Switch back to mssql user and run the entrypoint script
USER mssql
ENTRYPOINT /bin/bash ./entrypoint.sh
```
The entrypoint.sh script will tell the container where to execute code from.
```bash
/opt/mssql/bin/sqlservr & ./create-bacpacs.sh
```
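Note the `&`: sqlservr is started in the background so the same shell can go on to run the conversion script in the foreground. The pattern can be sketched with a stand-in process instead of SQL Server (the `sleep` and the PID handling here are illustrative only; entrypoint.sh itself does not need the cleanup step, since the container simply exits):

```shell
# Start a long-running "server" in the background (stand-in for sqlservr),
# then do foreground work, then clean up the background process.
sleep 5 &                  # hypothetical server process
server_pid=$!              # $! holds the PID of the last background job
echo "server running as PID $server_pid"
echo "doing foreground conversion work"
kill "$server_pid"         # illustrative cleanup; not needed in entrypoint.sh
```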
The create-bacpacs.sh script does the work inside the SQL Server instance. In the script, it will:
- Wait 20 seconds, to give the SQL Server service time to start.
- Create the stored procedures.
- Loop through the file share to find .bak files.
- Run sqlcmd to restore the .bak and create a new database.
- Run sqlpackage to export the database as a .bacpac.
```bash
#sleep for 20 seconds and let SQL Server start all the way
sleep 20

/opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -i "/create_procedure_restoreheaderonly.sql"
/opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -i "/create_procedure_restoredatabase.sql"

#run the setup script to create the DB and the schema in the DB
#do this in a loop because the timing for when the SQL instance is ready is indeterminate
for f in /mnt/external/*.bak; do
    s=${f##*/}
    name="${s%.*}"
    extension="${s#*.}"
    echo "Restoring $f..."
    /opt/mssql-tools/bin/sqlcmd -l 300 -S localhost -U sa -P $SA_PASSWORD -d master -Q "EXEC dbo.restoredatabase '/mnt/external/$name.$extension', '$name'"
    echo "Creating bacpac..."
    /sqlpackage/sqlpackage -a:"Export" -ssn:"localhost" -su:"sa" -sp:"$SA_PASSWORD" -sdn:"$name" -tf:"/mnt/external/$name.bacpac"
done
```

The next step is to create the Docker image locally. Make sure you are in the Docker directory to run this script.
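Before moving on, note how the loop splits each file path using bash parameter expansion. A standalone sketch of that splitting, using a hypothetical file name:

```shell
# Split a path the same way create-bacpacs.sh does.
f="/mnt/external/AdventureWorks.bak"   # hypothetical .bak path
s=${f##*/}            # strip the directory: AdventureWorks.bak
name="${s%.*}"        # strip the extension: AdventureWorks
extension="${s#*.}"   # keep the extension: bak
echo "$s / $name / $extension"
```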
```bash
# Create Docker image locally from Dockerfile
docker build -t mssql-bak-bacpac .
```
The last step to test locally is to create a container based on the image. Create a directory, C:/Docker/bak. Put a sample .bak in the directory. The SQL Server sa password must be entered here. Make sure this password is secured in a secret management store (such as Azure Key Vault). You will need to use this sa password when you create containers based on the image in Azure later.
```powershell
# An SA_PASSWORD must be entered here
docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=" `
    -v C:/Docker/bak:/mnt/external `
    --name bak-to-bacpac mssql-bak-bacpac
```
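The -e flags simply set environment variables inside the container's process, the same way any child process inherits them from its parent. A tiny illustration of that mechanism (the value shown is a placeholder, not a real secret):

```shell
# Environment variables passed to a child process, as docker run -e does.
msg=$(ACCEPT_EULA=Y sh -c 'echo "EULA accepted: $ACCEPT_EULA"')
echo "$msg"   # prints: EULA accepted: Y
```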
Deploying Azure resources
The next step is to set up the Azure infrastructure. You will need the following resources. The GitHub repo contains scripts to create these.
- Resource Group
- Key Vault
- Storage account
- Storage File Share and Blob Container
- Container Registry
- SQL server
Creating the container in Azure
With the infrastructure created, the local image can now be pushed to the Container Registry.
```powershell
$RGName = 'sqlcontainers'
$ACRName = 'acrsqlcontainers'
$ACRPath = 'sql/bak-bacpac:latest'

# Log in to registry
$ACRNameObj = Get-AzContainerRegistry -ResourceGroupName $RGName -Name $ACRName
$ACRCred = Get-AzContainerRegistryCredential -Registry $ACRNameObj

# Call docker login, passing in password
$ACRCred.Password | docker login $ACRNameObj.LoginServer --username $ACRCred.Username --password-stdin

# Tag image
$ImagePath = $ACRNameObj.LoginServer + '/' + $ACRPath
docker tag mssql-bak-bacpac $ImagePath

# Push image to repository
docker push $ImagePath
```
Now the image that is stored in the Container Registry can be used to create a new Container Instance. This cmdlet, when run, will create the container and execute the script within it.
This is where you will need to enter the sa password you used in your Docker image.
```powershell
$SA_PASSWORD = Read-Host -Prompt "Please enter the SA password:"
$RGName = 'sqlcontainers'
$KVName = 'kvsqlcontainers'
$ContainerGroupName = 'aci-sql-bak-bacpac'
$ACRName = 'acrsqlcontainers'
$ACRLoginServer = (Get-AzContainerRegistry -ResourceGroupName $RGName -Name $ACRName).LoginServer
$ACRUser = (Get-AzKeyVaultSecret -VaultName $KVName -Name 'acr-pull-user').SecretValueText
$ACRPass = (Get-AzKeyVaultSecret -VaultName $KVName -Name 'acr-pull-pass').SecretValue
$ACRCred = New-Object System.Management.Automation.PSCredential ($ACRUser, $ACRPass)
$ACRPath = 'sql/bak-bacpac:latest'
$EnvVariables = @{ ACCEPT_EULA="Y"; SA_PASSWORD=$SA_PASSWORD; MSSQL_PID="Enterprise"; }

$StorageAcctName = 'customersqlbaks'
$StorageAcctKey = (Get-AzStorageAccountKey -ResourceGroupName $RGName -Name $StorageAcctName)[0].Value | ConvertTo-SecureString -AsPlainText -Force
$StorageAcctCred = New-Object System.Management.Automation.PSCredential($StorageAcctName, $StorageAcctKey)
$StorageAcctFileShareName = 'baks'
$VolumeMountPath = '/mnt/external'

$CGExists = Get-AzContainerGroup -ResourceGroupName $RGName -Name $ContainerGroupName -ErrorAction SilentlyContinue
if ($CGExists -eq $null) {
    New-AzContainerGroup `
        -Name $ContainerGroupName `
        -ResourceGroupName $RGName `
        -Image $ACRLoginServer/$ACRPath `
        -RegistryServerDomain $ACRLoginServer `
        -RegistryCredential $ACRCred `
        -DnsNameLabel $ContainerGroupName `
        -IpAddressType Public `
        -EnvironmentVariable $EnvVariables `
        -AzureFileVolumeAccountCredential $StorageAcctCred `
        -AzureFileVolumeShareName $StorageAcctFileShareName `
        -AzureFileVolumeMountPath $VolumeMountPath `
        -OsType Linux `
        -Cpu 2 `
        -MemoryInGB 4
    Write-Host "Container group ($ContainerGroupName) created."
} else {
    Write-Host "Container group ($ContainerGroupName) exists."
}
```

You can verify that your Azure File Share now contains a .bak and a .bacpac.
Importing the database
Once the .bacpac is in your file share, all that remains is to copy the file to the blob container, and import the database.
```powershell
$RGName = 'sqlcontainers'
$KVName = 'kvsqlcontainers'
$StorageAcctName = 'customersqlbaks'
$StorageAcctKey = (Get-AzStorageAccountKey -ResourceGroupName $RGName -Name $StorageAcctName)[0].Value
$StorageAcctFileShareName = 'baks'
$StorageContext = (Get-AzStorageAccount -ResourceGroupName $RGName -Name $StorageAcctName).Context
$StorageFileShareObj = Get-AzStorageFile -ShareName $StorageAcctFileShareName -Context $StorageContext
$Filtered = $StorageFileShareObj | Where-Object {$_.Name -like '*.bacpac'}
$FileName = $Filtered.Name
$SASToken = New-AzStorageAccountSASToken -Service Blob,File,Table,Queue -ResourceType Service,Container,Object -Permission "racwdlup" -Context $StorageContext
$StorageUriFileShareSAS = "https://$StorageAcctName.file.core.windows.net/$StorageAcctFileShareName/$FileName$SASToken"
$StorageUriBlob = "https://$StorageAcctName.blob.core.windows.net/$StorageAcctFileShareName/$FileName"
$StorageUriBlobSAS = "https://$StorageAcctName.blob.core.windows.net/$StorageAcctFileShareName/$FileName$SASToken"

$SqlServerName = 'customerdbsfrombak'
$SqlAdminUser = (Get-AzSqlServer -ResourceGroupName $RGName -ServerName $SqlServerName).SqlAdministratorLogin
$SqlAdminPass = (Get-AzKeyVaultSecret -VaultName $KVName -Name "$SqlServerName-admin").SecretValue
$SQLDB = 'importedbak'
$sqlEdition = 'BusinessCritical'
$sqlSLO = 'BC_Gen5_2'

#Move file using azcopy
azcopy login
azcopy copy $StorageUriFileShareSAS $StorageUriBlobSAS

Write-Output "Importing bacpac..."
```
```powershell
$importRequest = New-AzSqlDatabaseImport `
    -DatabaseName $SQLDB `
    -Edition $sqlEdition `
    -ServiceObjectiveName $sqlSLO `
    -DatabaseMaxSizeBytes "$(10 * 1024 * 1024 * 1024)" `
    -ServerName $SqlServerName `
    -StorageKeyType 'StorageAccessKey' `
    -StorageKey $StorageAcctKey `
    -StorageUri $StorageUriBlob `
    -AdministratorLogin $SqlAdminUser `
    -AdministratorLoginPassword $SqlAdminPass `
    -ResourceGroupName $RGName

do {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    Start-Sleep -s 10
} while ($importStatus.Status -eq "InProgress")
```

You can now go to your Azure SQL blade in the portal and see your database.
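The do/while above is a generic polling loop: check the long-running operation, pause, and repeat until it leaves the "InProgress" state. The same shape in shell, with a hypothetical check_status function standing in for Get-AzSqlDatabaseImportExportStatus:

```shell
# Poll a status until it leaves "InProgress" (pause omitted so this runs instantly).
attempts=0
status="InProgress"
check_status() {            # hypothetical stand-in for the Azure status cmdlet
  attempts=$((attempts + 1))
  if [ "$attempts" -ge 3 ]; then
    status="Succeeded"      # pretend the operation finishes on the third check
  fi
}
while [ "$status" = "InProgress" ]; do
  check_status
  # sleep 10                # a real loop would pause between checks
done
echo "final status: $status after $attempts checks"
```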
This solution helps solve the problem of not being able to restore a .bak file into an Azure SQL Database, but without the overhead of creating and maintaining a virtual machine. The code can be downloaded from Using an Azure Container Instance to convert BAK to BACPAC for Import into Azure SQL Database.
Photo by Enrique Hoyos from Pexels
Source: https://devblogs.microsoft.com/azure-sql/using-an-azure-container-instance-to-convert-a-bak-to-bacpac-for-import-into-azure-sql-database/