Azure SQL Database maintenance using Azure Automation

One of the things often overlooked with Azure SQL databases is the fact that they need love in exactly the same way as their on-premises brethren. Sure, Azure will keep an eye out for the most blatant issues, but the old adage “an ounce of prevention is worth a pound of cure” still holds true. Pieter Vanhove did a great blog post a few years back, and since then Larry Silverman has made some updates to the original script, so I decided to revisit the whole thing and put everything in the same place. A big thank you to both Pieter and Larry for the excellent groundwork.

The trouble with Azure SQL Database is the lack of a SQL Agent to schedule the usual scripts – like Ola Hallengren’s excellent Maintenance Solution. Another issue is the lack of cross-database scripting support in Azure, which makes it impossible to execute Ola’s scripts from a central DBA database somewhere.

There is a way to have Ola’s script maintain each and every database on your Azure SQL Database server, but due to the lack of cross-database scripting mentioned earlier, it requires that the procedures IndexOptimize and CommandExecute, as well as the table CommandLog, be set up in every database you deploy to your server. I’d recommend putting them in their own schema so they’re easily distinguished from the rest of the database; in the example below I use the schema “dba”.

Download IndexOptimize.sql, CommandExecute.sql and CommandLog.sql from Ola’s site (https://ola.hallengren.com/downloads.html).

Open them up in a text editor and run a search/replace for all instances of dbo, changing them to dba. This is to make sure the objects end up in the schema where they are supposed to go. Then let’s get cooking:
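If you’d rather script the search/replace than do it by hand, something along these lines should work – a sketch, assuming the three files sit in the current directory and that the scripts use the bracketed [dbo] form:

```powershell
# Sketch: replace all [dbo] schema references with [dba] in Ola's scripts.
# Assumes the three files were downloaded to the current directory.
$files = 'IndexOptimize.sql', 'CommandExecute.sql', 'CommandLog.sql'
foreach ($file in $files) {
    (Get-Content $file -Raw) -replace '\[dbo\]', '[dba]' |
        Set-Content $file
}
```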

1. Create a DBA schema in the Azure SQL Database.

CREATE SCHEMA dba
GO


2. Run the three scripts you downloaded earlier, making sure that the two procedures and the table end up in the DBA schema.

At this stage you can test the scripts by running the following code and taking a look in the dba.CommandLog table:

EXECUTE dba.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = 'INDEX_REORGANIZE',
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@UpdateStatistics = 'ALL',
@LogToTable = 'Y',
@Execute = 'Y'
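If the run succeeded, the log table should contain one row per command executed. A quick way to eyeball the result (column names are from Ola’s standard CommandLog table):

```sql
-- Most recent maintenance commands, newest first
SELECT TOP 10 DatabaseName, ObjectName, CommandType, StartTime, EndTime, ErrorNumber
FROM dba.CommandLog
ORDER BY StartTime DESC;
```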


3. Let’s create an automation account.


Click “Add” to create one.

Then we input a name for the account, a resource group and a location. For performance reasons it’s best to place the automation account in the same region as the database itself.
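The same step can be scripted with Azure PowerShell if you prefer – a sketch, where the resource group and account names are placeholders of my own:

```powershell
# Sketch: create the Automation account in the same region as the database.
# 'rg-dba' and 'aa-dbmaint' are placeholder names.
New-AzAutomationAccount -ResourceGroupName 'rg-dba' `
    -Name 'aa-dbmaint' `
    -Location 'West Europe'
```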

With the account set up, go ahead and click on “credentials” on the left. We need to set up a stored credential to enable the runbook to log into the database server.


Set a name for the credential (this will be referenced later), then input the administrative user for the database server and the password for said user.
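This, too, can be done from Azure PowerShell instead of the portal – a sketch, reusing the placeholder names from above; the credential asset name is what you’ll reference later:

```powershell
# Sketch: create the credential asset from PowerShell.
# 'SqlAdminCredential' and 'sqladmin' are placeholders.
$pass = Read-Host -AsSecureString -Prompt 'SQL admin password'
$cred = New-Object System.Management.Automation.PSCredential('sqladmin', $pass)
New-AzAutomationCredential -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-dbmaint' `
    -Name 'SqlAdminCredential' `
    -Value $cred
```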


Go ahead and click on “overview”, followed by “runbooks”.

On the runbook blade, click “Add a runbook” and fill out the name and the runbook type. The name must match the workflow name in the code we’re going to paste in (AzureMaint in this case) and the type must be PowerShell Workflow.

Paste in the following code, overwriting the text that’s already there:

<#
.SYNOPSIS
Perform index maintenance

.DESCRIPTION
This runbook uses Azure Automation to perform reindexing and statistics maintenance of all databases on a target server.

As prerequisite, please create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server ($SqlServerName).

Make sure that you have installed Ola Hallengren's scripts IndexOptimize.sql, CommandLog.sql and CommandExecute.sql, modified to work on Azure (https://ola.hallengren.com/downloads.html).

.EXAMPLE
SQLServerIndexMaintenance

.NOTES
AUTHOR:
Original author Pieter Vanhove: http://pietervanhove.azurewebsites.net/?p=14137
Heavily modified by Larry Silverman, CTO, TrackAbout
Very slightly modified by Alexander Arvidsson: http://www.arcticdba.se
#>

workflow AzureMaint
{
    param (
        # Fully-qualified name of the Azure DB server
        [parameter(Mandatory=$true)]
        [string] $SqlServerName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        # When using in the Azure Automation UI, please enter the name of the credential asset for the "Credential" parameter
        [parameter(Mandatory=$true)]
        [PSCredential] $Credential
    )

    inlinescript {
        # Set up credentials
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password
        $databases = @()

        # Create connection to Master DB
        Try {
            $MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
            $MasterDatabaseConnection.ConnectionString = "Server = $ServerName; Database = Master; User ID = $UserId; Password = $Password;"
            $MasterDatabaseConnection.Open();

            # Create command to query the name of active databases in $ServerName
            $MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
            $MasterDatabaseCommand.Connection = $MasterDatabaseConnection
            $MasterDatabaseCommand.CommandText =
            "
            select name from sys.databases
            where state_desc='ONLINE'
            and name <> 'master'
            "

            $MasterDbResult = $MasterDatabaseCommand.ExecuteReader()

            while($MasterDbResult.Read()) {
                $databases += @($MasterDbResult[0].ToString())
            }
        }
        # Catch errors connecting to master database.
        Catch {
            Write-Error $_
        }
        Finally {
            if ($null -ne $MasterDatabaseConnection) {
                $MasterDatabaseConnection.Close()
                $MasterDatabaseConnection.Dispose()
            }
        }

        # Create connection for each individual database
        # Iterate through each database under $ServerName
        foreach ($DbName in $databases) {
            Try {
                # Setup connection string for $DbName
                $ChildDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
                $ChildDatabaseConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
                $ChildDatabaseConnection.Open();

                # Create command for a specific database $DBName
                $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
                $DatabaseCommand.Connection = $ChildDatabaseConnection

                Write-Output "Performing index and statistics maintenance on $DbName"

                $DatabaseCommand.CommandText = "
                EXECUTE dba.IndexOptimize
                @Databases = '" + $DbName + "',
                @FragmentationLow = NULL,
                @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                @FragmentationLevel1 = 5,
                @FragmentationLevel2 = 30,
                @UpdateStatistics = 'ALL',
                @OnlyModifiedStatistics = 'Y',
                @LogToTable = 'Y'
                "
                $DatabaseCommand.CommandTimeout = 0
                # Write-Output $DatabaseCommand.CommandText
                $NonQueryResult = $DatabaseCommand.ExecuteNonQuery()
            }

            # Inner catch for individual database failures.
            # We want to keep processing the next database.
            Catch {
                Write-Error $_
            }

            Finally {
                if ($null -ne $ChildDatabaseConnection) {
                    $ChildDatabaseConnection.Close()
                    $ChildDatabaseConnection.Dispose()
                }
            }
        }
    }
}


I’ve made some very, very slight modifications to the script that Larry Silverman in turn modified from Pieter Vanhove’s original. My modifications consist of changing the schema name from dbo to dba and turning on logging.


At this stage it’s a good idea to test the runbook, so click “Test pane” up top. Input the parameters (the fully qualified server name and the name of the credential we set up previously) and make sure everything runs as it should.
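Once the runbook is published, it can also be kicked off from Azure PowerShell – a sketch, reusing the placeholder names from earlier; as with the UI, the Credential parameter takes the name of the credential asset:

```powershell
# Sketch: start the published runbook with its two parameters.
$params = @{
    SqlServerName = 'myserver.database.windows.net'
    Credential    = 'SqlAdminCredential'   # name of the credential asset
}
Start-AzAutomationRunbook -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-dbmaint' `
    -Name 'AzureMaint' `
    -Parameters $params
```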

If everything worked as expected, let’s go ahead and schedule the runbook.

Click schedule.

This part looks a lot like scheduling a job via SQL Server Agent. The only real difference from the agent is the parameters: input the database server and the credential for the server administrator and you’re good to go.
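The schedule can likewise be created and linked to the runbook from Azure PowerShell – a sketch, where the weekly cadence and all names are assumptions of mine:

```powershell
# Sketch: a schedule that fires every Sunday, linked to the runbook.
New-AzAutomationSchedule -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-dbmaint' `
    -Name 'WeeklyMaintenance' `
    -StartTime (Get-Date).AddDays(1) `
    -WeekInterval 1 `
    -DaysOfWeek Sunday

Register-AzAutomationScheduledRunbook -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-dbmaint' `
    -RunbookName 'AzureMaint' `
    -ScheduleName 'WeeklyMaintenance' `
    -Parameters @{
        SqlServerName = 'myserver.database.windows.net'
        Credential    = 'SqlAdminCredential'
    }
```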

A word of caution: as each and every database will hold its own copy of the scripts and table, it’s imperative to keep them at the same version. That is, when updating the scripts in one database, make sure you do it in all the other databases on the same server.
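A quick way to spot version drift is to compare when the maintenance objects were last altered in each database. Run something like this per database (sys.objects is standard; the dba schema matches the setup above):

```sql
-- Run in each database: when were the maintenance objects last changed?
SELECT name, type_desc, modify_date
FROM sys.objects
WHERE schema_id = SCHEMA_ID('dba')
  AND name IN ('IndexOptimize', 'CommandExecute', 'CommandLog');
```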
