Azure database maintenance using Azure Automation

One of the things often overlooked with Azure SQL databases is the fact that they need love in exactly the same way as their on-premises brethren. Sure, Azure will keep an eye out for the most blatant issues, but the old adage “an ounce of prevention is worth a pound of cure” still holds. Pieter Vanhove did a great blog post a few years back, and since then Larry Silverman has made some updates to the original script, so I decided to revisit the whole thing and put everything in the same place. A big thank you to both Pieter and Larry for the excellent groundwork.

The trouble with Azure SQL Database is the lack of a SQL Server Agent to schedule the usual scripts – like Ola Hallengren’s excellent Maintenance Solution. Another issue is the lack of cross-database scripting support in Azure, which rules out executing Ola’s scripts from a central DBA database.

There is a way to have Ola’s scripts maintain each and every database on your Azure SQL Database server, but due to the lack of cross-database scripting mentioned earlier, it requires that the procedures IndexOptimize and CommandExecute, as well as the table CommandLog, be set up in every database you deploy to the server. I’d recommend putting them in a schema of their own so they are easily distinguished from the rest of the database; in the example below I use the schema “dba”.

Download IndexOptimize.sql, CommandExecute.sql and CommandLog.sql from Ola’s site here.

Open them up in a text editor and run a search/replace to change every instance of dbo to dba. This makes sure the objects end up in the schema they are supposed to.
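If you prefer to script that replace, here’s a minimal PowerShell sketch. It assumes the three files have been saved to the current folder under the names used on Ola’s download page; adjust the paths as needed.

# Replace every occurrence of "dbo" with "dba" in the three downloaded scripts.
# Assumes the files sit in the current folder - adjust the paths if yours live elsewhere.
foreach ($file in 'IndexOptimize.sql', 'CommandExecute.sql', 'CommandLog.sql') {
    (Get-Content -Path $file -Raw) -replace '\bdbo\b', 'dba' |
        Set-Content -Path $file -Encoding UTF8
}

With the scripts prepared, let’s get cooking: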

1. Create a DBA schema in the Azure SQL Database.

CREATE SCHEMA dba
GO

 

2. Run the three scripts you downloaded earlier, making sure that the two procedures and the table end up in the DBA schema.

At this stage you can test the scripts by running the following code and taking a look in the dba.CommandLog table:

EXECUTE dba.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = 'INDEX_REORGANIZE',
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@UpdateStatistics = 'ALL',
@LogToTable = 'Y',
@Execute = 'Y'
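Whether you run the test from SSMS or from PowerShell, checking the log afterwards is just a select against dba.CommandLog. Here is a sketch using Invoke-Sqlcmd from the SqlServer module (the server, database and login below are placeholders):

# Show the most recent commands logged by IndexOptimize.
Invoke-Sqlcmd -ServerInstance 'yourserver.database.windows.net' -Database 'yourdatabase' `
    -Username 'youradmin' -Password 'yourpassword' `
    -Query 'SELECT TOP (20) * FROM dba.CommandLog ORDER BY StartTime DESC;'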

 

3. Let’s create an automation account.


In the Azure portal, go to “Automation Accounts” and click “Add” to create one.

Then we input a name for the account, a resource group and a location. For performance reasons it’s best to place the automation account in the same datacenter as the database itself.
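The account can also be created from PowerShell if you prefer; a minimal sketch using the Az.Automation module (the resource group, account name and region below are placeholders):

# Requires the Az modules and an authenticated session (Connect-AzAccount).
# Resource group, account name and region are placeholders - use your own.
New-AzAutomationAccount `
    -ResourceGroupName 'rg-dba' `
    -Name 'aa-sqlmaint' `
    -Location 'West Europe'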

With the account set up, go ahead and click on “credentials” on the left. We need to set up a stored credential to enable the runbook to log into the database server.


Set a name for the credential (this will be referenced later), and input the administrative user for the database server and the password for said user.
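This, too, can be scripted; a sketch that stores the SQL admin login as a credential asset (the credential name “SqlMaintCred” and the account names are placeholders I reuse in the examples below):

# Prompt for the SQL admin login and store it as an Automation credential asset.
$sqlAdmin = Get-Credential -Message 'Azure SQL server admin login'
New-AzAutomationCredential `
    -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-sqlmaint' `
    -Name 'SqlMaintCred' `
    -Value $sqlAdmin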


Go ahead and click on “Overview”, followed by “Runbooks”.

On the runbook blade, click “Add a runbook” and fill out the name and the runbook type. The name must match the workflow name in the code we’re about to paste in (AzureMaint), and the type must be PowerShell Workflow.

Paste in the following code, overwriting the text that’s already there:

<#
.SYNOPSIS
Perform index maintenance

.DESCRIPTION
This runbook uses Azure Automation to perform reindexing and statistics maintenance of all databases on a target server.
As a prerequisite, please create an Azure Automation credential asset that contains the username and password for the target Azure SQL DB logical server ($SqlServerName).
Make sure that you have installed the scripts IndexOptimize.sql, CommandLog.sql and CommandExecute.sql of Ola Hallengren (https://ola.hallengren.com/downloads.html).
Make sure to get Ola's modified scripts which work on Azure here: https://ola.hallengren.com/downloads.html

.EXAMPLE
SQLServerIndexMaintenance

.NOTES
AUTHOR:
Original author Pieter Vanhove: http://pietervanhove.azurewebsites.net/?p=14137
Heavily modified by Larry Silverman, CTO, TrackAbout
Very slightly modified by Alexander Arvidsson: http://www.arcticdba.se
#>

workflow AzureMaint
{
    param (
        # Fully-qualified name of the Azure DB server
        [parameter(Mandatory=$true)]
        [string] $SqlServerName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        # When using in the Azure Automation UI, please enter the name of the credential asset for the "Credential" parameter
        [parameter(Mandatory=$true)]
        [PSCredential] $Credential
    )

    inlinescript {
        # Set up credentials
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password
        $databases = @()

        # Create connection to Master DB
        Try {
            $MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
            $MasterDatabaseConnection.ConnectionString = "Server = $ServerName; Database = Master; User ID = $UserId; Password = $Password;"
            $MasterDatabaseConnection.Open();

            # Create command to query the name of active databases in $ServerName
            $MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
            $MasterDatabaseCommand.Connection = $MasterDatabaseConnection
            $MasterDatabaseCommand.CommandText =
            "
            select name from sys.databases
            where state_desc = 'ONLINE'
            and name <> 'master'
            "

            $MasterDbResult = $MasterDatabaseCommand.ExecuteReader()

            while ($MasterDbResult.Read()) {
                $databases += @($MasterDbResult[0].ToString())
            }
        }
        # Catch errors connecting to master database.
        Catch {
            Write-Error $_
        }
        Finally {
            if ($null -ne $MasterDatabaseConnection) {
                $MasterDatabaseConnection.Close()
                $MasterDatabaseConnection.Dispose()
            }
        }

        # Create connection for each individual database
        # Iterate through each database under $ServerName
        foreach ($DbName in $databases) {
            Try {
                # Setup connection string for $DbName
                $ChildDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
                $ChildDatabaseConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
                $ChildDatabaseConnection.Open();

                # Create command for a specific database $DBName
                $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
                $DatabaseCommand.Connection = $ChildDatabaseConnection

                Write-Output "Performing index and statistics maintenance on $DbName"

                $DatabaseCommand.CommandText = "
                EXECUTE dba.IndexOptimize
                @Databases = '" + $DbName + "',
                @FragmentationLow = NULL,
                @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
                @FragmentationLevel1 = 5,
                @FragmentationLevel2 = 30,
                @UpdateStatistics = 'ALL',
                @OnlyModifiedStatistics = 'Y',
                @LogToTable = 'Y'
                "
                $DatabaseCommand.CommandTimeout = 0
                # Write-Output $DatabaseCommand.CommandText
                $NonQueryResult = $DatabaseCommand.ExecuteNonQuery()
            }

            # Inner catch for individual database failures.
            # We want to keep processing the next database.
            Catch {
                Write-Error $_
            }

            Finally {
                if ($null -ne $ChildDatabaseConnection) {
                    $ChildDatabaseConnection.Close()
                    $ChildDatabaseConnection.Dispose()
                }
            }
        }
    }
}

 

I’ve made some very slight modifications to the script that Larry Silverman in turn modified from Pieter Vanhove’s original: my changes consist of changing the schema name from dbo to dba and turning on logging.
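If you’d rather not paste the code by hand, the workflow can also be imported and published from PowerShell; a sketch, assuming the code above has been saved locally as AzureMaint.ps1 (the file name and account names are placeholders):

# Import the file as a PowerShell Workflow runbook and publish it in one go.
# The runbook name must match the workflow name in the script (AzureMaint).
Import-AzAutomationRunbook `
    -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-sqlmaint' `
    -Name 'AzureMaint' `
    -Path '.\AzureMaint.ps1' `
    -Type PowerShellWorkflow `
    -Published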


At this stage it’s a good idea to test the runbook, so click “Test pane” up top. Input the parameters (the fully qualified server name and the name of the credential we set up previously) and make sure everything runs as it should.
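You can also start a real (non-test) job from PowerShell to verify things end to end; a sketch where the parameter names match the workflow above, and the credential is passed as the name of the credential asset (the server and asset names are placeholders):

# Start a job, passing the runbook parameters as a hashtable.
# For the PSCredential parameter, Azure Automation expects the name of the credential asset.
Start-AzAutomationRunbook `
    -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-sqlmaint' `
    -Name 'AzureMaint' `
    -Parameters @{
        SqlServerName = 'yourserver.database.windows.net'
        Credential    = 'SqlMaintCred'
    }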

If everything worked as expected, let’s go ahead and schedule the runbook.

Click schedule.

This part looks a lot like scheduling a job via the agent.

The parameter input is the only real difference from scheduling via the agent: enter the required parameters (the database server and the credential for the server administrator) and you’re good to go.
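Scheduling can be scripted as well; a sketch that creates a daily schedule and links the runbook to it (the schedule name, start time and the other names are placeholders):

# Create a daily schedule starting tomorrow at 02:00...
New-AzAutomationSchedule `
    -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-sqlmaint' `
    -Name 'NightlyIndexMaintenance' `
    -StartTime (Get-Date).Date.AddDays(1).AddHours(2) `
    -DayInterval 1

# ...and register the runbook against it, including the runbook parameters.
Register-AzAutomationScheduledRunbook `
    -ResourceGroupName 'rg-dba' `
    -AutomationAccountName 'aa-sqlmaint' `
    -RunbookName 'AzureMaint' `
    -ScheduleName 'NightlyIndexMaintenance' `
    -Parameters @{
        SqlServerName = 'yourserver.database.windows.net'
        Credential    = 'SqlMaintCred'
    }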

A word of caution: as each and every database will hold its own copy of the scripts and table, it’s imperative to keep them at the same version. That is, when updating the scripts in one database, make sure you do it on all the other databases on the same server.
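One way to keep the copies in sync is to push the scripts out to every database from PowerShell; a rough sketch using Invoke-Sqlcmd from the SqlServer module (the server name, login prompt and database list are placeholders, and the dbo-to-dba replace is assumed to already be done in the files):

# Deploy the dba-schema versions of Ola's scripts to every database on the server.
$server    = 'yourserver.database.windows.net'
$cred      = Get-Credential -Message 'Azure SQL server admin login'
$databases = 'db1', 'db2', 'db3'   # example list - enumerate your own databases

foreach ($db in $databases) {
    foreach ($script in 'CommandLog.sql', 'CommandExecute.sql', 'IndexOptimize.sql') {
        Invoke-Sqlcmd -ServerInstance $server -Database $db `
            -Username $cred.UserName -Password $cred.GetNetworkCredential().Password `
            -InputFile $script
    }
}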

6 replies
  1. Klaas Vandenberghe says:

    Hi Alexander

    Thank you for this article. I’m just starting with Azure SQL Database and this is very helpful.
    Do you still use this the way it is written, or have you made any modifications in the past 9 months?

    What would be your thoughts on including the first steps in the runbook?
    I mean: check if the schema, table and SP’s exist and if not create them, execute the indexoptimize as described and in the end drop the SP’s.
    First, it eliminates any separate work in advance, and secondly, this way we can be sure the version of the SP’s is always the one we want.
    Or are there downsides on this approach?

    Also, do you execute the integritycheck the same way?

    Klaas

    • alexander says:

      Klaas,

      That was a very interesting idea! To answer your first question: no, there have been no substantial changes that I’m aware of that would lead me to change how I run maintenance.
      As for your second question about checking for existence and creating the SPs: off the top of my head I can’t think of any reason, apart from security, why this would be a bad idea per se. The user involved is already a highly privileged one (if it weren’t, the lack of privileges might pose an issue) and could just as easily go grab the code from somewhere and execute it. It would be a shame if someone were to tinker with the SP prior to execution, though 😛

      And yes, I run the integrity check the same way.

  2. Ben says:

    “A word of caution: as each and every database will hold its own copy of the scripts and table, it’s imperative to keep them at the same version. That is, when updating the scripts in one database, make sure you do it on all the other databases on the same server.”

    Why not nominate one database to hold the stored procedures or create a separate database on the same server for that purpose? That way you only need one copy and which databases a given task runs against can be passed as a parameter.

    • alexander says:

      I’m not entirely sure I understand what you mean; cross-database scripting is still not supported in Azure SQL Database (as far as I know)?


Trackbacks & Pingbacks

  1. […] for the best blog posts that I could find on the subject, and the one I LOVED the most was here: Arctic DBA.  He broke it down so simply, that I finally created my own pseudo installer and I wanted to share […]

