Using Microsoft SQL Server backups for point-in-time recovery on Compute Engine

In this tutorial, you perform backups on a Compute Engine SQL Server instance. The tutorial shows you how to manage these backups, store them in Cloud Storage, and restore a database to a specific point in time.

This tutorial is useful if you are a system administrator, developer, database administrator, or DevOps engineer who wants to back up SQL Server data.

The tutorial assumes that you are familiar with Compute Engine, Cloud Storage, and Microsoft SQL Server.

Objectives

  • Launch a SQL Server instance and create a database.
  • Perform full, differential, and transaction log backups.
  • Upload the backups to Cloud Storage.
  • Restore the database from a Cloud Storage backup.

Costs

This tutorial uses the following billable components of Google Cloud Platform: Compute Engine and Cloud Storage.

You can use the pricing calculator to generate a cost estimate based on your projected usage. New GCP users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. Select or create a GCP project.

    Go to the Project selector page

  3. Make sure that billing is enabled for your Google Cloud Platform project.

    Learn how to enable billing

  4. Enable the Compute Engine API.

    Enable the API

When you finish this tutorial, you can avoid continued billing by deleting the resources you created. For more information, see Cleaning up.

Preparing the SQL Server instance

In this section, you launch the SQL Server instance, prepare the database, and configure an encryption key.

Launch the SQL Server instance

Your first task is to launch a SQL Server instance and create the backup folder.

  1. Open Cloud Shell:

    Go to Cloud Shell

  2. Launch a SQL Server instance:

    gcloud compute instances create sqlserver \
        --zone=us-central1-c \
        --machine-type=n1-standard-1 \
        --image-family=sql-std-2016-win-2016 \
        --image-project=windows-sql-cloud \
        --boot-disk-size=50GB \
        --boot-disk-type=pd-standard \
        --tags=sqlserver \
        --scopes=https://www.googleapis.com/auth/cloud-platform
    
  3. Install the Chrome RDP for GCP extension.

  4. Go to the VM instances page in the GCP Console and find the Windows instance you want to connect to:

    Go to the VM instances page

  5. Set the initial password for the instance. Store the password in a safe place.

  6. Click the RDP button for the instance you want to connect to. The Chrome RDP extension opens.

  7. Enter your username and password (leave the Domain field blank), and then click OK to connect.


    You are prompted to accept the certificate.

  8. If you accept the terms, click Continue.

  9. In the instance, minimize all windows, and then open Google Cloud SDK Shell (not the same as Cloud Shell) on the Windows desktop.

  10. Create a backup folder:

    mkdir c:\backup
    
  11. Create a restore folder:

    mkdir c:\restore
    

Prepare the database

  1. In Google Cloud SDK Shell, on the instance, create a test database:

    osql -E -Q "create database testdb"
    
  2. Create a test table:

    osql -E -Q "create table testdb.dbo.testtable(status varchar(255))"
    

Configure the encryption key

  1. In Google Cloud SDK Shell, create a master database key:

    osql -E -Q "USE master;CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Master1!';"
    
  2. Create a backup certificate:

    osql -E -Q "USE master; CREATE CERTIFICATE testdbcert WITH SUBJECT = 'testdb certificate';"
    

Performing backups

In this section, you create full, differential, and transaction log backups while changing the database between each one.

  1. Add a row to the test table:

    osql -E -Q "insert into testdb.dbo.testtable ([status]) VALUES ('Initial')"
    
  2. In Google Cloud SDK Shell, perform a full backup:

    osql -E -Q "BACKUP DATABASE testdb TO DISK='c:\backup\testdb.bak' WITH INIT,
        COMPRESSION,
        ENCRYPTION
        (
            ALGORITHM = AES_256,
            SERVER CERTIFICATE = testdbcert
        )"
    
  3. Add a row to the test table:

    osql -E -Q "insert into testdb.dbo.testtable ([status]) VALUES ('After Full Backup')"
    
  4. Perform a differential backup:

    osql -E -Q "BACKUP DATABASE testdb TO DISK='c:\backup\testdb-diff.bak' WITH DIFFERENTIAL,
        COMPRESSION,
        ENCRYPTION
        (
            ALGORITHM = AES_256,
            SERVER CERTIFICATE = testdbcert
        )"
    
  5. Add a row to the test table:

    osql -E -Q "insert into testdb.dbo.testtable ([status]) VALUES ('After Diff Backup')"
    
  6. Perform a transaction log backup:

    osql -E -Q "BACKUP LOG testdb TO DISK='c:\backup\testdb-log.bak' WITH COMPRESSION,
        ENCRYPTION
        (
            ALGORITHM = AES_256,
            SERVER CERTIFICATE = testdbcert
        )"
    
  7. Add a row to the test table:

    osql -E -Q "insert into testdb.dbo.testtable ([status]) VALUES ('Bad Row')"
    
  8. Verify the rows in the table:

    osql -E -Q "select * from testdb.dbo.testtable"
    

    The output looks like this:

    Initial
    After Full Backup
    After Diff Backup
    Bad Row
    

Managing your backups

In this section, you store your backups remotely in Cloud Storage, configure your Cloud Storage bucket to prune old backups automatically, and schedule your backups. If you're running multiple database servers, consider creating multiple directories in your Cloud Storage bucket to represent different servers.

Upload your backups to Cloud Storage

Now that you have a few backup files, you can upload your backups to Cloud Storage.

  1. In Cloud Shell, create a Cloud Storage bucket. Bucket names must be globally unique across GCP. To ensure that your bucket name is unique, consider namespacing it with your project name, as shown here:

    gsutil mb "gs://${DEVSHELL_PROJECT_ID}-sql-backups"
    
  2. In the shell window in the RDP session, copy your files to your Cloud Storage bucket. In the following command, replace [BUCKET_NAME] with the name of the bucket you just created.

    gsutil cp -n c:\backup\testdb*.bak gs://[BUCKET_NAME]
    

    You can use the gsutil cp command to create entire directory structures and to upload multiple files at a time.
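For example, if several database servers share one bucket, you can keep each server's files under its own prefix. This is a sketch, not part of the tutorial's required steps; the server directory name `sqlserver-01` is a placeholder, and `[BUCKET_NAME]` is the bucket you created earlier:

```shell
# Hypothetical layout: one directory per database server in the same bucket.
gsutil cp -n c:\backup\testdb*.bak gs://[BUCKET_NAME]/sqlserver-01/

# List the backups for that server.
gsutil ls gs://[BUCKET_NAME]/sqlserver-01/
```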

Set up automatic file pruning in Cloud Storage

Older backups eventually outlive their usefulness, so you need to remove them. To help automate this process, Cloud Storage has a lifecycle management mechanism that you can use to manage the lifecycle of your backup files.

To configure lifecycle management for the objects in your bucket:

  1. In Cloud Shell, create a JSON lifecycle configuration file. This file instructs Cloud Storage to delete files after 30 days:

    bash -c 'cat <<EOF >  lifecycle.json
    {
        "lifecycle": {
            "rule": [{
                "action": { "type": "Delete" },
                "condition": { "age": 30 }
            }]
        }
    }
    EOF'
    
  2. Set the lifecycle configuration for your Cloud Storage bucket. Replace [BUCKET_NAME] with the name of your bucket:

    gsutil lifecycle set lifecycle.json gs://[BUCKET_NAME]
    
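Before applying a lifecycle policy, you can sanity-check the JSON locally; a malformed file causes `gsutil lifecycle set` to fail. This sketch recreates the configuration from the previous step and validates it with Python's built-in JSON parser, which is available in Cloud Shell:

```shell
# Recreate the lifecycle configuration (same content as above).
cat <<EOF > lifecycle.json
{
    "lifecycle": {
        "rule": [{
            "action": { "type": "Delete" },
            "condition": { "age": 30 }
        }]
    }
}
EOF

# json.tool exits non-zero on invalid JSON, so this only prints on success.
python3 -m json.tool lifecycle.json > /dev/null && echo "lifecycle.json is valid"
```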

Schedule your backups

In general, it's a good practice to take a full backup periodically and perform differential backups until the next full backup. On Windows, one way to schedule backups is by using scheduled tasks.

If you create a backup script that takes a series of backups, make sure to include some logical validation steps at each point to verify successful completion. If validation fails, make sure the script raises a Windows alert. In addition, to avoid filling up the local disk, make sure the script removes the local backup file after successfully uploading to Cloud Storage.
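As a minimal sketch of such a script, the following batch file takes a differential backup, treats osql's exit code as the validation step, raises a Windows event on failure, and deletes the local file only after a successful upload. The file names, the bucket name, the event ID, and the schedule are all assumptions, not values from this tutorial:

```shell
rem nightly-backup.cmd -- sketch of a scheduled differential backup.
rem Assumes the testdb database, c:\backup, and [BUCKET_NAME] from this tutorial.
osql -E -Q "BACKUP DATABASE testdb TO DISK='c:\backup\testdb-diff.bak' WITH DIFFERENTIAL, COMPRESSION"
if errorlevel 1 (
    eventcreate /t ERROR /id 100 /l APPLICATION /so SQLBackup /d "testdb backup failed"
    exit /b 1
)

gsutil cp c:\backup\testdb-diff.bak gs://[BUCKET_NAME]
if errorlevel 1 exit /b 1

rem Remove the local copy only after a successful upload.
del c:\backup\testdb-diff.bak
```

You could then register the script with the Task Scheduler, for example to run nightly at 02:00: `schtasks /create /tn "SQL nightly diff backup" /tr c:\scripts\nightly-backup.cmd /sc daily /st 02:00`.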

Restoring from backup

In this section, you restore your SQL Server database from backup files that you stored in Cloud Storage.

  1. In Google Cloud SDK Shell in your RDP session, download your backup files from Cloud Storage. Replace [BUCKET_NAME] with the name of your SQL Server backup storage bucket:

    gsutil cp gs://[BUCKET_NAME]/testdb*.bak c:\restore
    
  2. Open SQL Server Management Studio:

  3. Click the Start button, and then click Microsoft SQL Server Tools 17 > Microsoft SQL Server Management Studio 17.

  4. Leave the Connection fields as is, and then click Connect.

  5. In the left-hand pane, expand Databases.

  6. Right-click testdb, and in the pop-up menu, click Tasks > Restore > Database.

  7. Import the backup files to the console:

    1. For Source, select Device.
    2. Click the browse (...) button.
    3. In the dialog that opens, click Add, select all the files under C:\restore, and then click OK.
    4. Click OK.
  8. To view point-in-time restore options:

    1. Click Timeline.
    2. Under Restore to, click Specific date and time.
    3. For Timeline Interval, select hour.
    4. Choose a time in the Transaction Log Backup range.
    5. Click Cancel to leave the timeline screen.

  9. In this tutorial, you restore the database to its state at the time of the differential backup, before the bad row was inserted. To do that, clear the transaction log row in the backups list.


  10. Start the restore process:

    1. Click OK.
    2. Wait for the restore process to finish. When you see the message Database 'testdb' restored successfully, click OK.
  11. List the rows in the test table:

    osql -E -Q "select * from testdb.dbo.testtable;"
    

    The output shows two rows:

    Initial
    After Full Backup
    

    You see the rows that were inserted before the differential backup was taken. The 'After Diff Backup' row is captured only in the transaction log backup, which you excluded from the restore.
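The same point-in-time restore can also be scripted instead of using the Management Studio dialogs. The sketch below restores the full and differential backups, then replays the transaction log up to a chosen moment with the STOPAT option; the timestamp is a placeholder that you would replace with a time before the unwanted change:

```shell
rem Scripted point-in-time restore (sketch). Run while no other
rem connections are using testdb; the STOPAT timestamp is a placeholder.
osql -E -Q "RESTORE DATABASE testdb FROM DISK='c:\restore\testdb.bak' WITH REPLACE, NORECOVERY"
osql -E -Q "RESTORE DATABASE testdb FROM DISK='c:\restore\testdb-diff.bak' WITH NORECOVERY"
osql -E -Q "RESTORE LOG testdb FROM DISK='c:\restore\testdb-log.bak' WITH STOPAT='2019-08-01 12:00:00', RECOVERY"
```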

Cleaning up

Delete the project

  1. In the GCP Console, go to the Projects page.

    Go to the Projects page

  2. In the project list, select the project you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next
