Migrating data between SQL Server 2017 and Cloud SQL for SQL Server using backup files


In this tutorial, you migrate data from SQL Server 2017 Enterprise running on Compute Engine to Cloud SQL for SQL Server 2017 Enterprise. The tutorial shows you how to export the data from SQL Server 2017, import the data to Cloud SQL for SQL Server 2017, and validate that the data was successfully imported.

If you want to migrate data from SQL Server 2008, see Migrating data between SQL Server 2008 and Cloud SQL for SQL Server using backup files.

This tutorial is useful if you are a sysadmin, developer, engineer, database admin, or DevOps engineer who wants to migrate data from SQL Server 2017 to Cloud SQL for SQL Server.

The tutorial assumes that you're familiar with the following:

  • Compute Engine
  • Cloud SQL for SQL Server
  • Cloud Storage
  • Microsoft SQL Server

Objectives

  • Create a SQL Server virtual machine (VM) instance on Compute Engine to host the sample database.
  • Create a Cloud SQL for SQL Server instance.
  • Create a Cloud Storage bucket.
  • Back up the sample database.
  • Import the database into Cloud SQL for SQL Server.
  • Validate the imported data.

Costs

In this document, you use the following billable components of Google Cloud:

  • Compute Engine
  • Cloud SQL
  • Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

When you finish the tasks that are described in this document, you can avoid continued billing by deleting the resources that you created. For more information, see Clean up.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Enable the Cloud Storage, Cloud SQL Admin, and Compute Engine APIs.

    Enable the APIs

  5. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
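
Cloud Shell commands in this tutorial assume that your current project is set. As an optional sanity check that isn't part of the original steps, you can print the active project and confirm that it matches the project you selected:

    gcloud config list --format='value(core.project)'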

Creating a script for a sample database

In this section, you create a script that builds and populates a sample database. Then you create a SQL Server 2017 instance on Compute Engine that runs this script at startup.

  1. In Cloud Shell, create a PowerShell script to deploy as a startup script:

    cat << 'EOF' > startup.ps1
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    $dataseturl = "https://opendata.maryland.gov/api/views/ryxx-aeaf/rows.csv?accessType=DOWNLOAD"
    Invoke-WebRequest -Uri $dataseturl -OutFile c:\dataset.csv
    Invoke-Sqlcmd -Query "CREATE DATABASE testdb;"
    Invoke-Sqlcmd -Query "CREATE TABLE testdb.dbo.MarylandZipCodes ( zipcode int NOT NULL PRIMARY KEY,  city varchar(255), county varchar(255))"
    Invoke-Sqlcmd -Query "BULK INSERT testdb.dbo.MarylandZipCodes FROM 'c:\dataset.csv' WITH (DATAFILETYPE='char',FIELDTERMINATOR=',' ,ROWTERMINATOR = '0x0a',  FIRSTROW=2);"
    EOF
    

    This startup script creates a local database named testdb with a table called MarylandZipCodes. Then the script downloads the Maryland public zip code registry and imports the registry into the table.

  2. Create a SQL Server 2017 Enterprise instance for Windows Server 2016:

    gcloud compute instances create sql-server \
        --machine-type n1-highmem-4 \
        --boot-disk-size 50GB \
        --image-project windows-sql-cloud \
        --image-family sql-ent-2017-win-2016 \
        --zone us-central1-f \
        --scopes=https://www.googleapis.com/auth/cloud-platform \
        --metadata-from-file windows-startup-script-ps1=startup.ps1
    

    For this tutorial, you create the instance in the us-central1-f zone with a boot disk size of 50 GB. For more information about zones, see Cloud locations. The --metadata-from-file flag sets the PowerShell script to be the startup script for the instance.
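
The startup script runs after the VM boots, so the data load can take a few minutes to finish. As an optional check that isn't part of the original steps, you can watch the instance's serial port output for the startup-script completion message (the exact log text can vary by image version):

    gcloud compute instances get-serial-port-output sql-server \
        --zone us-central1-f | grep -i startup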

Creating the Cloud SQL instance and Cloud Storage bucket

  1. In Cloud Shell, create the Cloud SQL for SQL Server 2017 Enterprise instance that you later migrate the database to:

    gcloud beta sql instances create target  \
        --database-version=SQLSERVER_2017_ENTERPRISE \
        --cpu=2 \
        --memory=5GB \
        --root-password=sqlserver12@ \
        --zone=us-central1-f
    

    It can take a few minutes to create the instance. The default admin username is sqlserver, and the --root-password flag sets its password, in this case to sqlserver12@. For this tutorial, you create the instance in the us-central1-f zone. For more information about zones, see Cloud locations.

  2. Create a Cloud Storage bucket to store the backup file before the data is imported into Cloud SQL:

    gsutil mb -l US "gs://${DEVSHELL_PROJECT_ID}-sql-backups"
    
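Before you move on, you can optionally confirm that both resources are ready. This is a minimal check, assuming the instance and bucket names used above:

    gcloud sql instances describe target --format='value(state)'
    gsutil ls -b "gs://${DEVSHELL_PROJECT_ID}-sql-backups"

The instance state should be RUNNABLE, and the gsutil command should print the bucket URL.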

Backing up the sample database

In this section, you connect to the SQL Server virtual machine (VM), create a backup of the database, and upload the backup file to Cloud Storage.

  1. In Cloud Shell, reset the password for the SQL Server VM instance:

    gcloud compute reset-windows-password sql-server --zone=us-central1-f
    

    Make a note of the new password for the user.

  2. Install a Remote Desktop Protocol (RDP) client. For more information, see Microsoft Remote Desktop clients.

  3. In the Google Cloud console, go to the VM instances page.

    Go to VM instances

  4. In the Compute Engine section of the Google Cloud console, click the RDP dropdown and select the Download the RDP file option to download the RDP file for your SQL Server VM instance.

    Use this file to connect to the instance using an RDP client.

  5. In the RDP session, minimize all windows, and then launch the PowerShell command-line tool.

  6. Create a backup folder:

    mkdir c:\backup
    
  7. In the PowerShell session, create a backup of the database in the backup folder (an optional way to verify this backup appears after this procedure):

    osql -E -Q "BACKUP DATABASE testdb TO DISK='c:\backup\testdb.bak'"
    
  8. Still in the PowerShell session, copy the backup file to the Cloud Storage bucket:

    $PROJECT_ID=(gcloud config get-value core/project)
    
    gsutil cp -n c:\backup\testdb.bak gs://$PROJECT_ID-sql-backups
    
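Before relying on the uploaded file, you can optionally verify the integrity of the backup from the same PowerShell session. This check isn't part of the original procedure; it uses SQL Server's RESTORE VERIFYONLY statement against the file that you created in step 7:

    osql -E -Q "RESTORE VERIFYONLY FROM DISK='c:\backup\testdb.bak'"

If the backup is readable, osql reports that the backup set is valid.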

Importing the backup file into Cloud SQL

  1. In Cloud Shell, retrieve the service account that's associated with the Cloud SQL instance and save it in a variable:

    SVC_EMAIL_ADDRESS=$(gcloud sql instances describe target \
        --format='value(serviceAccountEmailAddress)')
    echo $SVC_EMAIL_ADDRESS
    

    Google Cloud created this service account when you created the Cloud SQL instance. You use the service account to give the Cloud SQL instance permission to access the resources that it needs.

  2. Add the service account to the Cloud Storage bucket's access-control list (ACL) with writer permissions:

    gsutil acl ch -u ${SVC_EMAIL_ADDRESS}:W gs://${DEVSHELL_PROJECT_ID}-sql-backups
    
  3. Grant the service account reader permissions on the backup file:

    gsutil acl ch -u $SVC_EMAIL_ADDRESS:R gs://${DEVSHELL_PROJECT_ID}-sql-backups/testdb.bak
    
  4. Import the backup file to the Cloud SQL database:

    gcloud beta sql import bak target \
        gs://${DEVSHELL_PROJECT_ID}-sql-backups/testdb.bak --database testdb
    
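The import runs as a Cloud SQL operation. To check its progress or confirm that it completed (an optional step that isn't part of the original flow), you can list the most recent operation for the instance:

    gcloud sql operations list --instance=target --limit=1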

Validating the data import

In this section, you check that the sample data was successfully imported.

  1. In Cloud Shell, install the SQL Server toolkit:

    sudo apt-get install -y mssql-tools
    

    If you accept the license terms, enter yes when you're prompted.

  2. To securely access your Cloud SQL instance, download the Cloud SQL proxy and make it executable:

    wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
    chmod +x cloud_sql_proxy
    
  3. Start the Cloud SQL proxy:

    CONNECTION_NAME=$(gcloud sql instances describe target --format='value(connectionName)')
    ./cloud_sql_proxy -instances=${CONNECTION_NAME}=tcp:1433 &
    
  4. Read the number of rows in the Cloud SQL table:

    /opt/mssql-tools/bin/sqlcmd -U sqlserver -S 127.0.0.1 \
        -Q "SELECT COUNT(*) FROM testdb.dbo.MarylandZipCodes"
    

    When you're prompted, enter the password sqlserver12@. In the output, make sure that the result is 619.
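
Beyond the row count, you can optionally spot-check a few rows to confirm that the columns imported cleanly. This query is illustrative and uses the same proxy connection:

    /opt/mssql-tools/bin/sqlcmd -U sqlserver -S 127.0.0.1 \
        -Q "SELECT TOP 5 zipcode, city, county FROM testdb.dbo.MarylandZipCodes ORDER BY zipcode"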

Clean up

The easiest way to eliminate billing is to delete the Google Cloud project you created for the tutorial.

Delete the project

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next