Migrating data between SQL Server 2017 and Cloud SQL for SQL Server using backup files

In this tutorial, you migrate data from SQL Server 2017 Enterprise running on Compute Engine to Cloud SQL for SQL Server 2017 Enterprise. The tutorial shows you how to export the data from SQL Server 2017, import the data to Cloud SQL for SQL Server 2017, and validate that the data was successfully imported.

If you want to migrate data from SQL Server 2008, see Migrating data between SQL Server 2008 and Cloud SQL for SQL Server using backup files.

This tutorial is useful if you are a sysadmin, developer, engineer, database admin, or DevOps engineer who wants to migrate data from SQL Server 2017 to Cloud SQL for SQL Server.

The tutorial assumes that you're familiar with Compute Engine, Cloud SQL, Cloud Storage, and Microsoft SQL Server.

Objectives

  • Create a SQL Server virtual machine (VM) instance on Compute Engine to host the sample database.
  • Create a Cloud SQL for SQL Server instance.
  • Create a Cloud Storage bucket.
  • Back up the sample database.
  • Import the database into Cloud SQL for SQL Server.
  • Validate the imported data.

Costs

This tutorial uses the following billable components of Google Cloud:

  • Compute Engine
  • Cloud SQL
  • Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

When you finish this tutorial, you can avoid continued billing by deleting the resources you created. For more information, see Cleaning up.

Before you begin

  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. In the Cloud Console, on the project selector page, select or create a Cloud project.

    Go to the project selector page

  3. Make sure that billing is enabled for your Google Cloud project. Learn how to confirm billing is enabled for your project.

  4. Enable the Cloud Storage, Cloud SQL Admin, and Compute Engine APIs.

    Enable the APIs

  5. In the Cloud Console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Cloud Console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Cloud SDK already installed, including the gcloud command-line tool, and with values already set for your current project. It can take a few seconds for the session to initialize.
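
    To confirm that Cloud Shell is set to the project you selected, you can print the active project ID; the output should match the project ID you chose on the project selector page:

    gcloud config get-value project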

Creating a script for a sample database

In this section, you create a startup script that builds and populates a sample database. Then you create a SQL Server 2017 instance on Compute Engine that runs the script.

  1. In Cloud Shell, create a PowerShell script to deploy as a startup script:

    cat << 'EOF' > startup.ps1
    # Use TLS 1.2 for the download that follows.
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    # Download the Maryland ZIP code dataset as a CSV file.
    $dataseturl = "https://opendata.maryland.gov/api/views/ryxx-aeaf/rows.csv?accessType=DOWNLOAD"
    Invoke-WebRequest -Uri $dataseturl -OutFile c:\dataset.csv
    # Create the sample database and table, then bulk-load the CSV into the table.
    Invoke-Sqlcmd -Query "CREATE DATABASE testdb;"
    Invoke-Sqlcmd -Query "CREATE TABLE testdb.dbo.MarylandZipCodes ( zipcode int NOT NULL PRIMARY KEY,  city varchar(255), county varchar(255))"
    Invoke-Sqlcmd -Query "BULK INSERT testdb.dbo.MarylandZipCodes FROM 'c:\dataset.csv' WITH (DATAFILETYPE='char',FIELDTERMINATOR=',' ,ROWTERMINATOR = '0x0a',  FIRSTROW=2);"
    EOF
    

    This startup script creates a local database named testdb with a table called MarylandZipCodes. Then the script downloads the Maryland public zip code registry and imports the registry into the table.

  2. Create a SQL Server 2017 Enterprise instance for Windows Server 2016:

    gcloud compute instances create sql-server \
        --machine-type n1-highmem-4 \
        --boot-disk-size 50GB \
        --image-project windows-sql-cloud \
        --image-family sql-ent-2017-win-2016 \
        --zone us-central1-f \
        --scopes=https://www.googleapis.com/auth/cloud-platform \
        --metadata-from-file windows-startup-script-ps1=startup.ps1
    

    For this tutorial, you create the instance in the us-central1-f zone with a boot disk size of 50 GB. For more information about zones, see Cloud locations. The --metadata-from-file flag sets the PowerShell script to be the startup script for the instance.
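
    The startup script runs the first time the VM boots, and the download and bulk insert can take a few minutes. One way to check on its progress is to read the instance's serial port output, which includes startup script logging:

    gcloud compute instances get-serial-port-output sql-server \
        --zone us-central1-f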

Creating the Cloud SQL instance and Cloud Storage bucket

  1. In Cloud Shell, create the Cloud SQL for SQL Server 2017 Enterprise instance that you later migrate the database to:

    gcloud beta sql instances create target  \
        --database-version=SQLSERVER_2017_ENTERPRISE \
        --cpu=2 \
        --memory=5GB \
        --root-password=sqlserver12@ \
        --zone=us-central1-f
    

    It can take a few minutes to create the instance. The default admin username is sqlserver; the --root-password flag sets its password, which is sqlserver12@ in this tutorial. You create the instance in the us-central1-f zone. For more information about zones, see Cloud locations.

  2. Create a Cloud Storage bucket to store the backup file before the data is imported into Cloud SQL:

    gsutil mb -l US "gs://${DEVSHELL_PROJECT_ID}-sql-backups"
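
    Before you continue, you can verify that both resources are ready. The instance state should be RUNNABLE:

    gcloud sql instances describe target --format='value(state)'
    gsutil ls -b "gs://${DEVSHELL_PROJECT_ID}-sql-backups"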
    

Backing up the sample database

In this section, you connect to the SQL Server virtual machine (VM), create a backup of the database, and upload the backup file to Cloud Storage.

  1. In Cloud Shell, reset the password for the SQL Server VM instance:

    gcloud compute reset-windows-password sql-server --zone=us-central1-f
    

    Make a note of the username and password in the output; you use them to sign in to the VM over RDP.

  2. In the browser, install the Chrome RDP for Google Cloud extension.

  3. In the Cloud Console, go to the VM instances page.

    Go to the VM instances page

  4. To open the Chrome RDP Extension, click the RDP button for the sql-server VM instance that you want to connect to.

  5. In the username and password fields, enter the username and password that you previously made a note of for the SQL Server VM instance.

  6. Leave the Domain field blank, and then click OK to connect to the SQL Server VM.

    When you're prompted, accept the certificate.

  7. If you accept the terms, click Continue.

  8. In the RDP session, minimize all windows, and then launch the PowerShell command-line tool.

  9. Create a backup folder:

    mkdir c:\backup
    
  10. In PowerShell, create a backup of the database in the backup folder:

    osql -E -Q "BACKUP DATABASE testdb TO DISK='c:\backup\testdb.bak'"
    
  11. In the same PowerShell window, copy the backup file to the Cloud Storage bucket:

    $PROJECT_ID=(gcloud config get-value core/project)
    
    gsutil cp -n c:\backup\testdb.bak gs://$PROJECT_ID-sql-backups
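
    To confirm that the upload succeeded, list the object from the same PowerShell window:

    gsutil ls -l gs://$PROJECT_ID-sql-backups/testdb.bak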
    

Importing the backup file into Cloud SQL

  1. In Cloud Shell, retrieve the service account that's associated with the Cloud SQL instance and save it in a variable:

    SVC_EMAIL_ADDRESS=$(gcloud sql instances describe target \
        --format='value(serviceAccountEmailAddress)')
    echo $SVC_EMAIL_ADDRESS
    

    Google Cloud created this service account when you created the Cloud SQL instance. You use the service account to give the Cloud SQL instance permission to access the resources that it needs.

  2. Add the service account to the Cloud Storage bucket's access-control list (ACL) with writer permissions:

    gsutil acl ch -u ${SVC_EMAIL_ADDRESS}:W gs://${DEVSHELL_PROJECT_ID}-sql-backups
    
  3. Give the service account reader permissions on the backup file:

    gsutil acl ch -u ${SVC_EMAIL_ADDRESS}:R gs://${DEVSHELL_PROJECT_ID}-sql-backups/testdb.bak
    
  4. Import the backup file to the Cloud SQL database:

    gcloud beta sql import bak target \
        gs://${DEVSHELL_PROJECT_ID}-sql-backups/testdb.bak --database testdb
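
    The import runs as an operation on the Cloud SQL instance. If the command takes a while, you can watch the operation's status from another Cloud Shell tab:

    gcloud sql operations list --instance=target --limit=3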
    

Validating the data import

In this section, you check that the sample data was successfully imported.

  1. In Cloud Shell, install the SQL Server toolkit:

    sudo apt-get install -y mssql-tools
    

    If you accept the license terms, enter yes when you're prompted.

  2. To securely access your Cloud SQL instance, download the Cloud SQL proxy:

    wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
    chmod +x cloud_sql_proxy
    
  3. Start the Cloud SQL proxy:

    CONNECTION_NAME=$(gcloud sql instances describe target --format='value(connectionName)')
    ./cloud_sql_proxy -instances=${CONNECTION_NAME}=tcp:1433 &
    
  4. Read the number of rows in the Cloud SQL table:

    /opt/mssql-tools/bin/sqlcmd -U sqlserver -S 127.0.0.1 \
        -Q "select count(*) from testdb.dbo.MarylandZipCodes"
    

    When you're prompted, enter the sqlserver12@ password. In the output, make sure that the result is 619.
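
    Beyond the row count, you can spot-check the imported contents. For example, the following query returns the first few rows of the table; enter the same password when you're prompted:

    /opt/mssql-tools/bin/sqlcmd -U sqlserver -S 127.0.0.1 \
        -Q "SELECT TOP 5 * FROM testdb.dbo.MarylandZipCodes"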

Cleaning up

The easiest way to eliminate billing is to delete the Cloud project you created for the tutorial.

Delete the project

  1. In the Cloud Console, go to the Manage resources page.

    Go to the Manage resources page

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID and then click Shut down to delete the project.
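
Alternatively, you can delete the project from Cloud Shell. A one-line sketch, assuming that DEVSHELL_PROJECT_ID still refers to the tutorial project:

    gcloud projects delete ${DEVSHELL_PROJECT_ID}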

What's next