Importing Data into Cloud SQL

This page describes how to import data into Google Cloud SQL. To learn more about importing data, read the Overview of Importing and Exporting Data.

Before you begin

Before you can import data into a Cloud SQL instance, you must meet the following requirements:

  • Your data must be ready for import into Cloud SQL.

    • SQL dump files must have been created with the correct parameters.
    • CSV files must be in the correct format.

    For help with creating SQL dump files or CSV files ready for import into Cloud SQL, see Exporting Data for Import into Cloud SQL.

  • You must have a Cloud SQL instance with the default user configured.

    For help with creating a Cloud SQL instance, see Creating Instances.

  • The database and table you are importing into must already exist on your Cloud SQL instance.

    For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE SQL statement in the mysql client.
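
For example, a minimal sketch of creating a database and table from the mysql client, assuming a hypothetical database named guestbook and table named entries (adjust the names and column definitions to match your data; the connection command assumes you connect over the instance's IP address):

    # Connect to the instance as the default user.
    mysql -h [INSTANCE_IP_ADDRESS] -u root -p

    -- Then, at the mysql> prompt:
    CREATE DATABASE guestbook;
    USE guestbook;
    CREATE TABLE entries (
      guestName VARCHAR(255),
      content VARCHAR(255),
      entryID INT NOT NULL AUTO_INCREMENT PRIMARY KEY
    );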

Importing SQL dump files to a Cloud SQL instance

To import a SQL dump file to a Cloud SQL instance:

Console

  1. Create your SQL dump file, if you haven't already, using the required parameters to ensure that your dump file is usable by Cloud SQL (a sketch follows these steps).

    For help with creating a SQL dump file, see Creating a SQL dump file.

  2. Create a Google Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Go to the Cloud SQL Instances page in the Google Cloud Platform Console.

  5. Select the instance to open its Instance details page.
  6. Click Import in the button bar.
  7. Enter the path to the bucket and SQL dump file you uploaded, or browse to it: click Browse, double-click the name of the bucket, select the file, and then click Select.

    You can use a compressed (.gz) or an uncompressed (.sql) file.

  8. Under Format, select SQL.
  9. Click Show advanced options and enter the name of the database in your Cloud SQL instance that you want the data to be imported into.

    This causes Cloud SQL to run a USE statement for that database before the import.

  10. Click Import to start the import.
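
Step 1 above points to Creating a SQL dump file for the required parameters. As a rough sketch only (the exact flag set is defined on that page and depends on your MySQL version; the flags and placeholders below are assumptions for illustration), dumping a database from a source MySQL server might look like:

    mysqldump --databases [DATABASE_NAME] -h [SOURCE_IP_ADDRESS] -u root -p \
        --hex-blob --skip-triggers --set-gtid-purged=OFF \
        --default-character-set=utf8 > [SQL_FILE].sql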

gcloud (2nd Gen)

  1. Create your SQL dump file, if you haven't already, using the required parameters to ensure that your dump file is usable by Cloud SQL.

    For help with creating a SQL dump file, see Creating a SQL dump file.

  2. Create a Google Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  5. Copy the serviceAccountEmailAddress field.
  6. Add the service account to the bucket ACL as a writer:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]
    
  7. Add the service account to the import file as a reader:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    
  8. Import the database:
    gcloud sql instances import [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] --database [DATABASE_NAME]
    

    For information about using the import command, see the sql instances import command reference page.

    If you get an error like `ERROR_RDBMS`, check the permissions you set on the bucket and file in steps 6 and 7; this error is often caused by insufficient permissions.

  9. Remove the service account from the file and bucket ACLs:
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]
    

    This step is strongly recommended for security.
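
As a consolidated usage sketch of steps 4 through 9, assuming a hypothetical instance named my-instance, bucket my-bucket, dump file dump.sql.gz, and target database my-database; the gcloud --format flag is used here to capture the service account address instead of copying it by hand:

    # Capture the instance's service account address (steps 4 and 5).
    SA="$(gcloud sql instances describe my-instance \
        --format='value(serviceAccountEmailAddress)')"

    # Grant the service account temporary access to the bucket and file (steps 6 and 7).
    gsutil acl ch -u "${SA}:W" gs://my-bucket
    gsutil acl ch -u "${SA}:R" gs://my-bucket/dump.sql.gz

    # Run the import (step 8).
    gcloud sql instances import my-instance gs://my-bucket/dump.sql.gz --database my-database

    # Revoke the temporary access (step 9).
    gsutil acl ch -d "${SA}" gs://my-bucket/dump.sql.gz
    gsutil acl ch -d "${SA}" gs://my-bucket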

gcloud (1st Gen)

  1. Create a Google Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Import the database:
    gcloud sql instances import [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    

    If you get an error like `ERROR_RDBMS`, you might not have the proper permissions to import from that bucket. Check that you have at least `WRITER` permissions on the bucket and `READER` permissions on the import file. For more information on configuring access control in Google Cloud Storage, see Create and Manage Access Control Lists.

    For information about using the import command, see the sql instances import command reference page.
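
If you hit the `ERROR_RDBMS` error described above and want to inspect the ACLs that are currently set, gsutil can print them; a minimal sketch using the same placeholders:

    gsutil acl get gs://[BUCKET_NAME]
    gsutil acl get gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]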

cURL

  1. Create your SQL dump file, if you haven't already, using the required parameters to ensure that your dump file is usable by Cloud SQL.

    For help with creating a SQL dump file, see Creating a SQL dump file.

  2. Create a Google Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the dump file to a Google Cloud Storage bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Provide your instance with write permissions to your bucket and read access to the file.

    For help with setting ACLs, see Setting ACLs.

  5. Import your dump file:
    ACCESS_TOKEN="$(gcloud auth application-default print-access-token)"
    curl --header "Authorization: Bearer ${ACCESS_TOKEN}" \
         --header 'Content-Type: application/json' \
         --data '{"importContext":
                    {"fileType": "SQL",
                     "uri": "gs://[BUCKET_NAME]/[PATH_TO_DUMP_FILE]",
                     "database": "[DATABASE_NAME]" }}' \
       -X POST \
       https://www.googleapis.com/sql/v1beta4/projects/[PROJECT_ID]/instances/[INSTANCE_NAME]/import
    
  6. Remove the ACLs you set in step 4 (see the sketch that follows these steps).

    This step is strongly recommended for security.

For the complete list of parameters for this request, see the instances:import page.
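
Step 6 above does not show the commands for removing the ACLs. If you granted them with gsutil, removing them might look like the following sketch (placeholders as in the request above):

    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]/[PATH_TO_DUMP_FILE]
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]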

Importing CSV files

To import data to a Cloud SQL instance using a CSV file:

  1. Make sure that the CSV file you want to import into Google Cloud SQL meets the requirements described in Overview of Importing and Exporting Data.

    If your file does not meet these requirements, you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file into the database (see the sketch after these steps). This requires using the --local-infile option for the mysql client.

  2. Create a Google Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the CSV file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Go to the Cloud SQL Instances page in the Google Cloud Platform Console.

  5. Select the instance to open its Instance details page.
  6. Click Import in the button bar.
  7. Choose a CSV file from Cloud Storage to import into your Cloud SQL instance.
  8. Under Format, select CSV.
  9. Specify the Database and Table in your Cloud SQL instance where you wish to import the CSV file.

    The database and table must already exist. For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE statement in the mysql client, available using the Cloud Shell.

  10. Click Import to start the import.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.
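
Step 1 mentions LOAD DATA LOCAL INFILE as a fallback for files that do not meet the CSV requirements. A minimal sketch, assuming a hypothetical local file /tmp/data.csv, database guestbook, and table entries, with comma-separated, optionally double-quoted fields (adjust the clauses to match your file's actual format; the connection command assumes you connect over the instance's IP address):

    # Connect with local-infile enabled.
    mysql --local-infile=1 -h [INSTANCE_IP_ADDRESS] -u root -p guestbook

    -- Then, at the mysql> prompt:
    LOAD DATA LOCAL INFILE '/tmp/data.csv'
    INTO TABLE entries
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n';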
