Importing Data into Cloud SQL

This page describes how to import data into Cloud SQL. For best practices for importing data, see Best Practices for Importing and Exporting Data.

Importing SQL dump files to Cloud SQL

Before you begin

  • SQL dump files must have been created with the correct parameters.

    For help with creating SQL dump files ready for import into Cloud SQL, see Exporting Data.

  • The database you are importing into must already exist on your Cloud SQL instance.

    For help with creating a database, see Creating a database.

Importing a SQL dump file

To import a SQL dump file to a Cloud SQL instance:

Console

  1. Create your SQL dump file, if you haven't already, using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  2. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the SQL dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Go to the Cloud SQL Instances page in the Google Cloud Platform Console.

    Go to the Cloud SQL Instances page

  5. Select the instance to open its Instance details page.
  6. Click Import in the button bar.
  7. Enter the path to the bucket and SQL dump file you uploaded, or browse to it: click Browse, double-click the name of the bucket, select the file, and click Select.

    You can use a compressed (.gz) or an uncompressed (.sql) file.

  8. For Format, select SQL.
  9. Select the database you want the data to be imported into.

    This causes Cloud SQL to connect to the selected database before running the import.

  10. If you want to specify a PostgreSQL user to perform the import, select the user.

    If your import file contains statements that must be performed by a specific PostgreSQL user, use this field to specify that user.

  11. Click Import to start the import.

gcloud

  1. Create your SQL dump file, if you haven't already, using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  2. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the SQL dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  5. Copy the serviceAccountEmailAddress field.
  6. Add the service account to the bucket ACL as a writer:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]
    
  7. Add the service account to the import file as a reader:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    
  8. Import the database:
    gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
                                --database=[DATABASE_NAME]
    

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error like `ERROR_RDBMS`, check the bucket and file permissions; this error usually indicates a permissions problem.

  9. If you do not need to retain the permissions provided by the ACLs you set previously, remove them now:
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]
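
    The gcloud steps above can be collected into one variable-driven script. The sketch below uses placeholder values throughout and only prints the commands it would run (remove the `echo` prefixes to execute them); it also assumes gcloud's `--format='value(...)'` projection for reading the service account without copying it by hand:

    ```shell
    #!/bin/sh
    # Sketch of steps 4-8; every value below is a placeholder.
    INSTANCE_NAME="my-instance"
    BUCKET_NAME="my-bucket"
    IMPORT_FILE_NAME="dump.sql"
    DATABASE_NAME="mydb"

    # Steps 4-5: instead of copying serviceAccountEmailAddress from the
    # describe output by hand, it can be read directly (uncomment to use
    # against a real instance):
    # SERVICE_ACCOUNT_ADDRESS=$(gcloud sql instances describe "$INSTANCE_NAME" \
    #     --format='value(serviceAccountEmailAddress)')
    SERVICE_ACCOUNT_ADDRESS="p12345-abc@example.iam.gserviceaccount.com"  # placeholder

    URI="gs://${BUCKET_NAME}/${IMPORT_FILE_NAME}"

    # Steps 6-8, printed rather than executed:
    echo gsutil acl ch -u "${SERVICE_ACCOUNT_ADDRESS}:W" "gs://${BUCKET_NAME}"
    echo gsutil acl ch -u "${SERVICE_ACCOUNT_ADDRESS}:R" "$URI"
    echo gcloud sql import sql "$INSTANCE_NAME" "$URI" --database="$DATABASE_NAME"
    ```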
    

cURL

  1. Create your SQL dump file, if you haven't already, using the instructions in Exporting Data to ensure that your dump file is usable by Cloud SQL.

  2. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  3. Upload the SQL dump file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  4. Provide your instance with write permissions to your bucket, and read access to the file.

    For help with setting ACLs, see Setting ACLs.

  5. Import your dump file:
    ACCESS_TOKEN="$(gcloud auth application-default print-access-token)"
    curl --header "Authorization: Bearer ${ACCESS_TOKEN}" \
         --header 'Content-Type: application/json' \
         --data '{"importContext":
                    {"fileType": "SQL",
                     "uri": "gs://[BUCKET_NAME]/[PATH_TO_DUMP_FILE]",
                     "database": "[DATABASE_NAME]" }}' \
       -X POST \
       https://www.googleapis.com/sql/v1beta4/projects/[PROJECT-ID]/instances/[INSTANCE_NAME]/import
    

    To use a different user for the import, specify the importContext.importUser property.

    For the complete list of parameters for this request, see the instances:import page.
  6. If you do not need to retain the permissions provided by the ACLs you set previously, remove them now.
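
An easy mistake with this request is malformed JSON after substituting your own values. One way to guard against it (a sketch, with placeholder values throughout) is to build the body in a shell variable and validate it locally before sending it with `--data "$BODY"`:

    ```shell
    #!/bin/sh
    # Placeholders for illustration; substitute your own values.
    BUCKET_NAME="my-bucket"
    PATH_TO_DUMP_FILE="dump.sql"
    DATABASE_NAME="mydb"

    # Build the request body with the values substituted.
    BODY=$(cat <<EOF
    {"importContext":
       {"fileType": "SQL",
        "uri": "gs://${BUCKET_NAME}/${PATH_TO_DUMP_FILE}",
        "database": "${DATABASE_NAME}"}}
    EOF
    )

    # Catch malformed JSON locally instead of getting a 400 from the API.
    printf '%s' "$BODY" | python3 -m json.tool >/dev/null && echo "request body OK"
    ```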

Importing CSV files to Cloud SQL

Before you begin

  • The database and table you are importing into must already exist on your Cloud SQL instance. For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE SQL statement in the psql client.

  • Your CSV file must conform to the CSV file format requirements below.

CSV file format requirements

CSV files must have one line for each row of data, with comma-separated fields.
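
As an illustration (the table and data here are hypothetical), the following writes a CSV ready for import into a three-column table and runs a quick field-count sanity check before upload. Fields are loaded positionally, so the field order must match the column order of the target table:

```shell
#!/bin/sh
# Hypothetical example: rows for a table with columns (id, name, signup_date).
# One line per row, comma-separated fields, no header line.
cat > /tmp/entries.csv <<'EOF'
1,alice,2024-01-15
2,bob,2024-02-02
3,carol,2024-03-09
EOF

# Sanity check before uploading: every row must have the same field count.
awk -F',' 'NF != 3 { bad = 1 } END { exit bad }' /tmp/entries.csv \
  && echo "csv OK: consistent field counts"
```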

Importing the CSV file

To import data to a Cloud SQL instance using a CSV file:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Platform Console.

    Go to the Cloud SQL Instances page

  2. Select the instance to open its Instance details page.
  3. Click Import in the button bar.
  4. Choose the CSV file you uploaded for import into your Cloud SQL instance.
  5. Under Format, select CSV.
  6. Specify the Database and Table in your Cloud SQL instance where you wish to import the CSV file.
  7. You can optionally specify a user to use for the import.
  8. Click Import to start the import.

gcloud

  1. Create a bucket for the import:

    gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
    

    This step is not required, but is strongly recommended: a dedicated bucket avoids opening up access to any other data.

  2. Upload the CSV file to the bucket.
  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Add the service account to the bucket ACL as a writer:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]
    
  6. Add the service account to the import file as a reader:
    gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
    
  7. Import the file:
    gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME] --table=[TABLE_NAME]
    

    For information about using the import csv command, see the sql import csv command reference page.

  8. If you do not need to retain the permissions provided by the ACL you set previously, remove the ACL:
    gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]
    

cURL

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the CSV file to your bucket.

    For help with uploading files to buckets, see Uploading an Object.

  3. Provide your instance with read permissions to your bucket.

    For help with setting ACLs, see Setting ACLs.

  4. Import the file:
    ACCESS_TOKEN="$(gcloud auth application-default print-access-token)"
    curl --header "Authorization: Bearer ${ACCESS_TOKEN}" \
         --header 'Content-Type: application/json' \
         --data '{"importContext":
                      {"fileType": "CSV",
                       "uri": "gs://<BUCKET_NAME>/<PATH_TO_DUMP_FILE>",
                       "database": "<DATABASE_NAME>",
                       "csvImportOptions":
                            {"table": "<TABLE_NAME>"}}}' \
       -X POST \
       https://www.googleapis.com/sql/v1beta4/projects/[PROJECT-ID]/instances/[INSTANCE_NAME]/import
    

    To use a different user for the import, specify the importContext.importUser property.

    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the permissions provided by the ACL you set previously, remove the ACL.

If you get an error like ERROR_RDBMS, make sure the table exists. If that is not the problem, confirm that you have the correct permissions on the bucket. For more information about configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
