Importing data into Cloud SQL

This page describes how to import data into Cloud SQL instances. You can import data from SQL dump or CSV files.

For best practices for importing data, see Best Practices for Importing and Exporting Data.

To import data from Cloud Storage, the instance's service account must have the legacyBucketReader IAM role set in the project. For more information, see Cloud Identity and Access Management for Cloud Storage.

Use the same SQL Mode for import and export

The SQL Mode setting affects how Cloud SQL interprets SQL queries. For example, if you export from a database without Strict SQL enabled and then import into Cloud SQL (which enables Strict SQL by default), the import might fail. The best practice is to use the same SQL Mode on import that you used for export. Review the SQL Mode on both the source and target databases for compatibility, paying particular attention to the flags that enable Strict SQL mode. If Strict SQL is not enabled on your source database, you will likely want to disable it in Cloud SQL as well by setting the sql_mode flag to a value that does not include the strict modes. To verify that your Cloud SQL instance has the desired mode set, run SELECT @@GLOBAL.sql_mode;.
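As a sketch, the sql_mode flag can be set with gcloud; the instance name and mode value below are placeholders, not a recommendation. Choose the value that matches the output of SELECT @@GLOBAL.sql_mode; on your source server:

```shell
# Hypothetical instance name and mode value -- substitute your own.
INSTANCE_NAME="my-instance"
SQL_MODE="ALLOW_INVALID_DATES"

# Apply the flag to the Cloud SQL instance. Note that changing
# database flags may restart the instance.
gcloud sql instances patch "$INSTANCE_NAME" \
    --database-flags=sql_mode="$SQL_MODE"
```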

Importing data from a SQL dump file into Cloud SQL

Before you begin

  • This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance's service account or the user must have the Storage Object Admin (storage.objectAdmin) IAM role set. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.

    You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool:

    gsutil iam get gs://[BUCKET_NAME]

    Learn more about using IAM with buckets.

  • Create a SQL dump file. Use the instructions below that apply to your situation. These instructions set certain flags that make the dump file compatible with Cloud SQL.

  • Do not use a system user (such as root@localhost) as the DEFINER for triggers, views, or stored procedures. You won't have access to this user in Cloud SQL. For help with creating a SQL dump file that can be imported into Cloud SQL, see Creating a SQL dump file.
  • The import operation attempts to execute the SQL commands in the SQL dump file, such as deleting and recreating tables, entries, and indexes. For help with creating a database, see Creating a database.
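As a hedged sketch, a dump created with mysqldump along the following lines is generally compatible with Cloud SQL. The host, user, and database names are placeholders; see Creating a SQL dump file for the authoritative flag list:

```shell
# Placeholders -- adjust host, user, and database name to your environment.
# --hex-blob protects binary columns, and --single-transaction plus
# --set-gtid-purged=OFF are commonly needed for a Cloud SQL-compatible dump.
mysqldump --databases example_db \
    --host=localhost --user=root --password \
    --hex-blob --single-transaction --set-gtid-purged=OFF \
    --default-character-set=utf8mb4 > example_db.sql
```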

Importing data from a SQL dump file in Cloud Storage

To import data from a SQL dump file to a Cloud SQL instance:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Select the instance to open its Overview page.
  3. Click Import in the button bar.
  4. Under Choose the file you'd like to import data from, enter the path to the bucket and SQL dump file to use for the import. Or to browse to the file:
    1. Click Browse.
    2. Under Location, double-click the name of the bucket in the list.
    3. Select the file in the list.
    4. Click Select.

    You can import a compressed (.gz) or an uncompressed (.sql) file.

  5. For Format, select SQL.
  6. Select the database you want the data to be imported into.

    This causes Cloud SQL to run the USE DATABASE statement before the import.

  7. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
      gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin \
      gs://[BUCKET-NAME]
      
    For help with setting IAM permissions, see Using IAM permissions.
  6. Import the database:
    gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
                                --database=[DATABASE_NAME]
    

    For information about using the import sql command, see the sql import sql command reference page.

    If the command returns an error such as `ERROR_RDBMS`, check the permissions on the bucket and for the instance's service account; this error is often caused by a permissions issue.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
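Steps 3 through 7 above can be sketched as a single script. The instance, bucket, file, and database names are placeholders:

```shell
# Placeholders -- substitute your own instance, bucket, file, and database.
INSTANCE_NAME="my-instance"
BUCKET_NAME="my-bucket"
IMPORT_FILE="dump.sql.gz"
DATABASE_NAME="example_db"

# Look up the instance's service account.
SERVICE_ACCOUNT=$(gcloud sql instances describe "$INSTANCE_NAME" \
    --format='value(serviceAccountEmailAddress)')

# Grant the service account temporary access to the bucket.
gsutil iam ch "serviceAccount:${SERVICE_ACCOUNT}:objectAdmin" "gs://${BUCKET_NAME}"

# Run the import.
gcloud sql import sql "$INSTANCE_NAME" "gs://${BUCKET_NAME}/${IMPORT_FILE}" \
    --database="$DATABASE_NAME"

# Remove the temporary grant once the import finishes.
gsutil iam ch -d "serviceAccount:${SERVICE_ACCOUNT}:objectAdmin" "gs://${BUCKET_NAME}"
```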

REST v1beta4

    Before you begin, create a SQL dump file that is compatible with Cloud SQL. For help, see Creating a SQL dump file.

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import your dump file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_sql_file: The path to the SQL file
    • database_name: The name of a database inside the Cloud SQL instance

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
     "importContext":
       {
          "fileType": "SQL",
          "uri": "gs://bucket_name/path_to_sql_file",
          "database": "database_name"
        }
    }
    
    


    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove them now.
To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.
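As one way to send the request in step 4, the following curl sketch writes the request body to a file and posts it with an OAuth token from gcloud. All lowercase names are the placeholders listed in step 4:

```shell
# Write the request body shown in step 4 to a file; bucket_name,
# path_to_sql_file, and database_name are placeholders.
cat > request.json <<'EOF'
{
  "importContext": {
    "fileType": "SQL",
    "uri": "gs://bucket_name/path_to_sql_file",
    "database": "database_name"
  }
}
EOF

# POST it to the import endpoint; project-id and instance-id are
# placeholders.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    -d @request.json \
    "https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"
```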

Importing data from CSV files into Cloud SQL

Before you begin

  • This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance's service account or the user must have the Storage Object Admin (storage.objectAdmin) IAM role set. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.

    You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool:

    gsutil iam get gs://[BUCKET_NAME]

    Learn more about using IAM with buckets.

  • The database and table you are importing into must already exist on your Cloud SQL instance. For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE SQL statement in the mysql client.
  • Your CSV file must conform to the CSV file format requirements below.

    If your file does not meet these requirements, you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file to the database. This requires using the --local-infile option for the mysql client.

  • If you are importing data that you exported from your on-premises MySQL server:
    1. Create a bucket in Cloud Storage using the instructions in Creating storage buckets.
    2. Upload the SQL dump file to the Cloud Storage bucket using the procedure in Uploading objects.
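As a hedged sketch of the LOAD DATA LOCAL INFILE alternative mentioned above, an invocation with the mysql client might look like this. The host, database, table, and file names are placeholders:

```shell
# Placeholders -- adjust to your environment. --local-infile must be
# enabled on both the client and the server.
CSV_FILE="data.csv"
TABLE="example_db.example_table"

mysql --host=INSTANCE_IP --user=root --password --local-infile=1 \
    --execute="LOAD DATA LOCAL INFILE '${CSV_FILE}'
               INTO TABLE ${TABLE}
               FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
               LINES TERMINATED BY '\n';"
```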

CSV file format requirements

CSV files must have one line for each row of data and use comma-separated fields.

The format of CSV files that Cloud SQL accepts is equivalent to using the following MySQL statements:

  LOAD DATA INFILE ... CHARACTER SET 'utf8mb4'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"'

To create a CSV in this format, when you export your data from an existing database, make sure your export uses the same options or delimiters as those used by Cloud SQL for import. See Creating a CSV file for an example.
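For instance (the contents below are made up), a small file in the accepted format looks like this; because the escape character is the double quote, embedded quotes are escaped by doubling them:

```shell
# A tiny example file in the accepted format: comma-separated fields,
# optionally enclosed in double quotes, with embedded quotes doubled.
cat > example.csv <<'EOF'
1,"Alice","Regular value"
2,"Bob ""The Builder""","Value, with comma"
EOF
```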

Importing data from a CSV file in Cloud Storage

To import data to a Cloud SQL instance using a CSV file:

Console

  1. Go to the Cloud SQL Instances page in the Google Cloud Console.

    Go to the Cloud SQL Instances page

  2. Select the instance to open its Overview page.
  3. Click Import in the button bar.
  4. Under Choose the file you'd like to import data from, enter the path to the bucket and CSV file to use for the import. Or to browse to the file:
    1. Click Browse.
    2. Under Location, double-click the name of the bucket in the list.
    3. Select the file in the list.
    4. Click Select.

    You can import a compressed (.gz) or an uncompressed (.csv) file.

  5. Under Format, select CSV.
  6. Specify the Database and Table in your Cloud SQL instance where you want to import the CSV file.
  7. You can optionally specify a user to use for the import.
  8. Click Import to start the import.

gcloud

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Describe the instance you are importing to:
    gcloud sql instances describe [INSTANCE_NAME]
    
  4. Copy the serviceAccountEmailAddress field.
  5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
  6. Import the file:
    gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
                                --database=[DATABASE_NAME] --table=[TABLE_NAME]
    

    For information about using the import csv command, see the sql import csv command reference page.

  7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
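To watch a long-running import, you can list the instance's recent operations; the instance name below is a placeholder:

```shell
# Placeholder instance name -- substitute your own.
INSTANCE_NAME="my-instance"

# Show recent operations, including any running import, for the instance.
gcloud sql operations list --instance="$INSTANCE_NAME" --limit=5
```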

REST v1beta4

  1. Create a Cloud Storage bucket, if you haven't already.

    For help with creating a bucket, see Creating Storage Buckets.

  2. Upload the file to your bucket.

    For help with uploading files to buckets, see Uploading objects.

  3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
  4. Import the file:

    Before using any of the request data below, make the following replacements:

    • project-id: The project ID
    • instance-id: The instance ID
    • bucket_name: The Cloud Storage bucket name
    • path_to_csv_file: The path to the CSV file
    • database_name: The name of a database inside the Cloud SQL instance
    • table_name: The name of the database table

    HTTP method and URL:

    POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

    Request JSON body:

    {
     "importContext":
       {
          "fileType": "CSV",
          "uri": "gs://bucket_name/path_to_csv_file",
          "database": "database_name",
          "csvImportOptions":
           {
             "table": "table_name"
           }
       }
    }
    
    


    For the complete list of parameters for this request, see the instances:import page.
  5. If you do not need to retain the IAM permissions you set previously, remove the permissions.

If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.

To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

What's next