Importing data into Cloud SQL

This page describes how to import data into Cloud SQL instances. You can import data from SQL dump or CSV files.

For best practices for importing data, see Best Practices for Importing and Exporting Data.

After completing an import operation, verify the results.

Before you begin

Required roles and permissions

This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the service account or user must have one of the following sets of roles:

  • The Cloud SQL Admin role and the roles/storage.legacyObjectReader IAM role
  • A custom role including the following permissions:
    • cloudsql.instances.get
    • cloudsql.instances.import
    • storage.buckets.get
    • storage.objects.get

If the account or user is also performing export operations, it might be more convenient to grant the Storage Object Admin IAM role to have all the required permissions for both import and export. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.
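For example, the grants might be made as follows (a sketch; PROJECT_ID, EMAIL, and BUCKET_NAME are placeholders, and gsutil also accepts shorthand role names):

  # Grant the Cloud SQL Admin role at the project level.
  gcloud projects add-iam-policy-binding PROJECT_ID \
      --member=user:EMAIL --role=roles/cloudsql.admin
  # Grant read access to objects in the import bucket.
  gsutil iam ch user:EMAIL:roles/storage.legacyObjectReader gs://BUCKET_NAME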

Importing data from a SQL dump file into Cloud SQL

  • Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

  • The database you are importing into must already exist on your Cloud SQL instance. For help with creating a database, see Creating a database.
  • The import operation attempts to execute the SQL commands in the SQL dump file, such as deleting and recreating tables, entries, and indexes.
  • Do not use a system user (such as root@localhost) as the DEFINER for triggers, views, or stored procedures. You won't have access to this user in Cloud SQL.
  • If your SQL dump file contains DEFINER clauses (views, triggers, or stored procedures), then depending on the order in which these statements are executed, the import might fail. Learn more about DEFINER usage and potential workarounds in Cloud SQL.
  • For help with creating a SQL dump file that can be imported into Cloud SQL, see Creating a SQL dump file; a rough sketch follows this list.
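As a rough sketch only (the authoritative flag list is in Creating a SQL dump file), a Cloud SQL-compatible dump of a single database might be created like this; SOURCE_HOST, USERNAME, and DATABASE_NAME are placeholders:

  # Dump one database with flags commonly used for Cloud SQL compatibility.
  mysqldump --databases DATABASE_NAME -h SOURCE_HOST -u USERNAME -p \
      --hex-blob --single-transaction --set-gtid-purged=OFF \
      --default-character-set=utf8mb4 > dump.sql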
To import data from a SQL dump file to a Cloud SQL instance:

    Console

    1. In the Google Cloud Console, go to the Cloud SQL Instances page.

      Go to Cloud SQL Instances

    2. Click the instance name to open its Overview page.
    3. Click Import.
    4. In the Choose the file you'd like to import data from section, enter the path to the bucket and SQL dump file to use for the import. Or to browse to the file:
      1. Click Browse.
      2. In the Location section, double-click the name of the bucket in the list.
      3. Select the file in the list.
      4. Click Select.

      You can import a compressed (.gz) or an uncompressed (.sql) file.

    5. For Format, select SQL.
    6. Select the database you want the data to be imported into.

      This causes Cloud SQL to run the USE DATABASE statement before the import.

    7. If you want to specify a MySQL user to perform the import, select the user.

      If your import file contains statements that must be performed by a specific MySQL user, use this field to specify that user.

    8. Click Import to start the import.

    gcloud

    1. Create a Cloud Storage bucket, if you haven't already.

      For help with creating a bucket, see Creating storage buckets.

    2. Upload the file to your bucket.

      For help with uploading files to buckets, see Uploading objects.

    3. Describe the instance you are importing to:
      gcloud sql instances describe INSTANCE_NAME
      
    4. Copy the serviceAccountEmailAddress field.
    5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
        gsutil iam ch serviceAccount:SERVICE_ACCOUNT:objectAdmin \
        gs://BUCKET_NAME
        
      For help with setting IAM permissions, see Using IAM permissions.
    6. Import the database:
      gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
      --database=DATABASE_NAME
      

      For information about using the import sql command, see the sql import sql command reference page.

      If the command returns an error like ERROR_RDBMS, review the permissions; this error is often due to permissions issues.

    7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
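    Putting steps 3 through 7 together, a minimal end-to-end sketch might look like the following (myinstance, mybucket, dump.sql.gz, and mydb are hypothetical names):

      # Look up the instance's service account, grant it temporary access to
      # the bucket, run the import, then remove the grant.
      SA=$(gcloud sql instances describe myinstance \
          --format='value(serviceAccountEmailAddress)')
      gsutil iam ch serviceAccount:${SA}:objectAdmin gs://mybucket
      gcloud sql import sql myinstance gs://mybucket/dump.sql.gz --database=mydb
      gsutil iam ch -d serviceAccount:${SA}:objectAdmin gs://mybucket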

    REST v1beta4

      Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.

    1. Create a Cloud Storage bucket, if you haven't already.

      For help with creating a bucket, see Creating storage buckets.

    2. Upload the file to your bucket.

      For help with uploading files to buckets, see Uploading objects.

    3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
    4. Import your dump file:

      Before using any of the request data below, make the following replacements:

      • project-id: The project ID
      • instance-id: The instance ID
      • bucket_name: The Cloud Storage bucket name
      • path_to_sql_file: The path to the SQL file
      • database_name: The name of a database inside the Cloud SQL instance

      HTTP method and URL:

      POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

      Request JSON body:

      {
        "importContext":
          {
            "fileType": "SQL",
            "uri": "gs://bucket_name/path_to_sql_file",
            "database": "database_name"
          }
      }
      
      

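      One way to send the request is with curl; the following is a minimal sketch that assumes the gcloud CLI is installed and authenticated, and that you substitute the placeholders listed above:

      # Send the import request with an OAuth token from gcloud.
      curl -X POST \
          -H "Authorization: Bearer $(gcloud auth print-access-token)" \
          -H "Content-Type: application/json" \
          -d '{"importContext": {"fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name"}}' \
          "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"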

      For the complete list of parameters for this request, see the instances:import page.
    5. If you do not need to retain the IAM permissions you set previously, remove them now.
    To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

Importing data from a CSV file into Cloud SQL

  • The database and table you are importing into must already exist on your Cloud SQL instance. For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE SQL statement in the mysql client; a sketch follows this list.
  • Your CSV file must conform to the CSV file format requirements below.

    If your file does not meet these requirements, you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file to the database. This requires using the --local-infile option for the mysql client.

  • If you are importing data that you exported from an on-premises MySQL server:
    1. Create a bucket in Cloud Storage using the instructions in Creating storage buckets.
    2. Upload the CSV file to the Cloud Storage bucket using the procedure in Uploading objects.
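As a sketch, creating a table through the mysql client might look like this (the table and column definitions are hypothetical; match them to your CSV columns):

  # Connect to the instance and create a three-column table.
  # INSTANCE_IP, USERNAME, and DATABASE_NAME are placeholders.
  mysql --host=INSTANCE_IP --user=USERNAME -p DATABASE_NAME \
      -e "CREATE TABLE entries (id INT PRIMARY KEY, name VARCHAR(64), note TEXT);"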

CSV file format requirements

CSV files must have one line for each row of data and use comma-separated fields. The format of CSV files that Cloud SQL accepts is equivalent to using the following MySQL statements:

  LOAD DATA INFILE ... CHARACTER SET 'utf8mb4'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'

To create a CSV file in this format, when you export your data from an existing database, make sure your export uses the same options or delimiters as those used by Cloud SQL for import. See Creating a CSV file for an example.
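For instance, a hypothetical three-row file that satisfies these rules (one line per row, comma-separated fields, fields containing commas enclosed in double quotes, and embedded quotes escaped by doubling them) could look like:

  1,alice,"likes ""quoted"" text, and commas"
  2,bob,plain field
  3,carol,"line three"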

To import data to a Cloud SQL instance using a CSV file:

    Console

    1. In the Google Cloud Console, go to the Cloud SQL Instances page.

      Go to Cloud SQL Instances

    2. Click the instance name to open its Overview page.
    3. Click Import.
    4. In the Choose the file you'd like to import data from section, enter the path to the bucket and CSV file to use for the import. Or to browse to the file:
      1. Click Browse.
      2. In the Location section, double-click the name of the bucket in the list.
      3. Select the file in the list.
      4. Click Select.

      You can import a compressed (.gz) or an uncompressed (.csv) file.

    5. In the Format section, select CSV.
    6. Specify the Database and Table in your Cloud SQL instance where you want to import the CSV file.
    7. Click Import to start the import.

    gcloud

    1. Create a Cloud Storage bucket, if you haven't already.

      For help with creating a bucket, see Creating storage buckets.

    2. Upload the file to your bucket.

      For help with uploading files to buckets, see Uploading objects.

    3. Describe the instance you are importing to:
      gcloud sql instances describe INSTANCE_NAME
      
    4. Copy the serviceAccountEmailAddress field.
    5. Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
    6. Import the file:
      gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
      --database=DATABASE_NAME \
      --table=TABLE_NAME
      

      For information about using the import csv command, see the sql import csv command reference page.

    7. If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
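    A filled-in example of the import command, using hypothetical names (myinstance, mybucket, entries.csv, mydb, entries):

      gcloud sql import csv myinstance gs://mybucket/entries.csv \
          --database=mydb --table=entries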

    REST v1beta4

    1. Create a Cloud Storage bucket, if you haven't already.

      For help with creating a bucket, see Creating storage buckets.

    2. Upload the file to your bucket.

      For help with uploading files to buckets, see Uploading objects.

    3. Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
    4. Import the file:

      Before using any of the request data below, make the following replacements:

      • project-id: The project ID
      • instance-id: The instance ID
      • bucket_name: The Cloud Storage bucket name
      • path_to_csv_file: The path to the CSV file
      • database_name: The name of a database inside the Cloud SQL instance
      • table_name: The name of the database table

      HTTP method and URL:

      POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import

      Request JSON body:

      {
        "importContext":
          {
            "fileType": "CSV",
            "uri": "gs://bucket_name/path_to_csv_file",
            "database": "database_name",
            "csvImportOptions":
              {
                "table": "table_name"
              }
          }
      }
      
      

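      As with the SQL import, one way to send the request is with curl (a minimal sketch assuming an authenticated gcloud CLI and the placeholders listed above):

      # Send the CSV import request with an OAuth token from gcloud.
      curl -X POST \
          -H "Authorization: Bearer $(gcloud auth print-access-token)" \
          -H "Content-Type: application/json" \
          -d '{"importContext": {"fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": {"table": "table_name"}}}' \
          "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"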

      For the complete list of parameters for this request, see the instances:import page.
    5. If you do not need to retain the IAM permissions you set previously, remove the permissions.

    If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.

    To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.

Troubleshooting

The following problems are covered in detail in the sections below:

  • Import is taking too long. Too many active connections can interfere with import operations. Close unused connections, or restart the Cloud SQL instance before beginning an import operation.
  • Import fails. The exported file may contain database users who do not yet exist. Clean up the failed database and create the database users before retrying the import.
  • Error during import: table doesn't exist. A table that the import depends on doesn't exist yet. Disable FOREIGN_KEY_CHECKS at the beginning of the import.
  • Error message: ERROR 1045 (28000) at line {line_number}: Access denied for user 'cloudsqlimport'@'localhost'. There is a DEFINER in the dump file that doesn't exist in the database. See the section below for DEFINER usage and potential workarounds in Cloud SQL.
  • Error message: Unknown table 'COLUMN_STATISTICS' in information_schema. This happens if you use the mysqldump binary from MySQL 8.0 to dump data from a MySQL 5.7 database and import to a MySQL 8.0 database. Use the mysqldump binary from MySQL 5.7, or add the --column-statistics=0 flag when using the MySQL 8.0 binary.

Import is taking too long

Import is taking too long, blocking other operations.

The issue might be

Too many active connections can interfere with import operations. Connections consume CPU and memory, limiting the resources available.

Things to try

Close unused connections. Check CPU and memory usage to make sure there are plenty of resources available. The best way to ensure maximum resources for the import operation is to restart the instance before beginning the operation. A restart:

  • Closes all connections.
  • Ends any tasks that may be consuming resources.
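For example, you can restart the instance with gcloud (INSTANCE_NAME is a placeholder):

  gcloud sql instances restart INSTANCE_NAME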


Import fails

Import fails when one or more users referenced in the exported SQL dump file do not exist.

The issue might be

Before importing a SQL dump, all the database users who own objects or were granted permissions on objects in the dumped database must exist. If they do not, the restore fails to recreate the objects with the original ownership or permissions.

Things to try

Clean up the failed database before retrying the import. Create the database users before importing the SQL dump.
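For example, a missing user might be recreated through the mysql client before the retry (a sketch; the user name, host, and password are placeholders):

  # Recreate a database user referenced by the dump file.
  mysql --host=INSTANCE_IP --user=root -p \
      -e "CREATE USER 'app_user'@'%' IDENTIFIED BY 'PASSWORD';"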


Error during import: table doesn't exist

An import operation fails with an error that a table doesn't exist.

The issue might be

Tables can have foreign key dependencies on other tables, and depending on the order of operations, one or more of those tables might not yet exist during the import operation.

Things to try

Add the following line at the start of the dump file:

  SET FOREIGN_KEY_CHECKS=0;

Additionally, add this line at the end of the dump file:

  SET FOREIGN_KEY_CHECKS=1;

These settings deactivate data integrity checks while the import operation is in progress and reactivate them after the data is loaded. This doesn't affect the integrity of the data in the database, because the data was already validated during the creation of the dump file.
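If you prefer not to edit the file by hand, a sketch using GNU sed can wrap an existing dump file (dump.sql is a placeholder filename):

  # Prepend the disable statement and append the re-enable statement.
  sed -i '1i SET FOREIGN_KEY_CHECKS=0;' dump.sql
  echo 'SET FOREIGN_KEY_CHECKS=1;' >> dump.sql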


Error message: ERROR 1045 (28000) at line {line_number}: Access denied for user 'cloudsqlimport'@'localhost'

You see the error ERROR 1045 (28000) at line {line_number}: Access denied for user 'cloudsqlimport'@'localhost'.

The issue might be

A user in the dump file with the DEFINER clause does not exist on the database, and that user is cross-referenced in the object definitions in the database.

Things to try

Refer to the documentation on importing a database with DEFINER clauses in the dump file. You may first need to create one or more users in the database.
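One commonly used workaround is to strip the DEFINER clauses from the dump file before importing; this sketch uses GNU sed, and you should verify the edited dump before using it:

  # Remove DEFINER=`user`@`host` clauses from the dump file.
  sed -i 's/DEFINER=`[^`]*`@`[^`]*`//g' dump.sql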


Error message: Unknown table 'COLUMN_STATISTICS' in information_schema

You see the error message Unknown table 'COLUMN_STATISTICS' in information_schema.

The issue might be

This happens if you use the mysqldump binary from MySQL 8.0 to dump data from a MySQL 5.7 database and import to a MySQL 8.0 database.

Things to try

If you dump data from a MySQL 5.7 database and import to a MySQL 8.0 database, make sure to use the mysqldump binary from MySQL 5.7. If you use the mysqldump binary from MySQL 8.0, add the --column-statistics=0 flag.
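For example, with the MySQL 8.0 mysqldump binary dumping a 5.7 database (host and names are placeholders):

  mysqldump --column-statistics=0 --databases DATABASE_NAME \
      -h SOURCE_HOST -u USERNAME -p > dump.sql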

What's next