This page describes how to import data into Cloud SQL instances. You can import data from SQL dump or CSV files.
For best practices for importing data, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
Importing data from a SQL dump file into Cloud SQL
Before you begin
- Make sure you have configured the required roles and permissions.
- Create a SQL dump file. Use the instructions from the bullet below that applies to your situation. These instructions set certain flags that make the dump file compatible with Cloud SQL. (A minimal pg_dump sketch follows this list.)
- If you are importing data from an on-premises PostgreSQL server:
- Create a SQL dump file using the instructions in Exporting data using pg_dump.
- Create a bucket in Cloud Storage using the instructions in Creating storage buckets.
- Upload the SQL dump file to the Cloud Storage bucket using the procedure in Uploading objects.
- If you are importing data from another Cloud SQL instance, see the instructions in Exporting data from Cloud SQL to a SQL dump file.
- Before importing a SQL dump, ensure that all the users who own objects or were granted permissions on objects in the dumped database exist in the new database.
If they do not, the import will fail to recreate the objects with the original ownership and/or permissions. (Sometimes this is what you want, but usually it
is not.)
For help creating users, see Creating users.
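For reference, a minimal pg_dump sketch for an on-premises server, assuming a plain-text dump and placeholder host, user, and database names; the exact flags are described in Exporting data using pg_dump:

# Create a plain-text dump without ownership or privilege statements,
# which may reference roles that do not exist on the Cloud SQL instance.
pg_dump -h [ON_PREM_HOST] -U [USERNAME] --format=plain --no-owner --no-acl \
    [DATABASE_NAME] > [SQL_FILE].sql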
Importing data from a SQL dump file in Cloud Storage
To import data from a SQL dump file to a Cloud SQL instance:
Console
- Go to the Cloud SQL Instances page in the Google Cloud Console.
- Select the instance to open its Overview page.
- Click Import in the button bar.
- Under Choose the file you'd like to import data from, enter the path
to the bucket and SQL dump file to use for the import. Or to browse to the
file:
- Click Browse.
- Under Location, double-click the name of the bucket in the list.
- Select the file in the list.
- Click Select.
You can import a compressed (.gz) or an uncompressed (.sql) file.
- For Format, select SQL.
- Select the database you want the data to be imported into.
This causes Cloud SQL to run the USE DATABASE statement before the import.
- If you want to specify a PostgreSQL user to perform the import, select the user.
If your import file contains statements that must be performed by a specific PostgreSQL user, use this field to specify that user.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket, if you haven't already.
For help with creating a bucket, see Creating Storage Buckets.
- Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Describe the instance you are importing to:
gcloud sql instances describe [INSTANCE_NAME]
- Copy the serviceAccountEmailAddress field.
- Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket:
gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET-NAME]
For help with setting IAM permissions, see Using IAM permissions.
- Import the database:
gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
    --database=[DATABASE_NAME]
For information about using the import sql command, see the sql import sql command reference page.
If the command returns an error such as ERROR_RDBMS, verify that the required permissions are configured; this error is often caused by a permissions issue.
- If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
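For example, a hedged sketch of removing the binding added earlier; the -d flag deletes an IAM binding:

gsutil iam ch -d serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET-NAME]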
REST v1beta4
- If you are importing data from an on-premises PostgreSQL server:
- Create a SQL dump file using the instructions in Exporting data using pg_dump.
- Create a bucket in Cloud Storage using the instructions in Creating storage buckets.
- Upload the SQL dump file to the Cloud Storage bucket using the procedure in Uploading objects.
- If you are importing data from another Cloud SQL instance, see the instructions in Exporting data from Cloud SQL to a SQL dump file.
- Create a Cloud Storage bucket, if you haven't already.
For help with creating a bucket, see Creating Storage Buckets.
- Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket.
For help with setting IAM permissions, see Using IAM permissions.
- Import your dump file:
Before using any of the request data below, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_sql_file: The path to the SQL file
- database_name: The name of a database inside the Cloud SQL instance
HTTP method and URL:
POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" } }
The request returns a JSON response describing the import operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for this request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove them now.
Importing data from CSV files into Cloud SQL
Before you begin
- Make sure you have configured the required roles and permissions.
- The database and table you are importing into must already exist on your Cloud SQL instance. For help with creating a database, see Creating a database.
To create a table in the database, use the CREATE TABLE SQL statement in the psql client (a minimal example follows this list).
- Your CSV file must conform to the CSV file format requirements below.
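A minimal sketch of creating a table with psql, assuming a hypothetical entries table with two text columns; adjust the connection details and schema to your data:

psql -h [INSTANCE_IP] -U [USERNAME] -d [DATABASE_NAME] \
    -c "CREATE TABLE entries (guest_name VARCHAR(255), content VARCHAR(255));"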
CSV file format requirements
CSV files must have one line for each row of data and use comma-separated fields.
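For example, a CSV file for the hypothetical two-column entries table sketched above could look like this (example values only):

"Alice","first entry"
"Bob","second entry"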
To see instructions for exporting to a CSV file formatted for Cloud SQL, see Exporting data from Cloud SQL to a CSV file.
Importing data from a CSV file in Cloud Storage
To import data to a Cloud SQL instance using a CSV file:
Console
- Go to the Cloud SQL Instances page in the Google Cloud Console.
- Select the instance to open its Overview page.
- Click Import in the button bar.
- Under Choose the file you'd like to import data from, enter the
path to the bucket and CSV file to use for the import. Or to browse to the
file:
- Click Browse.
- Under Location, double-click the name of the bucket in the list.
- Select the file in the list.
- Click Select.
You can import a compressed (.gz) or an uncompressed (.csv) file.
- Under Format, select CSV.
- Specify the Database and Table in your Cloud SQL instance where you want to import the CSV file.
- You can optionally specify a user to use for the import.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket, if you haven't already.
For help with creating a bucket, see Creating Storage Buckets.
- Upload the CSV file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Describe the instance you are importing to:
gcloud sql instances describe [INSTANCE_NAME]
- Copy the serviceAccountEmailAddress field.
- Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket (the grant and import commands are sketched after this list). For help with setting IAM permissions, see Using IAM permissions.
- Import the file:
gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
    --database=[DATABASE_NAME] --table=[TABLE_NAME]
For information about using the import csv command, see the sql import csv command reference page.
- If you do not need to retain the IAM permissions you set previously, remove them using gsutil iam.
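A consolidated sketch of the grant and import steps above, assuming [SERVICE-ACCOUNT] is the value copied from the describe output and the other values are placeholders:

# Grant the instance's service account access to the bucket.
gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET_NAME]

# Import the CSV file into the target database and table.
gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
    --database=[DATABASE_NAME] --table=[TABLE_NAME]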
REST v1beta4
- Create a Cloud Storage bucket, if you haven't already.
For help with creating a bucket, see Creating Storage Buckets.
- Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket.
For help with setting IAM permissions, see Using IAM permissions.
- Import data from the file:
Before using any of the request data below, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_csv_file: The path to the CSV file
- database_name: The name of a database inside the Cloud SQL instance
- table_name: The name of the database table
HTTP method and URL:
POST https://www.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": { "table": "table_name" } } }
The request returns a JSON response describing the import operation.
To use a different user for the import, specify the importContext.importUser property.
For the complete list of parameters for this request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove the permissions.
If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
Importing data into Cloud SQL using pg_restore
You can use the pg_restore utility to import a database into Cloud SQL. pg_restore only works with archive files created by pg_dump.
Learn more about pg_restore.
If the dump file was created in plain-text format, pg_restore cannot read it; import it with the psql client instead:
psql -h CLOUD_SQL_INSTANCE_IP -U USERNAME -d DATABASE_NAME -f SQL_FILE.sql
If the dump file was created with custom format, run the following command:
pg_restore --list DATABASE_NAME.dmp | sed -E 's/(.* EXTENSION )/; \1/g' > DATABASE_NAME.toc
The sed post-processing comments out all extension statements in the generated table of contents (.toc) file, so that pg_restore skips them during the import.
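The listing command only prepares the table of contents; a sketch of the follow-up restore that consumes it, assuming the archive is named DATABASE_NAME.dmp:

# Restore from the custom-format archive, skipping the commented-out
# extension entries in the .toc file.
pg_restore -h CLOUD_SQL_INSTANCE_IP -U USERNAME --format=custom --no-owner --no-acl \
    --use-list=DATABASE_NAME.toc -d DATABASE_NAME DATABASE_NAME.dmp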
To import in parallel, use the -j NUM_CORES flag. NUM_CORES is the number of cores on the source instance. Use the same flag with pg_dump to export in parallel.
Required roles and permissions
This procedure requires you to import a file from Cloud Storage. To import data from Cloud Storage, the Cloud SQL instance's service account or the user must have the Cloud SQL Admin role, or a custom role that includes the cloudsql.instances.import permission, and the roles/storage.legacyObjectReader IAM role (which has the storage.objects.get permission).
If the account or user is also performing export operations, grant the Storage Object Admin (storage.objectAdmin) IAM role. For help with IAM roles, see Cloud Identity and Access Management for Cloud Storage.
You can find the instance's service account name in the Google Cloud Console on your instance's Overview page. You can verify the roles for your Cloud Storage bucket by using the gsutil tool:
gsutil iam get gs://[BUCKET_NAME]
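If the reader role is missing, a hedged sketch of granting it to the instance's service account with gsutil, using the fully qualified role name:

gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:roles/storage.legacyObjectReader gs://[BUCKET_NAME]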
Learn more about using IAM with buckets.
Troubleshooting
Click the links in the table for details:
| For this problem... | The issue might be... | Try this... |
|---|---|---|
| Import is taking too long. | Too many active connections can interfere with import operations. | Close unused connections, or restart the Cloud SQL instance before beginning an import operation. |
| Import fails. | Exported file may contain database users who do not yet exist. | Clean up the failed database before retrying the import. Create the database users before doing the import. |
Import is taking too long
Import is taking too long, blocking other operations.
The issue might be
Too many active connections can interfere with import operations. Connections consume CPU and memory, limiting the resources available.
Things to try
Close unused connections. Check CPU and memory usage to make sure there are plenty of resources available. The best way to ensure maximum resources for the import operation is to restart the instance before beginning the operation (a restart command is sketched after the list below). A restart:
- Closes all connections.
- Ends any tasks that may be consuming resources.
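For example, assuming the gcloud CLI is installed and [INSTANCE_NAME] is your instance, you can restart it before starting the import:

gcloud sql instances restart [INSTANCE_NAME]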
Import fails
Import fails when one or more users referenced in the exported SQL dump file does not exist.
The issue might be
Before importing a SQL dump, all the database users who own objects or were granted permissions on objects in the dumped database must exist. If they do not, the restore fails to recreate the objects with the original ownership and/or permissions.
Things to try
Clean up the failed database before retrying the import. Create the database users before importing the SQL dump.
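One way to create the missing users before retrying, sketched with the gcloud CLI (user name and password are placeholders; see Creating users for details):

gcloud sql users create [USER_NAME] --instance=[INSTANCE_NAME] --password=[PASSWORD]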
What's next
- Learn how to check the status of import and export operations.
- Learn more about importing and exporting data.
- Learn more about Cloud Storage.