This page describes how to export data from and import data into Cloud SQL instances using CSV files.
Before you begin
Before you begin an export or import operation:
- Ensure that your database has adequate free space.
- Verify that the CSV file has the expected data and that it's in the correct format. CSV files must have one line for each row of data and use comma-separated fields.
- Follow the best practices for exporting and importing data.
Export and import operations use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
Export data from Cloud SQL for MySQL
Required roles and permissions for exporting from Cloud SQL for MySQL
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have one of the following roles:
- The Cloud SQL Editor role
- A custom role that includes the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin Identity and Access Management (IAM) role
- A custom role that includes the following permissions:
  - storage.objects.create
  - storage.objects.list (for exporting files in parallel only)
  - storage.objects.delete (for exporting files in parallel only)
For help with IAM roles, see Identity and Access Management.
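For example, you could create such a custom export role with gcloud (a sketch; the role ID sqlCsvExporter and its title are hypothetical):

gcloud iam roles create sqlCsvExporter --project=PROJECT_ID \
  --title="Cloud SQL CSV Exporter" \
  --permissions=cloudsql.instances.get,cloudsql.instances.export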
Export data to a CSV file from Cloud SQL for MySQL
You can export your data in CSV format, which is usable by other tools and environments. Exports happen at the database level. During a CSV export, you can specify the schemas to export. All schemas at the database level are eligible for export.
To export data from a database on a Cloud SQL instance to a CSV file in a Cloud Storage bucket:
Console
- In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Export.
- Select Offload export to allow other operations to occur while the export is ongoing.
- Click Show advanced options.
- In the Database section, select the name of the database from the drop-down menu.
- For SQL query, enter a SQL query to specify the table to export data from.
  For example, to export the entire contents of the entries table in the guestbook database, enter:
  SELECT * FROM guestbook.entries;
  Your query must specify a table in the specified database. You can't export an entire database in CSV format.
- Click Export to start the export.
- The Export database? box opens with a message that the export process can take an hour or more for large databases. During the export, the only operation you can perform on the instance is viewing information. After the export starts, you can cancel the operation. If this is a good time to start an export, click Export. Otherwise, click Cancel.
gcloud
- Create a Cloud Storage bucket to hold the exported file.
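If you don't have a suitable bucket yet, you can create one with the same command that the REST tabs on this page use:

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME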
- Find the service account for the Cloud SQL instance you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account. For help with setting IAM permissions, see Using IAM permissions.
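For example, assuming SERVICE_ACCOUNT_EMAIL is the address you found in the previous step, the grant might look like this:

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin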
- Export the database:
  gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
    --database=DATABASE_NAME \
    --offload \
    --query=SELECT_QUERY
For information about using the export csv command, see the sql export csv command reference page.
- If you do not need to retain the IAM role you set previously, revoke it now.
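A sketch of the revocation, mirroring the grant above:

gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin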
REST v1
- Create a bucket for the export:
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step is not required, but it is strongly recommended so that you don't open up access to any other data.
- Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_csv_file: The path to the CSV file
- database_name: The name of a database inside the Cloud SQL instance
- offload: Enables serverless export. Set to true to use serverless export.
- select_query: SQL query for export (optional)
- escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
- lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a new line. (optional)
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": ["database_name"], "offload": true | false "csvExportOptions": { "selectQuery":"select_query", "escapeCharacter":"escape_character", "quoteCharacter":"quote_character", "fieldsTerminatedBy":"fields_terminated_by", "linesTerminatedBy":"lines_terminated_by" } } }
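One way to send the request is with curl (a sketch, assuming the JSON body above is saved as request.json and that you authenticate with gcloud):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

A successful request returns a JSON operation resource that you can poll to check the export status.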
If your select query specifies a database, it overrides the databases property.
- If you do not need to retain the IAM permissions you set previously, remove them now.
REST v1beta4
- Create a bucket for the export:
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step is not required, but it is strongly recommended so that you don't open up access to any other data.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Export your database:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_csv_file: The path to the CSV file
- database_name: The name of a database inside the Cloud SQL instance
- offload: Enables serverless export. Set to true to use serverless export.
- select_query: SQL query for export (optional)
- escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
- lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a new line. (optional)
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": ["database_name"], "offload": true | false "csvExportOptions": { "selectQuery": "select_query", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
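As with the v1 endpoint, a curl sketch (assuming the body above is saved as request.json):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

A successful request returns a JSON operation resource describing the export.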
If your select query specifies a database, it overrides the databases property.
- If you do not need to retain the IAM role you set previously, revoke it now.
Customize the format of a CSV export file
You can use gcloud or the REST API to customize your CSV file format. When you perform an export, you can specify the following formatting options:
| CSV option | Default value | gcloud flag | REST API property | Description |
|---|---|---|---|---|
| Escape | "5C": ASCII hex code for backslash. | --escape | escapeCharacter | Character that appears before a data character that needs to be escaped. Available only for MySQL and PostgreSQL. |
| Quote | "22": ASCII hex code for double quotes. | --quote | quoteCharacter | Character that encloses values from columns that have a string data type. Available only for MySQL and PostgreSQL. |
| Field delimiter | "2C": ASCII hex code for comma. | --fields-terminated-by | fieldsTerminatedBy | Character that splits column values. Available only for MySQL and PostgreSQL. |
| Newline character | "0A": ASCII hex code for newline. | --lines-terminated-by | linesTerminatedBy | Character that splits line records. Available only for MySQL. |
For example, a gcloud command using all of these arguments might look like the following:
gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
  --database=DATABASE_NAME \
  --offload \
  --query=SELECT_QUERY \
  --quote="22" \
  --escape="5C" \
  --fields-terminated-by="2C" \
  --lines-terminated-by="0A"
The equivalent REST API request body would look like this:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": ["DATABASE_NAME"], "offload": true, "csvExportOptions": { "selectQuery": "SELECT_QUERY", "escapeCharacter": "5C", "quoteCharacter": "22", "fieldsTerminatedBy": "2C", "linesTerminatedBy": "0A" } } }
The preceding gcloud and API examples are equivalent to running the following SQL statement:
SELECT [QUERY] INTO OUTFILE ... CHARACTER SET 'utf8mb4'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\\'
  LINES TERMINATED BY '\n'
Import data to Cloud SQL for MySQL
Required roles and permissions for importing to Cloud SQL for MySQL
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have one of the following roles:
- The Cloud SQL Admin role
- A custom role that includes the following permissions:
  - cloudsql.instances.get
  - cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following roles:
- The storage.objectAdmin IAM role
- A custom role that includes the following permissions:
  - storage.objects.get
  - storage.objects.list (for importing files in parallel only)
For help with IAM roles, see Identity and Access Management.
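For example, one way to grant the Cloud SQL Admin role to the user who runs the import (a sketch; USER_EMAIL is a placeholder):

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member=user:USER_EMAIL \
  --role=roles/cloudsql.admin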
Export data from your local MySQL server to a CSV file
To export a MySQL table for import into Cloud SQL, we recommend that you format the file using the following command:
mysql --host=INSTANCE_IP --user=USER_NAME --password DATABASE \
  -e "SELECT * FROM TABLE INTO OUTFILE 'FILE_NAME'
      CHARACTER SET 'utf8mb4'
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"'"
You can customize the format of your CSV export file.
Import data from a CSV file to Cloud SQL for MySQL
- The database and table you are importing into must exist on your Cloud SQL instance. For help with creating a database, see Creating a database. To create a table, use the CREATE TABLE SQL statement in the mysql client (see the example after this list).
- Your CSV file must conform to the CSV file format requirements.
- If you are importing data that you exported from an on-premises MySQL server:
  - Create a bucket in Cloud Storage.
  - Upload the CSV file to the Cloud Storage bucket.
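As a sketch of the table-creation step, using the guestbook.entries example from the export section (the column definitions here are hypothetical):

mysql --host=INSTANCE_IP --user=USER_NAME --password \
  -e "CREATE TABLE guestbook.entries (
        entryID INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
        guestName VARCHAR(255),
        content VARCHAR(255))"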
CSV file format requirements
CSV files must have one line for each row of data and use comma-separated fields.
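For example, rows for the hypothetical guestbook.entries table above might look like this:

1,"firstGuest","hello world!"
2,"secondGuest","I like pie."

The column order in the file must match the column order in the target table.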
To import data to a Cloud SQL instance using a CSV file:
Console
- In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- Click Import.
- In the Choose the file you'd like to import data from section, enter the
path to the bucket and CSV file to use for the import. Or to browse to the
file:
- Click Browse.
- In the Location section, double-click the name of the bucket in the list.
- Select the file in the list.
- Click Select.
You can import a compressed (.gz) or an uncompressed (.csv) file.
- In the Format section, select CSV.
- Specify the Database and Table in your Cloud SQL instance where you want to import the CSV file.
- Click Import to start the import.
gcloud
- Create a Cloud Storage bucket.
- Upload the CSV file to your bucket. For help with uploading files to buckets, see Uploading objects.
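For example, the upload might look like this:

gcloud storage cp FILE_NAME gs://BUCKET_NAME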
- Identify the service account for the Cloud SQL instance you're importing to. You can do this by running the gcloud sql instances describe command with the instance name. Look for the serviceAccountEmailAddress field in the output.
  gcloud sql instances describe INSTANCE_NAME
- Copy the value of the serviceAccountEmailAddress field.
- Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import the file:
gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
  --database=DATABASE_NAME \
  --table=TABLE_NAME
For information about using the import csv command, see the sql import csv command reference page.
- If you do not need to retain the IAM permissions you set previously, remove them using gcloud storage buckets remove-iam-policy-binding.
REST v1
- Create a Cloud Storage bucket.
- Upload the CSV file to your bucket. For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_csv_file: The path to the CSV file
- database_name: The name of a database inside the Cloud SQL instance
- table_name: The name of the database table
- escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
- lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a new line. (optional)
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": { "table": "table_name", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
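A curl sketch for sending the import request (assuming the JSON body above is saved as request.json):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

A successful request returns a JSON operation resource that you can poll to check the import status.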
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove the permissions.
REST v1beta4
- Create a Cloud Storage bucket.
- Upload the CSV file to your bucket. For help with uploading files to buckets, see Uploading objects.
- Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
- Import the file:
Before using any of the request data, make the following replacements:
- project-id: The project ID
- instance-id: The instance ID
- bucket_name: The Cloud Storage bucket name
- path_to_csv_file: The path to the CSV file
- database_name: The name of a database inside the Cloud SQL instance
- table_name: The name of the database table
- escape_character: The character that should appear before a data character that needs to be escaped. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- quote_character: The character that encloses values from columns that have a string data type. The value of this argument must be in ASCII hex format. For example, "22" represents double quotes. (optional)
- fields_terminated_by: The character that splits column values. The value of this argument must be in ASCII hex format. For example, "2C" represents a comma. (optional)
- lines_terminated_by: The character that splits line records. The value of this argument must be in ASCII hex format. For example, "0A" represents a new line. (optional)
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": { "table": "table_name", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
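As with the v1 endpoint, a curl sketch (assuming the body above is saved as request.json):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

A successful request returns a JSON operation resource describing the import.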
For the complete list of parameters for the request, see the instances:import page.
- If you do not need to retain the IAM permissions you set previously, remove the permissions.
Customize the format of a CSV file for Cloud SQL for MySQL
You can use gcloud or the REST API to customize your CSV file format. A sample gcloud command follows:
gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
  --database=DATABASE_NAME \
  --table=TABLE_NAME \
  --quote="22" \
  --escape="5C" \
  --fields-terminated-by="2C" \
  --lines-terminated-by="0A"
The equivalent REST API request body would look like this:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": ["DATABASE_NAME"], "csvImportOptions": { "table": "TABLE_NAME", "escapeCharacter": "5C", "quoteCharacter": "22", "fieldsTerminatedBy": "2C", "linesTerminatedBy": "0A" } } }
The preceding gcloud and API examples are equivalent to running the following SQL statement:
LOAD DATA LOCAL INFILE ... CHARACTER SET 'utf8mb4'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\\'
If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
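As a quick permissions check, you can inspect the bucket's IAM policy and confirm that the instance's service account has the expected role:

gcloud storage buckets get-iam-policy gs://BUCKET_NAME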
What's next
- Learn how to check the status of import and export operations.
- Learn more about best practices for importing and exporting data.
- Review known issues for imports and exports.