Best practices for importing and exporting
The following are best practices to consider when importing and exporting data:
- Don't use Cloud Storage Requester Pays buckets
- Compress data to reduce cost
- Reduce long-running import and export processes
- Use SqlPackage for importing and exporting data
- Use the bcp utility for importing and exporting data
- Use striped import and export
- Verify the imported database
Don't use Cloud Storage Requester Pays buckets
You cannot use a Cloud Storage bucket that has Requester Pays enabled for imports and exports from Cloud SQL.
Compress data to reduce cost
Cloud SQL supports importing and exporting both compressed and uncompressed files. Compression can save significant storage space on Cloud Storage and reduce your storage costs, especially when you are exporting large instances.
When you export a BAK file, use a .gz file extension to compress the data. When you import a file with a .gz extension, it is decompressed automatically.
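For example, the following gcloud CLI command exports a compressed BAK file by giving the destination object a .gz extension (a sketch; the instance, bucket, and database names are placeholders):
gcloud sql export bak my-instance gs://my-bucket/sqlserverdb.bak.gz --database=myDatabase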
Reduce long-running import and export processes
Imports into Cloud SQL and exports out of Cloud SQL can take a long time to complete, depending on the size of the data being processed. This can have the following impacts:
- You can't stop a long-running Cloud SQL instance operation.
- You can perform only one import or export operation at a time for each instance, and a long-running import or export blocks other operations, such as daily automated backups.
You can decrease the amount of time it takes to complete each operation by using the Cloud SQL import or export functionality with smaller batches of data.
For whole database migrations, use BAK files rather than SQL files for imports. Importing from a SQL file generally takes much longer than importing from a BAK file.
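For example, assuming a BAK file staged in Cloud Storage, a whole-database import can be run with the gcloud CLI (the instance, bucket, and database names below are placeholders):
gcloud sql import bak my-instance gs://my-bucket/sqlserverdb.bak --database=myDatabase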
Use SqlPackage for importing and exporting data
You can import and export data in Cloud SQL by using SqlPackage. It enables you to export a SQL database, including database schema and user data, to a BACPAC file (.bacpac) and to import the schema and table data from a BACPAC file into a new user database.
SqlPackage uses your credentials to connect to SQL Server to perform database imports and exports. It makes migrations available for all Cloud SQL users. To perform import and export operations, you must have the following:
- A workstation that is connected to your instance, where you can run SqlPackage. To learn more about connectivity options, see About connection options.
- SqlPackage installed on your system. To learn more about downloading and installing SqlPackage, see the Microsoft documentation.
- Credentials set up to access your instance. To learn more about setting up credentials, see How to authenticate to Cloud SQL.
Examples
Import
To import data to a database named AdventureWorks2017, run the following command:
c:\Program Files\Microsoft SQL Server\160\DAC\bin>SqlPackage /Action:Import /tsn:myTargetServer /tdn:AdventureWorks2017 /tu:myUsername /sf:mySourceFile /TargetTrustServerCertificate:True /tp:myPassword
Here:
- mySourceFile is the source file that you want to use as the source of the action from local storage. If you use this parameter, no other source parameter is valid.
- myTargetServer is the name of the server hosting the target database.
- myUsername is the SQL Server username that you want to use to access the target database.
- myPassword is your password in the credentials.
To learn more, see the Microsoft documentation.
Export
To export data from a database named AdventureWorks2017, run the following command:
c:\Program Files\Microsoft SQL Server\160\DAC\bin>SqlPackage /Action:Export /TargetFile:"myTargetFile" /ssn:mySourceServer /su:myUsername /sdn:AdventureWorks2017 /SourceTrustServerCertificate:True /sp:myPassword
Here:
- myTargetFile is the target file (a .bacpac file) that you want to use as the target of the action instead of a database. If you use this parameter, no other target parameter is valid. This parameter is invalid for actions that only support database targets.
- myUsername is the SQL Server username that you want to use to access the source database.
- mySourceServer is the name of the server hosting the source database.
- myPassword is your password in the credentials.
To learn more, see the Microsoft documentation.
Use the bcp utility for importing and exporting data
Another option to import and export data in Cloud SQL is using the bulk copy program (bcp) utility. By using the bcp utility, you can export data from a SQL Server database into a data file and import data from a data file into a SQL Server database. The bcp utility uses your credentials to connect to SQL Server to perform database imports and exports. It makes transfers available for all Cloud SQL users. To perform import and export operations, you must have the following:
- A workstation where you can run the bcp utility, and that has connectivity to your Cloud SQL instance. To learn more about connectivity options, see About connection options.
- The bcp utility installed on your system. To learn more about downloading and installing bcp, see the Microsoft documentation.
- Credentials set up to access your instance. To learn more about setting up credentials, see How to authenticate to Cloud SQL.
Examples
Import
To import data from the person.csv file to the Person table of the AdventureWorks2017 database, run the following command:
bcp Person.Person in "person.csv" -d AdventureWorks2017 -U myLoginID -S myServer
Here:
- myLoginID is the login ID used to connect to SQL Server.
- myServer is the instance of SQL Server to which you want to connect. If you don't specify a server, the bcp utility connects to the default instance of SQL Server on the local computer.
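If you don't specify a data format, bcp prompts interactively for the type of each field. For a non-interactive import of a comma-separated file, you can add the standard -c (character data) and -t (field terminator) switches, for example:
bcp Person.Person in "person.csv" -d AdventureWorks2017 -U myLoginID -S myServer -c -t,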
To learn more, see the Microsoft documentation.
Export
To export data from the Person table of the AdventureWorks2017 database to the person.dat file, run the following command:
bcp Person.Person out "person.dat" -U myLoginID -S myServer -d AdventureWorks2017
Here:
- myLoginID is the login ID used to connect to SQL Server.
- myServer is the instance of SQL Server to which you want to connect. If you don't specify a server, the bcp utility connects to the default instance of SQL Server on the local computer.
To learn more, see the Microsoft documentation.
Use striped import and export
Performing a striped import or export reduces the time it takes for the operation to complete and enables databases larger than 5 TB to be imported and exported. For more information, see Export and import using BAK files.
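For example, a striped export to a Cloud Storage folder might look like the following gcloud CLI command (a sketch that assumes the --striped flag of gcloud sql export bak; the instance, bucket, and database names are placeholders):
gcloud sql export bak my-instance gs://my-bucket/striped-folder --database=myDatabase --striped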
Verify the imported database
After an import operation is complete, connect to your database and run the appropriate database commands to make sure the contents are correct. For example, connect and list the databases, tables, and specific entries.
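For example, assuming you imported the AdventureWorks2017 database used earlier, you could connect with sqlcmd and run a couple of sanity checks (the table name is a placeholder for whatever you expect in your data):
sqlcmd -S myServer -U myLoginID -d AdventureWorks2017 -Q "SELECT name FROM sys.tables; SELECT COUNT(*) FROM Person.Person;"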
Known limitations
For a list of known limitations, see Issues with importing and exporting data.
Automating export operations
Although Cloud SQL doesn't provide a built-in way to automate database exports, you can build your own automation tool using several Google Cloud components. To learn more, see this tutorial.
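As an illustration of one building block of such a tool, a Cloud Scheduler job could call the Cloud SQL Admin API export endpoint on a schedule. The following is a sketch only; the project, instance, bucket, database, and service account values are placeholders, and the tutorial linked above adds Pub/Sub and Cloud Functions around this idea for error handling:
gcloud scheduler jobs create http nightly-sql-export \
  --schedule="0 2 * * *" \
  --http-method=POST \
  --uri="https://sqladmin.googleapis.com/v1/projects/my-project/instances/my-instance/export" \
  --message-body='{"exportContext":{"fileType":"BAK","uri":"gs://my-bucket/nightly.bak","databases":["myDatabase"]}}' \
  --oauth-service-account-email=my-sa@my-project.iam.gserviceaccount.com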
Troubleshooting
Troubleshooting import operations
Issue | Troubleshooting |
---|---|
HTTP Error 409: Operation failed because another operation was already in progress. | There is already a pending operation for your instance. Only one operation is allowed at a time. Try your request after the current operation is complete. |
The import operation is taking too long. | Too many active connections can interfere with import operations. Close unused connections. Check the CPU and memory usage of your Cloud SQL instance to make sure there are plenty of resources available. The best way to ensure maximum resources for the import is to restart the instance before beginning the operation; a restart closes all connections and ends any tasks that might be consuming resources. |
An import operation can fail when one or more users referenced in the dump file don't exist. | Before importing a dump file, all the database users who own objects or were granted permissions on objects in the dumped database must exist in the target database. If they don't, the import operation fails to recreate the objects with the original ownership or permissions. Create the database users before importing. |
Troubleshooting export operations
Issue | Troubleshooting |
---|---|
HTTP Error 409: Operation failed because another operation was already in progress. | There is already a pending operation for your instance. Only one operation is allowed at a time. Try your request after the current operation is complete. |
HTTP Error 403: The service account does not have the required permissions for the bucket. | Ensure that the bucket exists and the service account for the Cloud SQL instance (which is performing the export) has the Storage Object Creator role (roles/storage.objectCreator) to allow export to the bucket. See IAM roles for Cloud Storage. |
You want exports to be automated. | Cloud SQL does not provide a way to automate exports. You could build your own automated export system using Google Cloud products such as Cloud Scheduler, Pub/Sub, and Cloud Functions, similar to this article on automating backups. |
What's next
- Learn how to import and export data using BAK files.
- Learn how to import data using SQL dump files.
- Learn how to enable automatic backups.
- Learn how to restore from backups.