Although Cloud SQL doesn't provide a built-in way to automate database exports, you can build your own automation tool using several Google Cloud components. To learn more, see [this tutorial](/architecture/scheduling-cloud-sql-database-exports-using-cloud-scheduler).
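The linked tutorial drives exports on a schedule with Cloud Scheduler; whichever components you assemble, the core of the automation is a call to the Cloud SQL Admin API's `instances.export` method. Below is a minimal sketch of that call using the Python API client. The project, instance, database, and bucket names are placeholders, and the `offload` flag (serverless export) and `.gz` compression are optional choices discussed earlier on this page.

```python
# A minimal sketch of the export call that a scheduled automation tool would
# make. The project, instance, database, and bucket names are placeholders;
# authentication is assumed to come from Application Default Credentials.
from googleapiclient import discovery  # pip install google-api-python-client

def export_database(project: str, instance: str, database: str, bucket: str) -> dict:
    """Start a SQL dump export of one database to a Cloud Storage bucket."""
    sqladmin = discovery.build("sqladmin", "v1")
    body = {
        "exportContext": {
            "fileType": "SQL",
            # A .gz extension makes Cloud SQL compress the dump on export.
            "uri": f"gs://{bucket}/{instance}-{database}.sql.gz",
            "databases": [database],
            "offload": True,  # serverless export; drop this for a standard export
        }
    }
    # instances.export returns a long-running operation; poll it (or use
    # `gcloud sql operations wait`) before assuming the dump is complete.
    return sqladmin.instances().export(
        project=project, instance=instance, body=body
    ).execute()

if __name__ == "__main__":
    operation = export_database("my-project", "my-instance", "my-database", "my-export-bucket")
    print(operation["name"], operation["status"])
```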
Troubleshooting
---------------

### Troubleshooting import operations
| Issue | Troubleshooting |
| --- | --- |
| Error message: `permission denied for schema public` | For PostgreSQL version 15 and later, if the target database was created from `template0`, importing data might fail. To resolve this issue, grant privileges on the public schema to the `cloudsqlsuperuser` user by running the `GRANT ALL ON SCHEMA public TO cloudsqlsuperuser` SQL command. |
| `HTTP Error 409: Operation failed because another operation was already in progress.` | Your instance already has a pending operation. Only one operation is allowed at a time. Try your request after the current operation is complete. |
| The import operation is taking too long. | Too many active connections can interfere with import operations. Close unused operations. Check the CPU and memory usage of your Cloud SQL instance to make sure there are plenty of resources available. The best way to ensure maximum resources for the import is to restart the instance before beginning the operation. A scripted version of these checks is sketched after this table. |
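The first and last rows of this table lend themselves to a pre-import script: granting the `cloudsqlsuperuser` user privileges on the `public` schema for PostgreSQL 15 and later targets, and clearing out idle connections that would otherwise compete with the import for resources. The following is a hedged sketch of such a check using psycopg2; the connection string, the ten-minute idle threshold, and the choice to terminate sessions automatically are illustrative assumptions, not part of this guide.

```python
# Hedged sketch of pre-import housekeeping on the target instance. Assumes
# psycopg2 and a connection with enough privileges; the DSN and the idle
# threshold below are placeholders.
import psycopg2

# PostgreSQL 15+: a database created from template0 may need this grant
# before an import can write to the public schema.
GRANT_PUBLIC_SCHEMA = "GRANT ALL ON SCHEMA public TO cloudsqlsuperuser"

FIND_IDLE_BACKENDS = """
    SELECT pid, usename, now() - state_change AS idle_for
    FROM pg_stat_activity
    WHERE state = 'idle'
      AND pid <> pg_backend_pid()
      AND now() - state_change > interval '10 minutes'
"""

def prepare_for_import(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn:
        conn.autocommit = True
        with conn.cursor() as cur:
            cur.execute(GRANT_PUBLIC_SCHEMA)
            cur.execute(FIND_IDLE_BACKENDS)
            for pid, user, idle_for in cur.fetchall():
                # Review this list before terminating in production; you may
                # want to exclude service accounts such as cloudsqladmin.
                print(f"terminating idle backend {pid} ({user}, idle for {idle_for})")
                cur.execute("SELECT pg_terminate_backend(%s)", (pid,))

if __name__ == "__main__":
    # Placeholder connection string; supply real credentials in practice.
    prepare_for_import("host=127.0.0.1 dbname=my-database user=postgres")
```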
[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-04。"],[],[],null,["# Best practices for importing and exporting data\n\n\u003cbr /\u003e\n\n[MySQL](/sql/docs/mysql/import-export \"View this page for the MySQL database engine\") \\| PostgreSQL \\| [SQL Server](/sql/docs/sqlserver/import-export \"View this page for the SQL Server database engine\")\n\n\u003cbr /\u003e\n\nThis page provides best practices for importing and exporting data with Cloud SQL. For step-by-step instructions for importing data into Cloud SQL, see [Importing Data](/sql/docs/postgres/import-export/import-export-dmp). For step-by-step instructions for exporting your data, whether it is in Cloud SQL or an instance you manage, see [Exporting\nData](/sql/docs/postgres/import-export/import-export-dmp).\n\n\u003cbr /\u003e\n\n| **Note:** If you are migrating an entire database from a supported database server (on-premises, in AWS or Google Cloud) to a new Cloud SQL instance, you can use the [Database Migration Service](/database-migration/docs) instead of exporting and then importing files.\n\nBest practices for importing and exporting\n------------------------------------------\n\nThe following are best practices to consider when importing and\nexporting data:\n\n- [Don't use Cloud Storage Requester Pays buckets](#restrictions)\n- [Minimize the performance impact of exports](#serverless)\n- [Use the correct flags when you create a SQL dump file](#sqldump-flags)\n- [Compress data to reduce cost](#data-compression).\n- [Reduce long-running import and export processes](#long_running)\n- [Verify the imported database](#verify)\n\n### Don't use Cloud Storage Requester Pays buckets\n\nYou cannot use a Cloud Storage bucket that has\n[Requester Pays](/storage/docs/requester-pays) enabled for imports and exports\nfrom Cloud SQL.\n\n\u003cbr /\u003e\n\n### Minimize the performance impact of exports\n\nFor a standard export from Cloud SQL, the export is run while the database\nis online. When the data being exported is smaller, the impact is likely to\nbe minimal. However, when there are large databases, or large objects, such as BLOBs in the\ndatabase, there's the possibility that the export might degrade database\nperformance. This might impact the time it takes to perform database queries\nand operations against the database. After you start an export, it's not\npossible to stop it if your database starts to respond slowly.\n\nTo prevent slow responses during an export, you can:\n\n1. Take the export from a read replica. This might be a good option if you\n take exports frequently (daily or more often), but the amount of data being\n exported is small. To perform an export from a read replica, use the\n Google Cloud Console, `gcloud`, or REST API export functions on your read replica\n instance. See [Create read replicas](/sql/docs/postgres/replication/create-replica) for more information about how to create and manage\n read replicas.\n\n2. Use serverless export. With serverless export, Cloud SQL creates\n a separate, temporary instance to offload the export operation. 
Offloading\n the export operation allows databases on the primary instance to continue to\n serve queries and perform operations at the usual performance rate. When the\n data export is complete, the temporary instance is deleted automatically.\n This might be a good option if you're taking a one-time export of a large\n database. Use the Google Cloud Console, `gcloud`, or REST\n API [export functions](/sql/docs/postgres/import-export/import-export-dmp), with the `offload` flag, to perform a\n serverless export operation.\n\n During a serverless export operation you can run some other operations, such as\n instance edit, import, and failover. However, if you select `delete`,\n the export operation stops some time after you delete the instance, and it\n doesn't export any data.\n See the following table to learn about the operations that can be blocked while a serverless export operation is running:\n\n A serverless export takes longer to do than a standard export, because it takes\n time to create the temporary instance. At a minimum, it takes longer than five\n minutes, but for larger databases, it might be longer. Consider the impact to\n time, performance, and cost before determining which type of export to use.\n\n| **Note:** Serverless export costs extra. See the [pricing page](/sql/docs/postgres/pricing#serverless).\n\n\u003cbr /\u003e\n\n### Use the correct flags when you create a SQL dump file\n\nIf you do not use the right procedure when you export data to a SQL dump\nfile, your import might be unsuccessful. For information about creating a\nSQL dump file for import into Cloud SQL, see\n[Exporting data](/sql/docs/postgres/import-export/import-export-dmp).\n\n### Compress data to reduce cost\n\nCloud SQL supports importing and exporting both compressed and\nuncompressed files. Compression can save significant storage space on\nCloud Storage and reduce your storage costs, especially when you are\nexporting large instances.\n| **Note:** Compression can degrade export performance.\nWhen you export a SQL dump or CSV file, use a `.gz` file extension to compress the data. When you import a file with an extension of `.gz`, it is decompressed automatically.\n\n\u003cbr /\u003e\n\n### Reduce long-running import and export processes\n\nImports into Cloud SQL and exports out of Cloud SQL can take a long time to complete,\ndepending on the size of the data being processed. This can have the following impacts:\n\n- You can't stop a long-running Cloud SQL instance operation.\n- You can perform only one import or export operation at a time for each instance, and a long-running import or export blocks other operations, such as daily automated backups. Serverless exports allow you to run other operations, including editing instances, import, failover, and unblocking daily automated backups.\n\nYou can decrease the amount of time it takes to complete each operation by using the\nCloud SQL import or export functionality with smaller batches of data.\n\n\nFor exports, you can perform the export from a [read replica](/sql/docs/postgres/replication/create-replica) or use\n[serverless export](/sql/docs/postgres/import-export#serverless) to\nminimize the impact on database performance and allow other operations to run on your instance\nwhile an export is running.\n| **Note:** Serverless export costs extra. 
See the [pricing page](/sql/docs/postgres/pricing#export-offload).\nFor more tips, see [Diagnosing\nIssues with Cloud SQL Instances](/sql/docs/postgres/diagnose-issues#import-export).\n\n### Verify the imported database\n\nAfter an import operation is complete, connect to your database and run the\nappropriate database commands to make sure the contents are correct. For\nexample, [connect](/sql/docs/postgres/quickstart#connect) and\nlist the databases, tables, and specific entries.\n\nKnown limitations\n-----------------\n\nFor a list of known limitations, see\n[Issues with importing and exporting data](/sql/docs/postgres/known-issues#import-export).\n\nAutomating export operations\n----------------------------\n\n\nAlthough Cloud SQL doesn't provide a built-in way to automate database\nexports, you can build your own automation tool using several Google Cloud\ncomponents. To learn more, see\n[this tutorial](/architecture/scheduling-cloud-sql-database-exports-using-cloud-scheduler).\n\nTroubleshooting\n---------------\n\n### Troubleshooting import operations\n\n### Troubleshooting export operations\n\nWhat's next\n-----------\n\n- [Learn how to import and export data using PG dump files](/sql/docs/postgres/import-export/import-export-dmp).\n- [Learn how to import and export data using CSV files](/sql/docs/postgres/import-export/import-export-csv).\n- [Learn how to enable automatic backups](/sql/docs/postgres/backup-recovery/backups).\n- [Learn how to restore from backups](/sql/docs/postgres/backup-recovery/restore)."]]