Take the export from a read replica. This might be a good option if you take exports frequently (daily or more often) and the amount of data being exported is small. To perform an export from a read replica, use the Google Cloud console, gcloud, or REST API export functions on your read replica instance. See Create read replicas for more information about how to create and manage read replicas.
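As a minimal sketch, an export from a read replica uses the same gcloud export command as any other export, just pointed at the replica instance. The instance, bucket, file, and database names below are placeholders, not values from this page.

```sh
# Export a SQL dump from the read replica (all names are placeholders).
# The .gz extension tells Cloud SQL to compress the exported file.
gcloud sql export sql my-read-replica \
    gs://my-bucket/replica-dump.sql.gz \
    --database=my-database
```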
Although Cloud SQL doesn't provide a built-in way to automate database exports, you can build your own automation tool using several Google Cloud components. To learn more, see this tutorial.
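The linked tutorial builds its automation with Cloud Scheduler, Pub/Sub, and Cloud Functions. As a simpler, hedged sketch of the same idea (not the tutorial's exact architecture), a Cloud Scheduler HTTP job can call the Cloud SQL Admin API export endpoint directly. The job name, project, instance, bucket, database, schedule, and service account below are placeholders, and the service account must have the Cloud SQL and Cloud Storage permissions required for exports.

```sh
# Hypothetical nightly export at 02:00, calling the Cloud SQL Admin API.
# All names (job, project, instance, bucket, service account) are placeholders.
gcloud scheduler jobs create http nightly-sql-export \
    --location=us-central1 \
    --schedule="0 2 * * *" \
    --uri="https://sqladmin.googleapis.com/v1/projects/my-project/instances/my-instance/export" \
    --http-method=POST \
    --oauth-service-account-email=sql-export-sa@my-project.iam.gserviceaccount.com \
    --headers="Content-Type=application/json" \
    --message-body='{"exportContext": {"fileType": "SQL", "uri": "gs://my-bucket/backups/dump.sql.gz", "databases": ["my-database"]}}'
```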
Troubleshooting
Troubleshooting import operations
Issue: Error message: permission denied for schema public

Troubleshooting: In PostgreSQL version 15 and later, if the target database is created from template0, importing data might fail. To resolve this issue, grant public schema privileges to the cloudsqlsuperuser user by running the GRANT ALL ON SCHEMA public TO cloudsqlsuperuser SQL command.
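For example, one way to apply that grant is to connect to the target database with psql through gcloud and run the statement from the section above. The instance, user, and database names here are placeholders.

```sh
# Connect to the target database (instance, user, and database are placeholders).
gcloud sql connect my-instance --user=postgres --database=my-database

# Then, at the psql prompt, run:
#   GRANT ALL ON SCHEMA public TO cloudsqlsuperuser;
```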
Issue: HTTP Error 409: Operation failed because another operation was already in progress

Troubleshooting: Cloud SQL allows only one import or export operation at a time for each instance. Wait for the operation that's already running to complete, and then retry the import.
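To see which operation is blocking the import, you can list recent operations on the instance and wait for the pending one to finish before retrying. The instance name and operation ID below are placeholders.

```sh
# List recent operations on the instance (instance name is a placeholder).
gcloud sql operations list --instance=my-instance --limit=10

# Wait for a specific pending operation to complete before retrying the import.
gcloud sql operations wait OPERATION_ID
```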
[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],["最終更新日 2025-09-04 UTC。"],[],[],null,["# Best practices for importing and exporting data\n\n\u003cbr /\u003e\n\n[MySQL](/sql/docs/mysql/import-export \"View this page for the MySQL database engine\") \\| PostgreSQL \\| [SQL Server](/sql/docs/sqlserver/import-export \"View this page for the SQL Server database engine\")\n\n\u003cbr /\u003e\n\nThis page provides best practices for importing and exporting data with Cloud SQL. For step-by-step instructions for importing data into Cloud SQL, see [Importing Data](/sql/docs/postgres/import-export/import-export-dmp). For step-by-step instructions for exporting your data, whether it is in Cloud SQL or an instance you manage, see [Exporting\nData](/sql/docs/postgres/import-export/import-export-dmp).\n\n\u003cbr /\u003e\n\n| **Note:** If you are migrating an entire database from a supported database server (on-premises, in AWS or Google Cloud) to a new Cloud SQL instance, you can use the [Database Migration Service](/database-migration/docs) instead of exporting and then importing files.\n\nBest practices for importing and exporting\n------------------------------------------\n\nThe following are best practices to consider when importing and\nexporting data:\n\n- [Don't use Cloud Storage Requester Pays buckets](#restrictions)\n- [Minimize the performance impact of exports](#serverless)\n- [Use the correct flags when you create a SQL dump file](#sqldump-flags)\n- [Compress data to reduce cost](#data-compression).\n- [Reduce long-running import and export processes](#long_running)\n- [Verify the imported database](#verify)\n\n### Don't use Cloud Storage Requester Pays buckets\n\nYou cannot use a Cloud Storage bucket that has\n[Requester Pays](/storage/docs/requester-pays) enabled for imports and exports\nfrom Cloud SQL.\n\n\u003cbr /\u003e\n\n### Minimize the performance impact of exports\n\nFor a standard export from Cloud SQL, the export is run while the database\nis online. When the data being exported is smaller, the impact is likely to\nbe minimal. However, when there are large databases, or large objects, such as BLOBs in the\ndatabase, there's the possibility that the export might degrade database\nperformance. This might impact the time it takes to perform database queries\nand operations against the database. After you start an export, it's not\npossible to stop it if your database starts to respond slowly.\n\nTo prevent slow responses during an export, you can:\n\n1. Take the export from a read replica. This might be a good option if you\n take exports frequently (daily or more often), but the amount of data being\n exported is small. To perform an export from a read replica, use the\n Google Cloud Console, `gcloud`, or REST API export functions on your read replica\n instance. See [Create read replicas](/sql/docs/postgres/replication/create-replica) for more information about how to create and manage\n read replicas.\n\n2. Use serverless export. With serverless export, Cloud SQL creates\n a separate, temporary instance to offload the export operation. 
Offloading\n the export operation allows databases on the primary instance to continue to\n serve queries and perform operations at the usual performance rate. When the\n data export is complete, the temporary instance is deleted automatically.\n This might be a good option if you're taking a one-time export of a large\n database. Use the Google Cloud Console, `gcloud`, or REST\n API [export functions](/sql/docs/postgres/import-export/import-export-dmp), with the `offload` flag, to perform a\n serverless export operation.\n\n During a serverless export operation you can run some other operations, such as\n instance edit, import, and failover. However, if you select `delete`,\n the export operation stops some time after you delete the instance, and it\n doesn't export any data.\n See the following table to learn about the operations that can be blocked while a serverless export operation is running:\n\n A serverless export takes longer to do than a standard export, because it takes\n time to create the temporary instance. At a minimum, it takes longer than five\n minutes, but for larger databases, it might be longer. Consider the impact to\n time, performance, and cost before determining which type of export to use.\n\n| **Note:** Serverless export costs extra. See the [pricing page](/sql/docs/postgres/pricing#serverless).\n\n\u003cbr /\u003e\n\n### Use the correct flags when you create a SQL dump file\n\nIf you do not use the right procedure when you export data to a SQL dump\nfile, your import might be unsuccessful. For information about creating a\nSQL dump file for import into Cloud SQL, see\n[Exporting data](/sql/docs/postgres/import-export/import-export-dmp).\n\n### Compress data to reduce cost\n\nCloud SQL supports importing and exporting both compressed and\nuncompressed files. Compression can save significant storage space on\nCloud Storage and reduce your storage costs, especially when you are\nexporting large instances.\n| **Note:** Compression can degrade export performance.\nWhen you export a SQL dump or CSV file, use a `.gz` file extension to compress the data. When you import a file with an extension of `.gz`, it is decompressed automatically.\n\n\u003cbr /\u003e\n\n### Reduce long-running import and export processes\n\nImports into Cloud SQL and exports out of Cloud SQL can take a long time to complete,\ndepending on the size of the data being processed. This can have the following impacts:\n\n- You can't stop a long-running Cloud SQL instance operation.\n- You can perform only one import or export operation at a time for each instance, and a long-running import or export blocks other operations, such as daily automated backups. Serverless exports allow you to run other operations, including editing instances, import, failover, and unblocking daily automated backups.\n\nYou can decrease the amount of time it takes to complete each operation by using the\nCloud SQL import or export functionality with smaller batches of data.\n\n\nFor exports, you can perform the export from a [read replica](/sql/docs/postgres/replication/create-replica) or use\n[serverless export](/sql/docs/postgres/import-export#serverless) to\nminimize the impact on database performance and allow other operations to run on your instance\nwhile an export is running.\n| **Note:** Serverless export costs extra. 
See the [pricing page](/sql/docs/postgres/pricing#export-offload).\nFor more tips, see [Diagnosing\nIssues with Cloud SQL Instances](/sql/docs/postgres/diagnose-issues#import-export).\n\n### Verify the imported database\n\nAfter an import operation is complete, connect to your database and run the\nappropriate database commands to make sure the contents are correct. For\nexample, [connect](/sql/docs/postgres/quickstart#connect) and\nlist the databases, tables, and specific entries.\n\nKnown limitations\n-----------------\n\nFor a list of known limitations, see\n[Issues with importing and exporting data](/sql/docs/postgres/known-issues#import-export).\n\nAutomating export operations\n----------------------------\n\n\nAlthough Cloud SQL doesn't provide a built-in way to automate database\nexports, you can build your own automation tool using several Google Cloud\ncomponents. To learn more, see\n[this tutorial](/architecture/scheduling-cloud-sql-database-exports-using-cloud-scheduler).\n\nTroubleshooting\n---------------\n\n### Troubleshooting import operations\n\n### Troubleshooting export operations\n\nWhat's next\n-----------\n\n- [Learn how to import and export data using PG dump files](/sql/docs/postgres/import-export/import-export-dmp).\n- [Learn how to import and export data using CSV files](/sql/docs/postgres/import-export/import-export-csv).\n- [Learn how to enable automatic backups](/sql/docs/postgres/backup-recovery/backups).\n- [Learn how to restore from backups](/sql/docs/postgres/backup-recovery/restore)."]]