Export query results to Amazon S3

This document describes how to export the result of a query that runs against a BigLake table to your Amazon Simple Storage Service (Amazon S3) bucket.
Before you begin

Ensure that you have the following resources:

- A connection to access your Amazon S3 bucket.
- An Amazon S3 BigLake table.
- The correct Amazon Web Services (AWS) Identity and Access Management (IAM) policy: you must have the PutObject permission to write data into the Amazon S3 bucket. For more information, see Create an AWS IAM policy for BigQuery (a minimal policy sketch follows this list).
- If you are on the capacity-based pricing model, ensure that you have enabled the BigQuery Reservation API for your project.
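For orientation only, a minimal AWS IAM policy granting the required write access might look like the following sketch. It is not the exact policy from the linked guide, and BUCKET_NAME is a placeholder for your bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKET_NAME/*"
    }
  ]
}
```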
Export query results

BigQuery Omni writes to the specified Amazon S3 location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Amazon S3 bucket.
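One way to confirm that the target prefix is empty before exporting, assuming you have the AWS CLI installed and configured (the bucket and path here are hypothetical):

```
aws s3 ls s3://your-bucket-name/exports/
```

If the command lists no objects, the export result cannot overwrite or mix with existing files under that prefix.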
To run a query, select one of the following options:

SQL

In the Query editor field, enter a GoogleSQL export query. GoogleSQL is the default syntax in the Google Cloud console.

1. In the Google Cloud console, go to the BigQuery page.
2. In the query editor, enter the following statement:

    ```googlesql
    EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
    OPTIONS(uri="s3://BUCKET_NAME/PATH", format="FORMAT", ...)
    AS QUERY
    ```

    Replace the following:
    - CONNECTION_REGION: the region where the connection was created.
    - CONNECTION_NAME: the name of the connection that you created with the permission needed to write to the Amazon S3 bucket.
    - BUCKET_NAME: the Amazon S3 bucket where you want to write the data.
    - PATH: the path where you want to write the exported file. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces `*` with 0000..N depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's name is replaced by 000000000000, and `*` in the second file's name is replaced by 000000000001.
    - FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
    - QUERY: the query to analyze the data that is stored in a BigLake table. The dataset that contains the BigLake table used in the query must be located in the same Amazon S3 region as the target Amazon S3 bucket. A filled-in example follows this list.
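For example, a fully substituted statement might look like the following sketch; the connection, bucket, table, and column names are hypothetical. With this URI, an export that produces two files writes exported-000000000000.parquet and exported-000000000001.parquet:

```googlesql
EXPORT DATA WITH CONNECTION `aws-us-east-1.s3-write-conn`
OPTIONS(uri="s3://example-bucket/exports/exported-*.parquet", format="PARQUET")
AS SELECT name, number FROM `my_project.my_dataset.my_biglake_table` WHERE number > 100;
```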
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation. To authenticate to BigQuery, set up Application Default Credentials.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

// Sample to export query results to Amazon S3 bucket
public class ExportQueryResultsToS3 {

  public static void main(String[] args) throws InterruptedException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String externalTableName = "MY_EXTERNAL_TABLE_NAME";
    // connectionName should be in the format of connection_region.connection_name, e.g.
    // aws-us-east-1.s3-write-conn
    String connectionName = "MY_CONNECTION_REGION.MY_CONNECTION_NAME";
    // destinationUri must contain exactly one * anywhere in the leaf directory of the path
    // string, e.g. ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*.
    // BigQuery replaces * with 0000..N depending on the number of files exported.
    // BigQuery determines the file count and sizes.
    String destinationUri = "s3://your-bucket-name/*";
    String format = "EXPORT_FORMAT";
    // Export result of query to find states starting with 'W'
    String query =
        String.format(
            "EXPORT DATA WITH CONNECTION `%s` OPTIONS(uri='%s', format='%s') "
                + "AS SELECT * FROM %s.%s.%s WHERE name LIKE 'W%%'",
            connectionName, destinationUri, format, projectId, datasetName, externalTableName);
    exportQueryResultsToS3(query);
  }

  public static void exportQueryResultsToS3(String query) throws InterruptedException {
    try {
      // Initialize client that will be used to send requests. This client only needs to be
      // created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableResult results = bigquery.query(QueryJobConfiguration.of(query));

      results
          .iterateAll()
          .forEach(row -> row.forEach(val -> System.out.printf("%s,", val.toString())));

      System.out.println("Query results exported to Amazon S3 successfully.");
    } catch (BigQueryException e) {
      System.out.println("Query not performed \n" + e.toString());
    }
  }
}
```
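The sample uses Application Default Credentials and the default project. If you need to target a different project, you can set it explicitly when building the client; a minimal sketch, assuming MY_PROJECT_ID is the ID of your Google Cloud project:

```java
// Build a client bound to an explicit project instead of the default one.
BigQuery bigquery =
    BigQueryOptions.newBuilder().setProjectId("MY_PROJECT_ID").build().getService();
```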
Troubleshooting

If you get an error related to quota failure, check whether you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
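One way to inspect existing reservations is to query the reservation INFORMATION_SCHEMA views from your administration project. The region qualifier and column names below are an assumption based on the general shape of these views; adjust them for your region and project:

```googlesql
-- Assumption: the RESERVATIONS view is available in your region and you have
-- permission to list reservations in the administration project.
SELECT reservation_name, slot_capacity
FROM `region-us`.INFORMATION_SCHEMA.RESERVATIONS;
```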
[[["Facile à comprendre","easyToUnderstand","thumb-up"],["J'ai pu résoudre mon problème","solvedMyProblem","thumb-up"],["Autre","otherUp","thumb-up"]],[["Difficile à comprendre","hardToUnderstand","thumb-down"],["Informations ou exemple de code incorrects","incorrectInformationOrSampleCode","thumb-down"],["Il n'y a pas l'information/les exemples dont j'ai besoin","missingTheInformationSamplesINeed","thumb-down"],["Problème de traduction","translationIssue","thumb-down"],["Autre","otherDown","thumb-down"]],["Dernière mise à jour le 2025/09/04 (UTC)."],[[["\u003cp\u003eThis document guides users on exporting BigQuery query results from a BigLake table to an Amazon S3 bucket.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, users must have an established connection to their Amazon S3 bucket, an Amazon S3 BigLake table, and the necessary AWS IAM \u003ccode\u003ePutObject\u003c/code\u003e permissions.\u003c/p\u003e\n"],["\u003cp\u003eThe export process utilizes an \u003ccode\u003eEXPORT DATA\u003c/code\u003e statement within a GoogleSQL query, specifying connection details, the S3 bucket URI, the desired file format (JSON, AVRO, CSV, or PARQUET), and the query itself.\u003c/p\u003e\n"],["\u003cp\u003eWhen exporting, it is recommended to use an empty Amazon S3 bucket, as existing data can be overwritten or mixed with the new query results.\u003c/p\u003e\n"],["\u003cp\u003eFor users on the capacity-based pricing model, the BigQuery Reservation API must be enabled, and if quota failure issues arise, reserved capacity for queries needs to be checked.\u003c/p\u003e\n"]]],[],null,["# Export query results to Amazon S3\n=================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAmazon Simple Storage Service (Amazon S3) bucket.\n\nFor information about how data flows between BigQuery and\nAmazon S3,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Amazon S3 bucket](/bigquery/docs/omni-aws-create-connection).\n- An [Amazon S3 BigLake table](/bigquery/docs/omni-aws-create-external-table).\n- The correct Amazon Web Services (AWS) Identity and Access Management (IAM) policy:\n - You must have the `PutObject` permission to write data into the Amazon S3 bucket. For more information, see [Create an AWS IAM policy for BigQuery](/bigquery/docs/omni-aws-create-connection#creating-aws-iam-policy).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified\nAmazon S3 location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. 
We recommend that you export the query result to an empty\nAmazon S3 bucket.\n\nTo run a query, select one of the following options: \n\n### SQL\n\nIn the **Query editor** field, enter a GoogleSQL export query.\nGoogleSQL is the default syntax in the Google Cloud console.\n\n\u003cbr /\u003e\n\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. In the query editor, enter the following statement:\n\n ```googlesql\n EXPORT DATA WITH CONNECTION `\u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e.\u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e`\n OPTIONS(uri=\"s3://\u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e\", format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\", ...)\n AS QUERY\n ```\n\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the Amazon S3 bucket.\n - \u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e: the Amazon S3 bucket where you want to write the data.\n - \u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table. 
The dataset that contains the BigLake table used in the query must be located in the same [Amazon S3 region](/bigquery/docs/omni-introduction#locations) as the target Amazon S3 bucket.\n\n \u003cbr /\u003e\n\n - Click play_circle **Run**.\n\n \u003cbr /\u003e\n\nFor more information about how to run queries, see [Run an interactive query](/bigquery/docs/running-queries#queries).\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryException.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryOptions.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.QueryJobConfiguration.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html;\n\n // Sample to export query results to Amazon S3 bucket\n public class ExportQueryResultsToS3 {\n\n public static void main(String[] args) throws InterruptedException {\n // TODO(developer): Replace these variables before running the sample.\n String projectId = \"MY_PROJECT_ID\";\n String datasetName = \"MY_DATASET_NAME\";\n String externalTableName = \"MY_EXTERNAL_TABLE_NAME\";\n // connectionName should be in the format of connection_region.connection_name. e.g.\n // aws-us-east-1.s3-write-conn\n String connectionName = \"MY_CONNECTION_REGION.MY_CONNECTION_NAME\";\n // destinationUri must contain exactly one * anywhere in the leaf directory of the path string\n // e.g. ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*\n // BigQuery replaces * with 0000..N depending on the number of files exported.\n // BigQuery determines the file count and sizes.\n String destinationUri = \"s3://your-bucket-name/*\";\n String format = \"EXPORT_FORMAT\";\n // Export result of query to find states starting with 'W'\n String query =\n String.format(\n \"EXPORT DATA WITH CONNECTION `%s` OPTIONS(uri='%s', format='%s') \"\n + \"AS SELECT * FROM %s.%s.%s WHERE name LIKE 'W%%'\",\n connectionName, destinationUri, format, projectId, datasetName, externalTableName);\n exportQueryResultsToS3(query);\n }\n\n public static void exportQueryResultsToS3(String query) throws InterruptedException {\n try {\n // Initialize client that will be used to send requests. 
This client only needs to be created\n // once, and can be reused for multiple requests.\n https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html bigquery = https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryOptions.html.getDefaultInstance().getService();\n\n https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html results = bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html#com_google_cloud_bigquery_BigQuery_query_com_google_cloud_bigquery_QueryJobConfiguration_com_google_cloud_bigquery_BigQuery_JobOption____(https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.QueryJobConfiguration.html.of(query));\n\n results\n .https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html#com_google_cloud_bigquery_TableResult_iterateAll__()\n .forEach(row -\u003e row.forEach(val -\u003e System.out.printf(\"%s,\", val.toString())));\n\n System.out.println(\"Query results exported to Amazon S3 successfully.\");\n } catch (https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryException.html e) {\n System.out.println(\"Query not performed \\n\" + e.toString());\n }\n }\n }\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Amazon S3](/bigquery/docs/query-aws-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]