Export query results to Amazon S3
This document describes how to export the result of a query that runs against a BigLake table to your Amazon Simple Storage Service (Amazon S3) bucket.

Before you begin

Ensure that you have the following resources:

A connection to access your Amazon S3 bucket.
An Amazon S3 BigLake table.
The correct Amazon Web Services (AWS) Identity and Access Management (IAM) policy: you must have the PutObject permission to write data into the Amazon S3 bucket. A minimal policy sketch follows this list.
If you are on the capacity-based pricing model, ensure that you have enabled the BigQuery Reservation API for your project. For information about pricing, see BigQuery Omni pricing.
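For the PutObject requirement, a minimal AWS IAM policy might look like the following sketch. The bucket name is a placeholder, and your policy may need additional statements depending on how your connection is configured; see Create an AWS IAM policy for BigQuery for the authoritative steps.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBigQueryExportWrites",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKET_NAME/*"
    }
  ]
}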
Export query results

BigQuery Omni writes to the specified Amazon S3 location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Amazon S3 bucket.
To run a query, select one of the following options:
SQL
In the Query editor field, enter a GoogleSQL export query. GoogleSQL is the default syntax in the Google Cloud console.

1. In the Google Cloud console, go to the BigQuery page.
2. In the query editor, enter the following statement:

EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
OPTIONS(uri="s3://BUCKET_NAME/PATH", format="FORMAT", ...)
AS QUERY

Replace the following:
CONNECTION_REGION: the region where the connection was created.
CONNECTION_NAME: the name of the connection that you created with the necessary permission to write to the Amazon S3 bucket.
BUCKET_NAME: the Amazon S3 bucket where you want to write the data.
PATH: the path where you want to write the exported file. It must contain exactly one wildcard * anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N depending on the number of files exported, and BigQuery determines the file count and sizes. If BigQuery decides to export two files, then * in the first file's filename is replaced by 000000000000, and * in the second file's filename is replaced by 000000000001 (see the worked example after these steps).
FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
QUERY: the query to analyze the data that is stored in a BigLake table. The dataset that contains the BigLake table used in the query must be located in the same Amazon S3 region as the target Amazon S3 bucket.

3. Click Run.

For more information about how to run queries, see Run an interactive query.
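As a worked illustration, a complete statement might look like the following sketch. The connection name (aws-us-east-1.s3-write-conn), bucket, dataset, and table names are hypothetical placeholders:

EXPORT DATA WITH CONNECTION `aws-us-east-1.s3-write-conn`
OPTIONS(uri="s3://my-export-bucket/results/*.csv", format="CSV", header=true)
AS SELECT id, name FROM mydataset.my_biglake_table WHERE name LIKE 'W%'

With this uri, if BigQuery exports two files, they are written to my-export-bucket as results/000000000000.csv and results/000000000001.csv.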
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation. To authenticate to BigQuery, set up Application Default Credentials.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

// Sample to export query results to Amazon S3 bucket
public class ExportQueryResultsToS3 {

  public static void main(String[] args) throws InterruptedException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String externalTableName = "MY_EXTERNAL_TABLE_NAME";
    // connectionName should be in the format of connection_region.connection_name. e.g.
    // aws-us-east-1.s3-write-conn
    String connectionName = "MY_CONNECTION_REGION.MY_CONNECTION_NAME";
    // destinationUri must contain exactly one * anywhere in the leaf directory of the path string
    // e.g. ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*
    // BigQuery replaces * with 0000..N depending on the number of files exported.
    // BigQuery determines the file count and sizes.
    String destinationUri = "s3://your-bucket-name/*";
    String format = "EXPORT_FORMAT";
    // Export result of query to find states starting with 'W'
    String query =
        String.format(
            "EXPORT DATA WITH CONNECTION `%s` OPTIONS(uri='%s', format='%s') "
                + "AS SELECT * FROM %s.%s.%s WHERE name LIKE 'W%%'",
            connectionName, destinationUri, format, projectId, datasetName, externalTableName);
    exportQueryResultsToS3(query);
  }

  public static void exportQueryResultsToS3(String query) throws InterruptedException {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableResult results = bigquery.query(QueryJobConfiguration.of(query));

      results
          .iterateAll()
          .forEach(row -> row.forEach(val -> System.out.printf("%s,", val.toString())));

      System.out.println("Query results exported to Amazon S3 successfully.");
    } catch (BigQueryException e) {
      System.out.println("Query not performed \n" + e.toString());
    }
  }
}
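Before running the sample, replace the placeholder values, including EXPORT_FORMAT, which must be one of the supported format strings (JSON, AVRO, CSV, or PARQUET). The sample assumes that Application Default Credentials are configured for a principal that is allowed to use the connection; it prints the returned rows to standard output after the export completes.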
Troubleshooting
If you get an error related to quota failure, check whether you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
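One way to inspect your existing slot reservations is to query the INFORMATION_SCHEMA reservation views from your reservation administration project. This is a sketch, not the only method; the region qualifier (region-us here) is an example and should match the location of your reservations:

SELECT reservation_name, slot_capacity
FROM `region-us`.INFORMATION_SCHEMA.RESERVATIONS;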
[[["Fácil de comprender","easyToUnderstand","thumb-up"],["Resolvió mi problema","solvedMyProblem","thumb-up"],["Otro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Información o código de muestra incorrectos","incorrectInformationOrSampleCode","thumb-down"],["Faltan la información o los ejemplos que necesito","missingTheInformationSamplesINeed","thumb-down"],["Problema de traducción","translationIssue","thumb-down"],["Otro","otherDown","thumb-down"]],["Última actualización: 2025-09-04 (UTC)"],[[["\u003cp\u003eThis document guides users on exporting BigQuery query results from a BigLake table to an Amazon S3 bucket.\u003c/p\u003e\n"],["\u003cp\u003eBefore exporting, users must have an established connection to their Amazon S3 bucket, an Amazon S3 BigLake table, and the necessary AWS IAM \u003ccode\u003ePutObject\u003c/code\u003e permissions.\u003c/p\u003e\n"],["\u003cp\u003eThe export process utilizes an \u003ccode\u003eEXPORT DATA\u003c/code\u003e statement within a GoogleSQL query, specifying connection details, the S3 bucket URI, the desired file format (JSON, AVRO, CSV, or PARQUET), and the query itself.\u003c/p\u003e\n"],["\u003cp\u003eWhen exporting, it is recommended to use an empty Amazon S3 bucket, as existing data can be overwritten or mixed with the new query results.\u003c/p\u003e\n"],["\u003cp\u003eFor users on the capacity-based pricing model, the BigQuery Reservation API must be enabled, and if quota failure issues arise, reserved capacity for queries needs to be checked.\u003c/p\u003e\n"]]],[],null,["# Export query results to Amazon S3\n=================================\n\nThis document describes how to export the result of a query that runs against a\n[BigLake table](/bigquery/docs/biglake-intro) to your\nAmazon Simple Storage Service (Amazon S3) bucket.\n\nFor information about how data flows between BigQuery and\nAmazon S3,\nsee [Data flow when exporting data](/bigquery/docs/omni-introduction#export-data).\n\nLimitations\n-----------\n\nFor a full list of limitations that apply to BigLake tables\nbased on Amazon S3 and Blob Storage, see [Limitations](/bigquery/docs/omni-introduction#limitations).\n\nBefore you begin\n----------------\n\nEnsure that you have the following resources:\n\n\n- A [connection to access your Amazon S3 bucket](/bigquery/docs/omni-aws-create-connection).\n- An [Amazon S3 BigLake table](/bigquery/docs/omni-aws-create-external-table).\n- The correct Amazon Web Services (AWS) Identity and Access Management (IAM) policy:\n - You must have the `PutObject` permission to write data into the Amazon S3 bucket. For more information, see [Create an AWS IAM policy for BigQuery](/bigquery/docs/omni-aws-create-connection#creating-aws-iam-policy).\n\n\u003c!-- --\u003e\n\n- If you are on the [capacity-based pricing model](/bigquery/pricing#capacity_compute_analysis_pricing), then ensure that you have enabled the [BigQuery Reservation API](https://console.cloud.google.com/apis/library/bigqueryreservation.googleapis.com) for your project. For information about pricing, see [BigQuery Omni pricing](/bigquery/pricing#bqomni).\n\nExport query results\n--------------------\n\nBigQuery Omni writes to the specified\nAmazon S3 location regardless of any existing\ncontent. The export query can overwrite existing data or mix the query result\nwith existing data. 
We recommend that you export the query result to an empty\nAmazon S3 bucket.\n\nTo run a query, select one of the following options: \n\n### SQL\n\nIn the **Query editor** field, enter a GoogleSQL export query.\nGoogleSQL is the default syntax in the Google Cloud console.\n\n\u003cbr /\u003e\n\n\n| **Note:** To override the default project, use the `--project_id=`\u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e parameter. Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with the ID of your Google Cloud project.\n\n1. In the Google Cloud console, go to the **BigQuery** page.\n\n [Go to BigQuery](https://console.cloud.google.com/bigquery)\n2. In the query editor, enter the following statement:\n\n ```googlesql\n EXPORT DATA WITH CONNECTION `\u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e.\u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e`\n OPTIONS(uri=\"s3://\u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e/\u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e\", format=\"\u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e\", ...)\n AS QUERY\n ```\n\n\n Replace the following:\n - \u003cvar translate=\"no\"\u003eCONNECTION_REGION\u003c/var\u003e: the region where the connection was created.\n - \u003cvar translate=\"no\"\u003eCONNECTION_NAME\u003c/var\u003e: the connection name that you created with the necessary permission to write to the Amazon S3 bucket.\n - \u003cvar translate=\"no\"\u003eBUCKET_NAME\u003c/var\u003e: the Amazon S3 bucket where you want to write the data.\n - \u003cvar translate=\"no\"\u003ePATH\u003c/var\u003e: the path where you want to write the exported file to. It must contain exactly one wildcard `*` anywhere in the leaf directory of the path string, for example, `../aa/*`, `../aa/b*c`, `../aa/*bc`, and `../aa/bc*`. BigQuery replaces `*` with `0000..N` depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then `*` in the first file's filename is replaced by `000000000000`, and `*` in the second file's filename is replaced by `000000000001`.\n - \u003cvar translate=\"no\"\u003eFORMAT\u003c/var\u003e: supported formats are `JSON`, `AVRO`, `CSV`, and `PARQUET`.\n - \u003cvar translate=\"no\"\u003eQUERY\u003c/var\u003e: the query to analyze the data that is stored in a BigLake table. 
The dataset that contains the BigLake table used in the query must be located in the same [Amazon S3 region](/bigquery/docs/omni-introduction#locations) as the target Amazon S3 bucket.\n\n \u003cbr /\u003e\n\n - Click play_circle **Run**.\n\n \u003cbr /\u003e\n\nFor more information about how to run queries, see [Run an interactive query](/bigquery/docs/running-queries#queries).\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryException.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryOptions.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.QueryJobConfiguration.html;\n import com.google.cloud.bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html;\n\n // Sample to export query results to Amazon S3 bucket\n public class ExportQueryResultsToS3 {\n\n public static void main(String[] args) throws InterruptedException {\n // TODO(developer): Replace these variables before running the sample.\n String projectId = \"MY_PROJECT_ID\";\n String datasetName = \"MY_DATASET_NAME\";\n String externalTableName = \"MY_EXTERNAL_TABLE_NAME\";\n // connectionName should be in the format of connection_region.connection_name. e.g.\n // aws-us-east-1.s3-write-conn\n String connectionName = \"MY_CONNECTION_REGION.MY_CONNECTION_NAME\";\n // destinationUri must contain exactly one * anywhere in the leaf directory of the path string\n // e.g. ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*\n // BigQuery replaces * with 0000..N depending on the number of files exported.\n // BigQuery determines the file count and sizes.\n String destinationUri = \"s3://your-bucket-name/*\";\n String format = \"EXPORT_FORMAT\";\n // Export result of query to find states starting with 'W'\n String query =\n String.format(\n \"EXPORT DATA WITH CONNECTION `%s` OPTIONS(uri='%s', format='%s') \"\n + \"AS SELECT * FROM %s.%s.%s WHERE name LIKE 'W%%'\",\n connectionName, destinationUri, format, projectId, datasetName, externalTableName);\n exportQueryResultsToS3(query);\n }\n\n public static void exportQueryResultsToS3(String query) throws InterruptedException {\n try {\n // Initialize client that will be used to send requests. 
This client only needs to be created\n // once, and can be reused for multiple requests.\n https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html bigquery = https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryOptions.html.getDefaultInstance().getService();\n\n https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html results = bigquery.https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQuery.html#com_google_cloud_bigquery_BigQuery_query_com_google_cloud_bigquery_QueryJobConfiguration_com_google_cloud_bigquery_BigQuery_JobOption____(https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.QueryJobConfiguration.html.of(query));\n\n results\n .https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.TableResult.html#com_google_cloud_bigquery_TableResult_iterateAll__()\n .forEach(row -\u003e row.forEach(val -\u003e System.out.printf(\"%s,\", val.toString())));\n\n System.out.println(\"Query results exported to Amazon S3 successfully.\");\n } catch (https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.BigQueryException.html e) {\n System.out.println(\"Query not performed \\n\" + e.toString());\n }\n }\n }\n\nTroubleshooting\n---------------\n\nIf you get an error related to `quota failure`, then check if you have reserved\ncapacity for your queries. For more information about slot reservations, see\n[Before you begin](#before_you_begin) in this document.\n\nWhat's next\n-----------\n\n- Learn about [BigQuery Omni](/bigquery/docs/omni-introduction).\n- Learn how to [export table data](/bigquery/docs/exporting-data).\n- Learn how to [query data stored in Amazon S3](/bigquery/docs/query-aws-data).\n- Learn how to [set up VPC Service Controls for BigQuery Omni](/bigquery/docs/omni-vpc-sc)."]]