Query a Bigtable change log in BigQuery

Last updated: 2025-09-04 (UTC)

This page provides guidance and example queries to help you process a Bigtable change log in BigQuery. It shows how to:

- Filter change log data to specific column families and columns with `WHERE` clauses.
- Convert change log values, which are stored as strings or byte strings, to their intended data types with BigQuery conversion functions.
- Pivot Bigtable data in BigQuery so that queries can span multiple columns, for both fixed and dynamic column sets.
- Parse data stored as JSON in the change log with BigQuery's JSON functions.

This page is intended for users who have completed the following:

- [Set up a Bigtable change stream-enabled table](/bigtable/docs/change-streams-quickstart).
- Run the [Dataflow template](/dataflow/docs/guides/templates/provided/cloud-bigtable-change-streams-to-bigquery) that writes a change log to BigQuery. You can follow the quickstart to learn how to set this up.

This guide assumes some knowledge of BigQuery.
To learn more, you can follow the [quickstart that shows how to load and query data](/bigquery/docs/quickstarts/load-data-console).

Open the change log table

1. In the Google Cloud console, go to the **BigQuery** page.

   [Go to BigQuery](https://console.cloud.google.com/bigquery)

2. In the **Explorer** pane, expand your project.

3. Expand your dataset.

4. Click the table with the suffix `_changelog`.

Table format

The entire [output schema](/dataflow/docs/guides/templates/provided/cloud-bigtable-change-streams-to-bigquery#bigquery-output-table-schema) contains several columns. This guide focuses on connecting the rows to their columns and values, and on parsing the values into analyzable formats.

Basic queries

The examples in this section use a Bigtable table that tracks credit card sales. The table has one column family (`cf`) and the following columns:

- Row key with the format `credit card number`#`transaction timestamp`
- Merchant
- Amount
- Category
- Transaction date

Query one column

Filter the results to a single column family and column using a `WHERE` clause.

    SELECT row_key, column_family, column, value, timestamp
    FROM your_dataset.your_table
    WHERE
      mod_type="SET_CELL"
      AND column_family="cf"
      AND column="merchant"
    LIMIT 1000

Parse values

All values are stored as strings or byte strings. You can cast a value to its intended type with [conversion functions](/bigquery/docs/reference/standard-sql/conversion_functions).

    SELECT row_key, column_family, column, value, CAST(value AS NUMERIC) AS amount
    FROM your_dataset.your_table
    WHERE
      mod_type="SET_CELL"
      AND column_family="cf"
      AND column="amount"
    LIMIT 1000

Perform aggregations

You can perform further operations, such as aggregations on numeric values.
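Because the row key encodes `credit card number`#`transaction timestamp`, you can also aggregate by a row key component. The following is a sketch rather than part of the original example set: it assumes the `#` delimiter from the schema above, the same placeholder dataset and table names, and that row keys are stored as strings (the template can also write them as bytes).

    SELECT
      SPLIT(row_key, "#")[OFFSET(0)] AS card_number,
      SUM(CAST(value AS NUMERIC)) AS total_amount
    FROM your_dataset.your_table
    WHERE
      mod_type="SET_CELL"
      AND column_family="cf"
      AND column="amount"
    GROUP BY card_number

GoogleSQL lets you group by a select-list alias; if you prefer, repeat the `SPLIT` expression in the `GROUP BY` clause instead.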
    SELECT SUM(CAST(value AS NUMERIC)) AS total_amount
    FROM your_dataset.your_table
    WHERE
      mod_type="SET_CELL"
      AND column_family="cf"
      AND column="amount"

Pivot the data

To perform queries that involve multiple Bigtable columns, you need to pivot the table. Each new BigQuery row contains one data change record returned by the change stream from the corresponding row in your Bigtable table. Depending on your schema, you can use a combination of the row key and timestamp to group the data.

    SELECT * FROM (
      SELECT row_key, timestamp, column, value
      FROM your_dataset.your_table
    )
    PIVOT (
      MAX(value)
      FOR column IN ("merchant", "amount", "category", "transaction_date")
    )

**Note:** Multiple values can appear for the same row key and column, so you need to use an [aggregate function](/bigquery/docs/reference/standard-sql/aggregate_functions) on the value.

Pivoting with a dynamic column set

If you have a dynamic set of columns, you can do some additional processing to get all the column names and insert them into the query programmatically.

    DECLARE cols STRING;
    SET cols = (
      SELECT CONCAT('("', STRING_AGG(DISTINCT column, '", "'), '")')
      FROM your_dataset.your_table
    );

    EXECUTE IMMEDIATE FORMAT("""
      SELECT * FROM (
        SELECT row_key, timestamp, column, value
        FROM your_dataset.your_table
      )
      PIVOT (
        MAX(value)
        FOR column IN %s
      )""", cols);

JSON data

If you set all values as JSON, you need to parse them and extract the values based on their keys. Once you have extracted a value from the JSON object, you can use parsing functions on it. These examples use the [credit card sales data introduced earlier](#basic-queries), but instead of writing the data to multiple columns, it is written to a single column as a JSON object.
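Change log values are plain strings, so before extracting fields with functions like `JSON_VALUE`, you may want to check that every value actually parses as JSON. This check is a sketch that is not part of the original guide; it relies on `SAFE.PARSE_JSON`, which returns `NULL` for malformed input instead of failing the query.

    SELECT row_key, value
    FROM your_dataset.your_table
    WHERE
      mod_type="SET_CELL"
      AND SAFE.PARSE_JSON(value) IS NULL

Any rows this returns hold malformed JSON; filter them out or correct them before running extraction and aggregation queries over the column.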
    SELECT
      row_key,
      JSON_VALUE(value, "$.category") AS category,
      CAST(JSON_VALUE(value, "$.amount") AS NUMERIC) AS amount
    FROM your_dataset.your_table
    LIMIT 1000

Aggregation queries with JSON

You can perform aggregation queries with JSON values.

    SELECT
      JSON_VALUE(value, "$.category") AS category,
      SUM(CAST(JSON_VALUE(value, "$.amount") AS NUMERIC)) AS total_amount
    FROM your_dataset.your_table
    GROUP BY category

What's next

- Learn more about [Bigtable change streams](/bigtable/docs/change-streams-overview).
- Refer to [the Bigtable change stream to BigQuery template reference page](/dataflow/docs/guides/templates/provided/cloud-bigtable-change-streams-to-bigquery).
- Learn how to [set up the Bigtable change stream to BigQuery template](/bigtable/docs/change-streams-to-bigquery-quickstart).