[[["わかりやすい","easyToUnderstand","thumb-up"],["問題の解決に役立った","solvedMyProblem","thumb-up"],["その他","otherUp","thumb-up"]],[["わかりにくい","hardToUnderstand","thumb-down"],["情報またはサンプルコードが不正確","incorrectInformationOrSampleCode","thumb-down"],["必要な情報 / サンプルがない","missingTheInformationSamplesINeed","thumb-down"],["翻訳に関する問題","translationIssue","thumb-down"],["その他","otherDown","thumb-down"]],["最終更新日 2025-09-04 UTC。"],[],[],null,["# Query Sensitive Data Protection findings in BigQuery\n\nThis page provides example queries that you can use to analyze\nSensitive Data Protection findings that were exported to\nBigQuery.\n\nYou can [configure an inspection job or job\ntrigger](/sensitive-data-protection/docs/creating-job-triggers) to save the findings to\nBigQuery. Doing so lets you query the findings for further\nanalysis. When your findings are exported to BigQuery, the data\nis written to either a new or existing table.\n\nFor more information about all the actions that Sensitive Data Protection can\nperform after inspection, see the [Actions](/sensitive-data-protection/docs/concepts-actions) conceptual topic.\n\nFor more information about running queries, see the following:\n\n- [Run interactive and batch query jobs](/bigquery/docs/running-queries)\n- [Using the `bq` command-line tool](/bigquery/docs/bq-command-line-tool)\n\nColumns of the BigQuery table\n-----------------------------\n\nThe columns of the table of exported findings are based on the attributes of the\n[`Finding`](/sensitive-data-protection/docs/reference/rpc/google.privacy.dlp.v2#finding) object.\n\nLink results back to the rows containing the findings\n-----------------------------------------------------\n\nIf you are [configuring inspection of a BigQuery\ntable](/sensitive-data-protection/docs/inspecting-storage#inspect_a_table), you can set up the job or\njob trigger such that the exported findings will contain the row's identifiers.\nDoing so lets you link the inspection findings back to the rows that contain\nthem.\n\nIn the inspection job or job trigger, set the following fields to the names\nof the columns that uniquely identify each row in the table---that is,\nthe columns that serve the purpose of a primary key:\n\n- If you're using the Google Cloud console, set the **Identifying fields (comma\n separated)** field.\n- If you're using the DLP API, set the [`identifyingFields`](/sensitive-data-protection/docs/reference/rest/v2/InspectJobConfig#BigQueryOptions.FIELDS.identifying_fields) property.\n\nWhen the inspection is done and the findings are exported to\nBigQuery, each finding will contain the corresponding values of\nthe columns that you specified. Those values will be in the\n`location.content_locations.record_location.record_key.id_values` field. You can\nthen use those values to link the finding back to the specific row in the\ninspected BigQuery table.\n| **Note:** The [`rowIndex`](/sensitive-data-protection/docs/reference/rest/v2/InspectResult#tablelocation) property is not intended for use in BigQuery inspection jobs.\n\nSample queries\n--------------\n\nYou can use the following sample queries to analyze your findings. You can\nalso use the queries in a visualization tool such as\n[Looker Studio](https://www.google.com/analytics/data-studio/). 
Sample queries
--------------

You can use the following sample queries to analyze your findings. You can also use the queries in a visualization tool such as [Looker Studio](https://www.google.com/analytics/data-studio/). These queries are provided to help you get started querying your findings data.

In each of the following queries, replace the following:

- `PROJECT_ID`: the project identifier
- `DATASET`: the BigQuery dataset name
- `TABLE_ID`: the table ID

### Select the count of each infoType

### Google Cloud console

```sql
SELECT info_type.name,
COUNT(info_type.name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`
GROUP BY info_type.name
ORDER BY count DESC;
```

### Command-line

```bash
bq query --use_legacy_sql=false 'SELECT info_type.name,
COUNT(info_type.name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`
GROUP BY info_type.name ORDER BY count DESC;'
```

### Select the count of each infoType by day

### Google Cloud console

```sql
SELECT info_type.name, CAST(TIMESTAMP_SECONDS(create_time.seconds) AS DATE) AS day,
COUNT(locations.container_name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`,
UNNEST(location.content_locations) AS locations
GROUP BY info_type.name, day
ORDER BY count DESC;
```

### Command-line

```bash
bq query --use_legacy_sql=false 'SELECT info_type.name,
CAST(TIMESTAMP_SECONDS(create_time.seconds) AS DATE) AS day,
COUNT(locations.container_name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`,
UNNEST(location.content_locations) AS locations
GROUP BY info_type.name, day ORDER BY count DESC;'
```

### Select the count of each infoType in each container

### Google Cloud console

```sql
SELECT info_type.name, locations.container_name,
COUNT(locations.container_name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`,
UNNEST(location.content_locations) AS locations
GROUP BY locations.container_name, info_type.name
ORDER BY count DESC;
```

### Command-line

```bash
bq query --use_legacy_sql=false 'SELECT info_type.name, locations.container_name,
COUNT(locations.container_name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`,
UNNEST(location.content_locations) AS locations
GROUP BY locations.container_name, info_type.name ORDER BY count DESC;'
```
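The preceding queries count findings at every likelihood level. To focus on higher-confidence results, you can add a likelihood filter to any of them. The following variation of the per-container query is a sketch that keeps only findings rated `POSSIBLE`, `LIKELY`, or `VERY_LIKELY`, the same filter used in the per-column query later on this page:

```sql
SELECT info_type.name, locations.container_name,
COUNT(locations.container_name) AS count
FROM `PROJECT_ID.DATASET.TABLE_ID`,
UNNEST(location.content_locations) AS locations
-- Keep only findings with at least POSSIBLE likelihood; remove or adjust
-- this filter to include lower-confidence findings.
WHERE likelihood IN ('POSSIBLE', 'LIKELY', 'VERY_LIKELY')
GROUP BY locations.container_name, info_type.name
ORDER BY count DESC;
```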
### Select the finding types found for each column of a table

This query groups all the findings by column name and is intended to work on findings from a BigQuery inspection job. It is useful when you are trying to identify the likely types for a given column. You can adjust the query by modifying the WHERE and HAVING clauses. For example, if results from multiple tables are included in your findings table, you can limit the results to one job run or one table name, as shown in the sketch after the query.

### Google Cloud console

```sql
SELECT
  table_counts.field_name,
  STRING_AGG( CONCAT(" ",table_counts.name," [count: ",CAST(table_counts.count_total AS STRING),"]")
  ORDER BY
    table_counts.count_total DESC) AS infoTypes
FROM (
  SELECT
    locations.record_location.field_id.name AS field_name,
    info_type.name,
    COUNT(*) AS count_total
  FROM
    `PROJECT_ID.DATASET.TABLE_ID`,
    UNNEST(location.content_locations) AS locations
  WHERE
    (likelihood = 'LIKELY'
      OR likelihood = 'VERY_LIKELY'
      OR likelihood = 'POSSIBLE')
  GROUP BY
    locations.record_location.field_id.name,
    info_type.name
  HAVING
    count_total > 200 ) AS table_counts
GROUP BY
  table_counts.field_name
ORDER BY
  table_counts.field_name;
```

For a sample table, this query produces one row per column, where the infoTypes value lists how many instances of each infoType were found in that column.
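For example, to restrict the results to a single job run and a single source table, you could add conditions like the following to the inner query's WHERE clause. This is a sketch, not a definitive recipe: it assumes the exported findings include the `job_name` attribute of the [`Finding`](/sensitive-data-protection/docs/reference/rpc/google.privacy.dlp.v2#finding) object, and the filter values shown are placeholders that you would replace with values copied from your own findings table.

```sql
SELECT
  locations.record_location.field_id.name AS field_name,
  info_type.name,
  COUNT(*) AS count_total
FROM
  `PROJECT_ID.DATASET.TABLE_ID`,
  UNNEST(location.content_locations) AS locations
WHERE
  likelihood IN ('POSSIBLE', 'LIKELY', 'VERY_LIKELY')
  -- Placeholder: the resource name of the inspection job whose findings you want.
  AND job_name = 'projects/PROJECT_ID/dlpJobs/JOB_ID'
  -- Placeholder: copy the container_name value from a sample finding row
  -- so that the format matches what was exported for your source table.
  AND locations.container_name = 'SOURCE_TABLE_CONTAINER_NAME'
GROUP BY
  field_name,
  info_type.name
ORDER BY
  count_total DESC;
```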