This document describes how to generate Gemini Code Assist metrics. For example, you can generate metrics that report the daily active usage or the acceptance of code recommendations for a variety of Google Cloud products, including Cloud Logging, Google Cloud CLI, Cloud Monitoring, and BigQuery.
If you need to enable and view Gemini for Google Cloud prompt, response, and metadata logs, see
View Gemini for Google Cloud logs.
Before you begin

Ensure you have set up Gemini Code Assist in your project and have enabled Gemini for Google Cloud logging in your project. Then, in the Google Cloud console, activate Cloud Shell.

At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
List the number of unique users
The following instructions describe how to use the gcloud CLI to list
the number of unique users of Gemini Code Assist in the most recent 28-day period:
In a shell environment, ensure that you have updated all installed components of the gcloud CLI to the latest version:

gcloud components update
Read the log entries for Gemini Code Assist users and usage:

gcloud logging read 'resource.type=cloudaicompanion.googleapis.com/Instance labels.product=~"code_assist"' \
  --freshness 28d \
  --project PROJECT_ID \
  --format "csv(timestamp.date('%Y-%m-%d'),labels.user_id)"

Replace PROJECT_ID with your Google Cloud project ID.

You can use the Unix command uniq to uniquely identify users on a per-day basis. The output is similar to the following:

2024-10-30,user1@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user1@company.com
2024-10-28,user1@company.com
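The per-day CSV that gcloud logging read emits can be reduced to distinct-user counts with standard Unix tools. A minimal sketch, where the contents of usage.csv are hypothetical sample output:

```shell
# Hypothetical sample of the date,user CSV produced by the
# gcloud logging read command, saved to a local file.
cat > usage.csv <<'EOF'
2024-10-30,user1@company.com
2024-10-29,user2@company.com
2024-10-29,user2@company.com
2024-10-29,user1@company.com
2024-10-28,user1@company.com
EOF

# De-duplicate (date,user) pairs, then count distinct users per day.
sort -u usage.csv | cut -d, -f1 | uniq -c
```

The sort -u step keeps one row per (date, user) pair, so the uniq -c count is the number of distinct users on each date.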
Create a chart that displays daily usage
The following steps show how to use Monitoring to create daily use graphs that show the aggregate total of daily active Gemini Code Assist users and the number of their requests per day.

Create a Monitoring metric from your log data that records the number of Gemini Code Assist users:
In the Google Cloud console, go to the Logs Explorer page.

If you use the search bar to find this page, then select the result whose subheading is
Logging.

In the query pane, enter the following query, and then click
Run query:
resource.type="cloudaicompanion.googleapis.com/Instance" AND labels.product="code_assist" AND jsonPayload.@type="type.googleapis.com/google.cloud.cloudaicompanion.logging.v1.ResponseLog"
In the toolbar, click Actions, and then select Create metric.

The Create log-based metric dialog appears.

Configure the following metric details:

Ensure the Metric Type is set to Counter.
Name the metric code_assist_example.
Ensure Filter selection is set to point to the location where your logs are being stored, either Project or Bucket.

For information about generating Monitoring metrics from your log data, see Log-based metrics overview.

Click Create metric.

A success banner is displayed, explaining that the metric was created.

In that success banner, click View in Metrics explorer.

Metrics Explorer opens and displays a preconfigured chart.
Save the chart to a dashboard:

In the toolbar, click Save chart.
Optional: Update the chart title.
Use the Dashboard menu either to select an existing custom dashboard or to create a new dashboard.
Click Save chart.
Analyze usage by using BigQuery

The following steps show how to use BigQuery to analyze your log data.

There are two approaches that you can use to analyze your log data in BigQuery:

Create a log sink and export your log data to a BigQuery dataset.
Upgrade the log bucket that stores your log data to use Log Analytics, and then create a linked BigQuery dataset.

With both approaches, you can use SQL to query and analyze your log data, and you can chart the results of those queries. If you use Log Analytics, then you can save your charts to a custom dashboard. However, there are differences in pricing. For details, see
Log Analytics pricing and
BigQuery pricing.

This section describes how to create a log sink to export select log entries to BigQuery, and it provides a list of sample queries. If you want to know more about Log Analytics, see Query and analyze logs with Log Analytics and Query a linked BigQuery dataset.
Create a log sink

In the Google Cloud console, go to the Log Router page.

If you use the search bar to find this page, then select the result whose subheading is
Logging.
Select the Google Cloud project in which the log entries that you want to route originate.

Select Create sink.

In the Sink details panel, enter the following details:

For Sink name, provide an identifier for the sink. After you create the sink, you can't rename the sink, but you can delete it and create a new sink.
For Sink description, describe the purpose or use case for the sink.
In the Sink destination panel, configure the following details:

For Select sink service, select BigQuery dataset.
For Select BigQuery dataset, create a new BigQuery dataset and name it code_assist_bq.
Open the Choose logs to include in sink panel, and in the
Build inclusion filter field, enter the following:

resource.type="cloudaicompanion.googleapis.com/Instance" AND labels.product="code_assist"
Optional: To verify that you entered the correct filter, select
Preview logs. The Logs Explorer opens in a new tab with the filter pre-populated.

Click Create sink.
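The console steps above can also be performed from the command line. A hypothetical sketch using the gcloud CLI, assuming the sink name code-assist-sink and that the code_assist_bq dataset already exists (PROJECT_ID is a placeholder for your Google Cloud project ID):

```shell
# Create a log sink that routes Gemini Code Assist log entries
# to the code_assist_bq BigQuery dataset. The sink name is a
# hypothetical example; choose any identifier you like.
gcloud logging sinks create code-assist-sink \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/code_assist_bq \
  --log-filter='resource.type="cloudaicompanion.googleapis.com/Instance" AND labels.product="code_assist"'
```

This is equivalent to the Sink details, Sink destination, and inclusion-filter steps above; run it in Cloud Shell or any environment where the gcloud CLI is authenticated.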
Authorize the log sink to write log entries to the dataset

When you have Owner access to the BigQuery dataset, Cloud Logging grants the log sink the necessary permissions to write log data.

If you don't have Owner access, or if you don't see any entries in your dataset, then the log sink might not have the required permissions. To resolve this failure, follow the instructions in
Set destination permissions.
Queries

You can use the following sample BigQuery queries to generate user- and aggregate-level data for daily active use and suggestions generated.

Before using the following sample queries, you must obtain the fully qualified path for the newly created sink. To obtain the path, do the following:

In the Google Cloud console, go to the BigQuery page.
In the resources list, locate the dataset named code_assist_bq. This data is the sink destination.

Select the responses table from beneath the code_assist_bq_dataset, click the more_vert icon, and then click
Copy ID to generate the dataset ID. Make note of it so that you can use it in the following sections as the GENERATED_BIGQUERY_TABLE variable.
List individual users by day

SELECT DISTINCT labels.user_id as user, DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
ORDER BY use_date

Replace GENERATED_BIGQUERY_TABLE with the fully qualified path of the BigQuery response table that you noted in the previous steps for creating a sink.
List aggregate users by day

SELECT COUNT(DISTINCT labels.user_id) as total_users, DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date
ORDER BY use_date

List individual requests per day by user

SELECT COUNT(*), DATE(timestamp) as use_date, labels.user_id as user
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date, user
ORDER BY use_date

List aggregate requests per day by date

SELECT COUNT(*), DATE(timestamp) as use_date
FROM GENERATED_BIGQUERY_TABLE
GROUP BY use_date
ORDER BY use_date

What's next

Learn more about Gemini for Google Cloud logging.
Learn more about Gemini for Google Cloud monitoring.

Last updated 2025-09-04 UTC.
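To see what these aggregations compute before running them against your table, here is a minimal local sketch in Python. The sample rows are hypothetical and mirror the DATE(timestamp) and labels.user_id columns that the queries above operate on:

```python
from collections import Counter, defaultdict

# Hypothetical sample rows: (use_date, user_id), mirroring the
# DATE(timestamp) and labels.user_id columns in the log table.
rows = [
    ("2024-10-30", "user1@company.com"),
    ("2024-10-29", "user2@company.com"),
    ("2024-10-29", "user2@company.com"),
    ("2024-10-29", "user1@company.com"),
    ("2024-10-28", "user1@company.com"),
]

# "Aggregate users by day": COUNT(DISTINCT labels.user_id) GROUP BY use_date
users_by_day = defaultdict(set)
# "Aggregate requests per day": COUNT(*) GROUP BY use_date
requests_by_day = Counter()
for use_date, user in rows:
    users_by_day[use_date].add(user)
    requests_by_day[use_date] += 1

for use_date in sorted(users_by_day):
    print(use_date, len(users_by_day[use_date]), requests_by_day[use_date])
# 2024-10-28 1 1
# 2024-10-29 2 3
# 2024-10-30 1 1
```

Distinct-user counts and raw request counts differ whenever a user issues multiple requests in a day, which is why the queries above use COUNT(DISTINCT ...) for users but COUNT(*) for requests.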