# General best practices when using Datastream

This page describes general best practices to follow when using Datastream.
Change a stream's source database
---------------------------------
In some cases, you may have to change the source database of a stream. For example, you may have to modify the stream to replicate from a replica instead of from the primary database instance.

1. [Create a connection profile](/datastream/docs/create-connection-profiles) for the replica instance.
2. [Create a stream](/datastream/docs/create-a-stream), using the connection profile for the replica that you created and the existing connection profile for the destination.
3. [Start the stream](/datastream/docs/run-a-stream#startastream) with historical backfill disabled. When the stream is started, it brings in only the data from the binary logs.
4. Optional: After the stream is running, [modify it](/datastream/docs/modify-a-stream) to enable automatic backfill.
5. [Pause the stream](/datastream/docs/run-a-stream#pauseastream) that's reading from the primary instance.
6. Optional: [Delete the stream](/datastream/docs/delete-a-stream) that was streaming data from the primary instance.
7. Optional: [Delete the connection profile](/datastream/docs/delete-a-connection-profile) for the primary instance.
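The key configuration in this switchover is creating the new stream with historical backfill disabled. As a minimal sketch, the REST-shaped stream definition might look like the following. The field names (`sourceConfig`, `destinationConfig`, `backfillNone`) and resource paths are assumptions modeled on the Datastream v1 `Stream` resource; verify them against the API reference before use.

```python
# Sketch: build a REST-shaped Stream body for the replica switchover.
# Field names are assumptions based on the Datastream v1 Stream
# resource; treat this as illustrative, not a verified schema.

def replica_stream_body(replica_profile: str, destination_profile: str) -> dict:
    """Define a stream that reads from the replica's connection
    profile with historical backfill disabled."""
    return {
        "displayName": "replica-stream",
        "sourceConfig": {
            # Connection profile created for the replica (step 1).
            "sourceConnectionProfile": replica_profile,
        },
        "destinationConfig": {
            # Reuse the existing destination connection profile.
            "destinationConnectionProfile": destination_profile,
        },
        # Disable historical backfill so the new stream carries only
        # data from the binary logs once started (step 3).
        "backfillNone": {},
    }

# Hypothetical resource names for illustration only.
body = replica_stream_body(
    "projects/p/locations/us-central1/connectionProfiles/replica-cp",
    "projects/p/locations/us-central1/connectionProfiles/bq-dest",
)
```

You would then pass this body to a stream-creation request and later enable backfill by updating the stream (step 4).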
Alert and monitor in Datastream
-------------------------------

The Datastream dashboard contains a great deal of information that can be helpful for debugging purposes. Additional information is available in the logs in Cloud Logging.

**Tip:** Use [Google Cloud Monitoring](/monitoring) to create a custom dashboard to suit your business needs.
### Datastream alerts
No default alerts are set up for Datastream. You can create an alerting policy for the **Data freshness** metric by clicking the **Create alerting policy** link in the **Overview** tab. For the remaining metrics, follow these steps:
1. In the Google Cloud console, go to the **Alerting** page:

   [Go to Alerting](https://console.cloud.google.com/monitoring/alerting)
2. Click **Create policy**.
3. Click the **Select a metric** drop-down.
4. In the filter field, enter `Datastream`.
5. Optional: You might need to disable the **Active** filter to view all available metrics.
6. Under **Datastream Stream**, search for the metric that you want to monitor.
7. Click **Apply**.
8. Optional: Enter the required details in the **Add filters** and **Transform data** sections. Click **Next**.
9. Enter the required information in the **Configure alert trigger** section. Click **Next**.
10. Configure your notifications in the **Configure notifications and finalize alert** section.
11. Review your alert and click **Create policy** when ready.

For detailed information about how to complete each of these steps, see [Create alerting policy](/monitoring/alerts/using-alerting-ui#create-policy).
We recommend creating alerts for the following Datastream [metrics](/datastream/docs/monitor-a-stream#monitorstreams):
- Data freshness
- Stream unsupported event count
- Stream total latencies
An alert on any of these metrics can indicate a problem with either the stream or the source database.
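Alerting policies can also be defined programmatically instead of through the console. The following is a minimal sketch of a REST-shaped Monitoring alert policy for the data-freshness metric. The metric type string and resource type are assumptions (check the exact identifiers in Metrics Explorer for your project); the policy field names follow the Monitoring API's `AlertPolicy` resource.

```python
# Sketch: a REST-shaped Cloud Monitoring alert policy for Datastream
# data freshness. The metric.type and resource.type strings below are
# assumed identifiers -- verify them in Metrics Explorer before use.

def freshness_alert_policy(threshold_seconds: float) -> dict:
    """Alert when data freshness stays above threshold_seconds
    for five minutes."""
    return {
        "displayName": "Datastream data freshness",
        "combiner": "OR",
        "conditions": [{
            "displayName": "Data freshness above threshold",
            "conditionThreshold": {
                # Assumed Datastream metric identifier.
                "filter": (
                    'metric.type = "datastream.googleapis.com/stream/freshness" '
                    'AND resource.type = "datastream.googleapis.com/Stream"'
                ),
                "comparison": "COMPARISON_GT",
                "thresholdValue": threshold_seconds,
                "duration": "300s",
            },
        }],
    }

policy = freshness_alert_policy(900.0)
```

The resulting dict would be posted to the Monitoring API's `alertPolicies.create` endpoint (or passed to the client library) together with a notification channel.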
How many tables can a single stream handle?
-------------------------------------------
We recommend that a single stream includes no more than 10,000 tables. There's no limit on the size of the tables. If you create a stream with more tables, the stream might enter an error state. To avoid this, consider splitting the source into multiple streams.

**Note:** Keep in mind the impact on the source database. Each stream has its own limit on the number of connections and simultaneous tasks, so combining multiple streams could overwhelm the database.
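When splitting a large source, one simple approach is to partition the table list into groups that stay under the 10,000-table recommendation and assign each group to its own stream. A minimal sketch (stream creation itself is out of scope here):

```python
# Sketch: partition a table list into groups of at most 10,000 tables,
# one group per stream, per the recommendation above.

def split_tables(tables: list[str], max_per_stream: int = 10_000) -> list[list[str]]:
    """Split tables into consecutive chunks of at most max_per_stream."""
    if max_per_stream <= 0:
        raise ValueError("max_per_stream must be positive")
    return [
        tables[i : i + max_per_stream]
        for i in range(0, len(tables), max_per_stream)
    ]

# Example: 25,000 tables need three streams (10,000 + 10,000 + 5,000).
groups = split_tables([f"schema.table_{i}" for i in range(25_000)])
```

Remember the note above: every additional stream consumes its own connections and tasks on the source database, so fewer, well-sized groups are preferable to many small ones.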
Last updated 2025-09-04 UTC.