Schedule a backfill run
Initiate a data backfill to load historical data into BigQuery. For information about how much data is available for backfill, see the documentation for your data source.
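In short, a backfill is a manual transfer run request over a historical time range. The condensed Python sketch below makes the same start_manual_transfer_runs call as the full samples later on this page; the project, location, and transfer config IDs in the resource name are placeholders, not real values.

    # Condensed sketch: request backfill runs for the last three days.
    # The transfer config resource name is a placeholder; replace it with your own.
    import datetime

    from google.cloud import bigquery_datatransfer_v1

    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    transfer_config_name = "projects/PROJECT_ID/locations/us/transferConfigs/CONFIG_ID"
    end_time = datetime.datetime.now(datetime.timezone.utc)
    start_time = end_time - datetime.timedelta(days=3)

    # The service creates transfer runs covering the requested time range.
    response = client.start_manual_transfer_runs(
        bigquery_datatransfer_v1.StartManualTransferRunsRequest(
            parent=transfer_config_name,
            requested_time_range=bigquery_datatransfer_v1.StartManualTransferRunsRequest.TimeRange(
                start_time=start_time,
                end_time=end_time,
            ),
        )
    )
    for run in response.runs:
        print(run.name)

Some data sources only support daily runs; the full Python sample below truncates the start and end times to day boundaries for that reason.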
Explore further
For detailed documentation that includes this code sample, see the following:

- [Manage transfers](/bigquery/docs/working-with-transfers)
- [Scheduling queries](/bigquery/docs/scheduling-queries)
Code sample
[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis page provides code samples in Java and Python to initiate a data backfill, which loads historical data into BigQuery.\u003c/p\u003e\n"],["\u003cp\u003eTo start a backfill, you must specify a \u003ccode\u003econfigId\u003c/code\u003e along with the start and end times, as demonstrated in the provided code.\u003c/p\u003e\n"],["\u003cp\u003eThe Java code sample uses the \u003ccode\u003eDataTransferServiceClient\u003c/code\u003e to send a \u003ccode\u003eScheduleTransferRunsRequest\u003c/code\u003e, whereas the Python code utilizes \u003ccode\u003eStartManualTransferRunsRequest\u003c/code\u003e to accomplish the task.\u003c/p\u003e\n"],["\u003cp\u003eAuthentication is required to access BigQuery, and the documentation details how to set up Application Default Credentials.\u003c/p\u003e\n"],["\u003cp\u003eFor more detailed information on managing transfers and scheduling queries, the documentation provides links to relevant guides.\u003c/p\u003e\n"]]],[],null,["# Schedule a backfill run\n\nInitiate a data backfill to load historical data into BigQuery. For information about how much data is available for backfill, see the documentation for your data source.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Manage transfers](/bigquery/docs/working-with-transfers)\n- [Scheduling queries](/bigquery/docs/scheduling-queries)\n\nCode sample\n-----------\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.api.gax.rpc.https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html;\n import java.io.IOException;\n import org.threeten.bp.Clock;\n import org.threeten.bp.Instant;\n import org.threeten.bp.temporal.ChronoUnit;\n\n // Sample to run schedule back fill for transfer config\n public class ScheduleBackFill {\n\n public static void main(String[] args) throws IOException {\n // 
TODO(developer): Replace these variables before running the sample.\n String configId = \"MY_CONFIG_ID\";\n Clock clock = Clock.systemDefaultZone();\n Instant instant = clock.instant();\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html startTime =\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html.newBuilder()\n .setSeconds(instant.minus(5, ChronoUnit.DAYS).getEpochSecond())\n .setNanos(instant.minus(5, ChronoUnit.DAYS).getNano())\n .build();\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html endTime =\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html.newBuilder()\n .setSeconds(instant.minus(2, ChronoUnit.DAYS).getEpochSecond())\n .setNanos(instant.minus(2, ChronoUnit.DAYS).getNano())\n .build();\n scheduleBackFill(configId, startTime, endTime);\n }\n\n public static void scheduleBackFill(String configId, https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html startTime, https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html endTime)\n throws IOException {\n try (https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html client = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html.create()) {\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html request =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html.newBuilder()\n .setParent(configId)\n .setStartTime(startTime)\n .setEndTime(endTime)\n .build();\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html response = client.scheduleTransferRuns(request);\n System.out.println(\"Schedule backfill run successfully :\" + response.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html#com_google_cloud_bigquery_datatransfer_v1_ScheduleTransferRunsResponse_getRunsCount__());\n } catch (https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html ex) {\n System.out.print(\"Schedule backfill was not run.\" + ex.toString());\n }\n }\n }\n\n### Python\n\n\nBefore trying this sample, follow the Python setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Python API\nreference documentation](/python/docs/reference/bigquery/latest).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import datetime\n\n from google.cloud.bigquery_datatransfer_v1 import (\n https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html,\n 
https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.html,\n )\n\n # Create a client object\n client = DataTransferServiceClient()\n\n # Replace with your transfer configuration name\n transfer_config_name = \"projects/1234/locations/us/transferConfigs/abcd\"\n now = datetime.datetime.now(datetime.timezone.utc)\n start_time = now - datetime.timedelta(days=5)\n end_time = now - datetime.timedelta(days=2)\n\n # Some data sources, such as scheduled_query only support daily run.\n # Truncate start_time and end_time to midnight time (00:00AM UTC).\n start_time = datetime.datetime(\n start_time.year, start_time.month, start_time.day, tzinfo=datetime.timezone.utc\n )\n end_time = datetime.datetime(\n end_time.year, end_time.month, end_time.day, tzinfo=datetime.timezone.utc\n )\n\n requested_time_range = https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.html.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.TimeRange.html(\n start_time=start_time,\n end_time=end_time,\n )\n\n # Initialize request argument(s)\n request = StartManualTransferRunsRequest(\n parent=transfer_config_name,\n requested_time_range=requested_time_range,\n )\n\n # Make the request\n response = client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_start_manual_transfer_runs(request=request)\n\n # Handle the response\n print(\"Started manual transfer runs:\")\n for run in response.runs:\n print(f\"backfill: {run.run_time} run: {run.name}\")\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigquerydatatransfer)."]]
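Beyond browsing more samples, a natural follow-up to scheduling a backfill is verifying that the runs were created and checking their state. The sketch below is one way to do that with the same Python client, using its list_transfer_runs method; the transfer config resource name is a placeholder.

    # Sketch: list the runs under a transfer config and print their state.
    # The resource name is a placeholder; reuse the one you passed when starting the backfill.
    from google.cloud import bigquery_datatransfer_v1

    client = bigquery_datatransfer_v1.DataTransferServiceClient()
    transfer_config_name = "projects/PROJECT_ID/locations/us/transferConfigs/CONFIG_ID"

    for run in client.list_transfer_runs(parent=transfer_config_name):
        # run.state is a TransferState enum (for example PENDING, RUNNING, SUCCEEDED, FAILED).
        print(f"{run.name}: {run.state.name} (run_time: {run.run_time})")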