Schedule a backfill run
Initiate a data backfill to load historical data into BigQuery. For information about how much data is available for backfill, see the documentation for your data source.
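Scheduling a backfill requires the resource name of an existing transfer configuration. If you don't have that name at hand, one way to look it up is to list the transfer configurations in your project. The following is a minimal sketch using the same Python client library as the sample below; the project ID and location are placeholders you would replace with your own values.

```python
from google.cloud import bigquery_datatransfer_v1

# Hypothetical placeholders: replace with your project ID and transfer location.
project_id = "my-project"
location = "us"

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# List the transfer configurations in this project and location to find the
# resource name ("projects/.../locations/.../transferConfigs/...") that the
# backfill samples below expect.
parent = f"projects/{project_id}/locations/{location}"
for config in client.list_transfer_configs(parent=parent):
    print(config.name, config.display_name, config.data_source_id)
```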
Explore further

For detailed documentation that includes this code sample, see the following:

- [Manage transfers](/bigquery/docs/working-with-transfers)
- [Scheduling queries](/bigquery/docs/scheduling-queries)
Code sample
[[["Facile à comprendre","easyToUnderstand","thumb-up"],["J'ai pu résoudre mon problème","solvedMyProblem","thumb-up"],["Autre","otherUp","thumb-up"]],[["Difficile à comprendre","hardToUnderstand","thumb-down"],["Informations ou exemple de code incorrects","incorrectInformationOrSampleCode","thumb-down"],["Il n'y a pas l'information/les exemples dont j'ai besoin","missingTheInformationSamplesINeed","thumb-down"],["Problème de traduction","translationIssue","thumb-down"],["Autre","otherDown","thumb-down"]],[],[[["\u003cp\u003eThis page provides code samples in Java and Python to initiate a data backfill, which loads historical data into BigQuery.\u003c/p\u003e\n"],["\u003cp\u003eTo start a backfill, you must specify a \u003ccode\u003econfigId\u003c/code\u003e along with the start and end times, as demonstrated in the provided code.\u003c/p\u003e\n"],["\u003cp\u003eThe Java code sample uses the \u003ccode\u003eDataTransferServiceClient\u003c/code\u003e to send a \u003ccode\u003eScheduleTransferRunsRequest\u003c/code\u003e, whereas the Python code utilizes \u003ccode\u003eStartManualTransferRunsRequest\u003c/code\u003e to accomplish the task.\u003c/p\u003e\n"],["\u003cp\u003eAuthentication is required to access BigQuery, and the documentation details how to set up Application Default Credentials.\u003c/p\u003e\n"],["\u003cp\u003eFor more detailed information on managing transfers and scheduling queries, the documentation provides links to relevant guides.\u003c/p\u003e\n"]]],[],null,["# Schedule a backfill run\n\nInitiate a data backfill to load historical data into BigQuery. For information about how much data is available for backfill, see the documentation for your data source.\n\nExplore further\n---------------\n\n\nFor detailed documentation that includes this code sample, see the following:\n\n- [Manage transfers](/bigquery/docs/working-with-transfers)\n- [Scheduling queries](/bigquery/docs/scheduling-queries)\n\nCode sample\n-----------\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Java API\nreference documentation](/java/docs/reference/google-cloud-bigquery/latest/overview).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import com.google.api.gax.rpc.https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html;\n import com.google.cloud.bigquery.datatransfer.v1.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html;\n import com.google.protobuf.https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html;\n import java.io.IOException;\n import org.threeten.bp.Clock;\n import org.threeten.bp.Instant;\n import org.threeten.bp.temporal.ChronoUnit;\n\n // Sample 
to run schedule back fill for transfer config\n public class ScheduleBackFill {\n\n public static void main(String[] args) throws IOException {\n // TODO(developer): Replace these variables before running the sample.\n String configId = \"MY_CONFIG_ID\";\n Clock clock = Clock.systemDefaultZone();\n Instant instant = clock.instant();\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html startTime =\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html.newBuilder()\n .setSeconds(instant.minus(5, ChronoUnit.DAYS).getEpochSecond())\n .setNanos(instant.minus(5, ChronoUnit.DAYS).getNano())\n .build();\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html endTime =\n https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html.newBuilder()\n .setSeconds(instant.minus(2, ChronoUnit.DAYS).getEpochSecond())\n .setNanos(instant.minus(2, ChronoUnit.DAYS).getNano())\n .build();\n scheduleBackFill(configId, startTime, endTime);\n }\n\n public static void scheduleBackFill(String configId, https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html startTime, https://cloud.google.com/java/docs/reference/protobuf/latest/com.google.protobuf.Timestamp.html endTime)\n throws IOException {\n try (https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html client = https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.html.create()) {\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html request =\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsRequest.html.newBuilder()\n .setParent(configId)\n .setStartTime(startTime)\n .setEndTime(endTime)\n .build();\n https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html response = client.scheduleTransferRuns(request);\n System.out.println(\"Schedule backfill run successfully :\" + response.https://cloud.google.com/java/docs/reference/google-cloud-bigquerydatatransfer/latest/com.google.cloud.bigquery.datatransfer.v1.ScheduleTransferRunsResponse.html#com_google_cloud_bigquery_datatransfer_v1_ScheduleTransferRunsResponse_getRunsCount__());\n } catch (https://cloud.google.com/java/docs/reference/gax/latest/com.google.api.gax.rpc.ApiException.html ex) {\n System.out.print(\"Schedule backfill was not run.\" + ex.toString());\n }\n }\n }\n\n### Python\n\n\nBefore trying this sample, follow the Python setup instructions in the\n[BigQuery quickstart using\nclient libraries](/bigquery/docs/quickstarts/quickstart-client-libraries).\n\n\nFor more information, see the\n[BigQuery Python API\nreference documentation](/python/docs/reference/bigquery/latest).\n\n\nTo authenticate to BigQuery, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for client libraries](/bigquery/docs/authentication#client-libs).\n\n import datetime\n\n from google.cloud.bigquery_datatransfer_v1 import (\n 
https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html,\n https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.html,\n )\n\n # Create a client object\n client = DataTransferServiceClient()\n\n # Replace with your transfer configuration name\n transfer_config_name = \"projects/1234/locations/us/transferConfigs/abcd\"\n now = datetime.datetime.now(datetime.timezone.utc)\n start_time = now - datetime.timedelta(days=5)\n end_time = now - datetime.timedelta(days=2)\n\n # Some data sources, such as scheduled_query only support daily run.\n # Truncate start_time and end_time to midnight time (00:00AM UTC).\n start_time = datetime.datetime(\n start_time.year, start_time.month, start_time.day, tzinfo=datetime.timezone.utc\n )\n end_time = datetime.datetime(\n end_time.year, end_time.month, end_time.day, tzinfo=datetime.timezone.utc\n )\n\n requested_time_range = https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.html.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.StartManualTransferRunsRequest.TimeRange.html(\n start_time=start_time,\n end_time=end_time,\n )\n\n # Initialize request argument(s)\n request = StartManualTransferRunsRequest(\n parent=transfer_config_name,\n requested_time_range=requested_time_range,\n )\n\n # Make the request\n response = client.https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.DataTransferServiceClient.html#google_cloud_bigquery_datatransfer_v1_services_data_transfer_service_DataTransferServiceClient_start_manual_transfer_runs(request=request)\n\n # Handle the response\n print(\"Started manual transfer runs:\")\n for run in response.runs:\n print(f\"backfill: {run.run_time} run: {run.name}\")\n\nWhat's next\n-----------\n\n\nTo search and filter code samples for other Google Cloud products, see the\n[Google Cloud sample browser](/docs/samples?product=bigquerydatatransfer)."]]
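Beyond browsing more samples, a natural follow-up to the code above is to confirm that the backfill runs you scheduled actually complete. The following is a minimal sketch, assuming the same placeholder transfer configuration name as in the Python sample, that lists the runs under the configuration and prints their states.

```python
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Hypothetical placeholder: the same transfer configuration used for the backfill.
transfer_config_name = "projects/1234/locations/us/transferConfigs/abcd"

# List the runs under the transfer configuration; backfill runs appear here
# alongside regularly scheduled runs, with their current state.
for run in client.list_transfer_runs(parent=transfer_config_name):
    print(f"run: {run.name} scheduled for {run.run_time}, state: {run.state.name}")
```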