BigQuery Data Transfer Client Libraries

This page shows how to get started with the Cloud Client Libraries for the BigQuery Data Transfer API. Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Installing the client library

C#

For more information, see Setting Up a C# Development Environment.

Install-Package Google.Cloud.BigQuery.DataTransfer.V1 -Pre

Go

For more information, see Setting Up a Go Development Environment.

go get -u cloud.google.com/go/bigquery/datatransfer/apiv1

Java

For more information, see Setting Up a Java Development Environment.

If you are using Maven, add the following to your pom.xml file. For more information about BOMs, see The Google Cloud Platform Libraries BOM.

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>20.1.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-bigquerydatatransfer</artifactId>
  </dependency>
</dependencies>

If you are using Gradle, add the following to your dependencies:

implementation platform('com.google.cloud:libraries-bom:20.1.0')

implementation 'com.google.cloud:google-cloud-bigquerydatatransfer'

If you are using sbt, add the following to your dependencies:

libraryDependencies += "com.google.cloud" % "google-cloud-bigquerydatatransfer" % "1.2.8"

If you are using IntelliJ or Eclipse, you can add client libraries to your project using IDE plugins.

The plugins provide additional functionality, such as key management for service accounts. Refer to each plugin's documentation for details.

Node.js

For more information, see Setting Up a Node.js Development Environment.

npm install --save @google-cloud/bigquery-data-transfer

PHP

For more information, see Using PHP on Google Cloud.

composer require google/cloud-bigquerydatatransfer

Python

For more information, see Setting Up a Python Development Environment.

pip install --upgrade google-cloud-bigquery-datatransfer

Ruby

For more information, see Setting Up a Ruby Development Environment.

gem install google-cloud-bigquery-data_transfer

Setting up authentication

To run the client library, you must first set up authentication by creating a service account and setting an environment variable. Complete the following steps to set up authentication. For other ways to authenticate, see the GCP authentication documentation.

Cloud Console

Create a service account:

  1. In the Cloud Console, go to the Create service account page.

    Go to Create service account
  2. Select a project.
  3. In the Service account name field, enter a name. The Cloud Console fills in the Service account ID field based on this name.

    In the Service account description field, enter a description. For example, Service account for quickstart.

  4. Click Create.
  5. Click the Select a role field.

    Under Quick access, click Basic, and then click Owner.

  6. Click Continue.
  7. Click Done to finish creating the service account.

    Do not close your browser window. You will use it in the next task.

To create a service account key:

  1. In the Cloud Console, click the email address for the service account that you created.
  2. Click Keys.
  3. Click Add key, and then click Create new key.
  4. Click Create. A JSON key file is downloaded to your computer.
  5. Click Close.

Command line

You can run the following commands using the Cloud SDK on your local machine, or in Cloud Shell.

  1. Create the service account. Replace NAME with a name for the service account.

    gcloud iam service-accounts create NAME
  2. Grant permissions to the service account. Replace PROJECT_ID with your project ID.

    gcloud projects add-iam-policy-binding PROJECT_ID --member="serviceAccount:NAME@PROJECT_ID.iam.gserviceaccount.com" --role="roles/owner"
  3. Generate the key file. Replace FILE_NAME with a name for the key file.

    gcloud iam service-accounts keys create FILE_NAME.json --iam-account=NAME@PROJECT_ID.iam.gserviceaccount.com

Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to provide authentication credentials to your application code. This variable applies only to your current shell session, so if you open a new session, set the variable again.

Linux or macOS

export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"

Replace KEY_PATH with the path of the JSON file that contains your service account key.

For example:

export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json"

Windows

For PowerShell:

$env:GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"

Replace KEY_PATH with the path of the JSON file that contains your service account key.

For example:

$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\service-account-file.json"

For command prompt:

set GOOGLE_APPLICATION_CREDENTIALS=KEY_PATH

Replace KEY_PATH with the path of the JSON file that contains your service account key.
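
To confirm that the variable is being picked up, you can let the client library resolve Application Default Credentials before making any API calls. The following is a minimal sketch in Python; it assumes the google-auth package, which the Cloud Client Libraries install as a dependency, and it only prints which key file and project were resolved.

import os

import google.auth

# google.auth.default() performs the same credential lookup that the client
# libraries use, reading GOOGLE_APPLICATION_CREDENTIALS when it is set.
credentials, project = google.auth.default()

print("Key file:", os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"))
print("Resolved project:", project)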

Using the client library

The following example shows how to use the client library.

C#

Before trying this sample, follow the C# setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer C# API reference documentation.


using System;
using Google.Api.Gax;
using Google.Cloud.BigQuery.DataTransfer.V1;

namespace GoogleCloudSamples
{
    public class QuickStart
    {
        public static void Main(string[] args)
        {
            // Instantiates a client
            DataTransferServiceClient client = DataTransferServiceClient.Create();

            // Your Google Cloud Platform project ID
            string projectId = "YOUR-PROJECT-ID";

            ProjectName project = new ProjectName(projectId);
            var sources = client.ListDataSources(ParentNameOneof.From(project));
            Console.WriteLine("Supported Data Sources:");
            foreach (DataSource source in sources)
            {
                Console.WriteLine(
                    $"{source.DataSourceId}: " +
                    $"{source.DisplayName} ({source.Description})");
            }
        }
    }
}

Go

Before trying this sample, follow the Go setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer Go API reference documentation.


// Sample bigquery-quickstart creates a Google BigQuery dataset.
package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/api/iterator"

	// Imports the BigQuery Data Transfer client package.
	datatransfer "cloud.google.com/go/bigquery/datatransfer/apiv1"
	datatransferpb "google.golang.org/genproto/googleapis/cloud/bigquery/datatransfer/v1"
)

func main() {
	ctx := context.Background()

	// Sets your Google Cloud Platform project ID.
	projectID := "YOUR_PROJECT_ID"

	// Creates a client.
	client, err := datatransfer.NewClient(ctx)
	if err != nil {
		log.Fatalf("Failed to create client: %v", err)
	}
	defer client.Close()

	req := &datatransferpb.ListDataSourcesRequest{
		Parent: fmt.Sprintf("projects/%s", projectID),
	}
	it := client.ListDataSources(ctx, req)
	fmt.Println("Supported Data Sources:")
	for {
		ds, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("Failed to list sources: %v", err)
		}
		fmt.Println(ds.DisplayName)
		fmt.Println("\tID: ", ds.DataSourceId)
		fmt.Println("\tFull path: ", ds.Name)
		fmt.Println("\tDescription: ", ds.Description)
	}
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer Java API reference documentation.

// Imports the Google Cloud client library

import com.google.cloud.bigquery.datatransfer.v1.DataSource;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient.ListDataSourcesPagedResponse;
import com.google.cloud.bigquery.datatransfer.v1.ListDataSourcesRequest;

public class QuickstartSample {
  /**
   * List available data sources for the BigQuery Data Transfer service.
   */
  public static void main(String... args) throws Exception {
    // Sets your Google Cloud Platform project ID.
    // String projectId = "YOUR_PROJECT_ID";
    String projectId = args[0];

    // Instantiate a client. If you don't specify credentials when constructing a client, the
    // client library will look for credentials in the environment, such as the
    // GOOGLE_APPLICATION_CREDENTIALS environment variable.
    try (DataTransferServiceClient client = DataTransferServiceClient.create()) {
      // Request the list of available data sources.
      String parent = String.format("projects/%s", projectId);
      ListDataSourcesRequest request =
          ListDataSourcesRequest.newBuilder()
              .setParent(parent)
              .build();
      ListDataSourcesPagedResponse response = client.listDataSources(request);

      // Print the results.
      System.out.println("Supported Data Sources:");
      for (DataSource dataSource : response.iterateAll()) {
        System.out.println(dataSource.getDisplayName());
        System.out.printf("\tID: %s%n", dataSource.getDataSourceId());
        System.out.printf("\tFull path: %s%n", dataSource.getName());
        System.out.printf("\tDescription: %s%n", dataSource.getDescription());
      }
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer Node.js API reference documentation.

const bigqueryDataTransfer = require('@google-cloud/bigquery-data-transfer');
const client = new bigqueryDataTransfer.v1.DataTransferServiceClient();

async function quickstart() {
  const projectId = await client.getProjectId();

  // Iterate over all elements.
  const formattedParent = client.projectPath(projectId);
  let nextRequest = {parent: formattedParent};
  const options = {autoPaginate: false};
  console.log('Data sources:');
  do {
    // Fetch the next page.
    const responses = await client.listDataSources(nextRequest, options);
    // The actual resources in a response.
    const resources = responses[0];
    // The next request if the response shows that there are more responses.
    nextRequest = responses[1];
    // The actual response object, if necessary.
    // const rawResponse = responses[2];
    resources.forEach(resource => {
      console.log(`  ${resource.name}`);
    });
  } while (nextRequest);

  console.log('\n\n');
  console.log('Sources via stream:');

  client
    .listDataSourcesStream({parent: formattedParent})
    .on('data', element => {
      console.log(`  ${element.name}`);
    });
}
quickstart();

PHP

Before trying this sample, follow the PHP setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer PHP API reference documentation.

# Includes the autoloader for libraries installed with composer
require __DIR__ . '/vendor/autoload.php';

# Imports the Google Cloud client library
use Google\Cloud\BigQuery\DataTransfer\V1\DataTransferServiceClient;

# Instantiates a client
$bqdtsClient = new DataTransferServiceClient();

# Your Google Cloud Platform project ID
$projectId = 'YOUR_PROJECT_ID';
$parent = sprintf('projects/%s/locations/us', $projectId);

try {
    echo 'Supported Data Sources:', PHP_EOL;
    $pagedResponse = $bqdtsClient->listDataSources($parent);
    foreach ($pagedResponse->iterateAllElements() as $dataSource) {
        echo 'Data source: ', $dataSource->getDisplayName(), PHP_EOL;
        echo 'ID: ', $dataSource->getDataSourceId(), PHP_EOL;
        echo 'Full path: ', $dataSource->getName(), PHP_EOL;
        echo 'Description: ', $dataSource->getDescription(), PHP_EOL;
    }
} finally {
    $bqdtsClient->close();
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer Python API reference documentation.

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# TODO: Update to your project ID.
project_id = "my-project"

# Get the full path to your project.
parent = client.common_project_path(project_id)

print("Supported Data Sources:")

# Iterate over all possible data sources.
for data_source in client.list_data_sources(parent=parent):
    print("{}:".format(data_source.display_name))
    print("\tID: {}".format(data_source.data_source_id))
    print("\tFull path: {}".format(data_source.name))
    print("\tDescription: {}".format(data_source.description))

Ruby

Before trying this sample, follow the Ruby setup instructions in the BigQuery Data Transfer Quickstart Using Client Libraries. For more information, see the BigQuery Data Transfer Ruby API reference documentation.

# Imports the Google Cloud client library
require "google/cloud/bigquery/data_transfer"

# Your Google Cloud Platform project ID
project_id = "YOUR_PROJECT_ID"

# Instantiate a client
data_transfer = Google::Cloud::Bigquery::DataTransfer.data_transfer_service

# Get the full path to your project.
project_path = data_transfer.project_path project: project_id

puts "Supported Data Sources:"

# Iterate over all possible data sources.
data_transfer.list_data_sources(parent: project_path).each do |data_source|
  puts "Data source: #{data_source.display_name}"
  puts "ID: #{data_source.data_source_id}"
  puts "Full path: #{data_source.name}"
  puts "Description: #{data_source.description}"
end

Additional resources