Create a data pipeline
This quickstart shows you how to do the following:
Create a Cloud Data Fusion instance.
Deploy a sample pipeline that's provided with your Cloud Data Fusion instance. The pipeline does the following:
Reads a JSON file containing NYT bestseller data from Cloud Storage.
Runs transformations on the file to parse and clean the data.
Loads the top-rated books added in the last week that cost less than $25 into BigQuery.
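The three pipeline stages above can be sketched in plain Python. This is only an illustrative sketch of the parse/clean/filter logic, not the actual pipeline code: the field names (`title`, `rank`, `price`, `added_date`) and the "top-rated means rank 10 or better" cutoff are assumptions, since the real schema comes from the NYT bestseller sample file.

```python
import json
from datetime import datetime, timedelta

def filter_books(raw_json: str, now: datetime) -> list[dict]:
    """Keep top-rated books added in the last week that cost less than $25.

    Mirrors the sample pipeline's parse/clean/filter stages; field names
    and the rank cutoff are illustrative assumptions, not the real schema.
    """
    one_week_ago = now - timedelta(days=7)
    kept = []
    for record in json.loads(raw_json):
        added = datetime.strptime(record["added_date"], "%Y-%m-%d")
        if record["rank"] <= 10 and record["price"] < 25 and added >= one_week_ago:
            kept.append(record)
    return kept

sample = json.dumps([
    {"title": "A", "rank": 1, "price": 19.99, "added_date": "2025-09-01"},
    {"title": "B", "rank": 2, "price": 29.99, "added_date": "2025-09-01"},  # too expensive
    {"title": "C", "rank": 3, "price": 9.99, "added_date": "2025-08-01"},   # too old
])
result = filter_books(sample, now=datetime(2025, 9, 4))
print([b["title"] for b in result])  # → ['A']
```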
Before you begin
Sign in to your Google Cloud account. If you're new to
Google Cloud,
create an account to evaluate how our products perform in
real-world scenarios. New customers also get $300 in free credits to
run, test, and deploy workloads.
In the Google Cloud console, on the project selector page,
select or create a Google Cloud project.
Enable the Cloud Data Fusion API.
Create a Cloud Data Fusion instance
Click Create an instance.
Enter an Instance name.
Enter a Description for your instance.
Enter the Region in which to create the instance.
Choose the Cloud Data Fusion Version to use.
Choose the Cloud Data Fusion Edition.
For Cloud Data Fusion versions 6.2.3 and later, in the Authorization field, choose the Dataproc service account to use for running your Cloud Data Fusion pipeline in Dataproc. The default value, the Compute Engine account, is pre-selected.
Click Create.
It takes up to 30 minutes for the instance creation process to complete.
While Cloud Data Fusion creates your instance, a progress wheel displays next to the instance name on the Instances page. When the process completes, the wheel turns into a green check mark, indicating that you can start using the instance.
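If you prefer to script instance creation instead of clicking through the console, Cloud Data Fusion also exposes a REST API (and `gcloud beta data-fusion` commands). The sketch below only assembles the v1 create-instance URL and request body; actually sending the request requires an authenticated HTTP client (for example via the google-auth library) and is omitted here. The project, region, and instance names are placeholders.

```python
import json

API_ROOT = "https://datafusion.googleapis.com/v1"

def create_instance_request(project_id: str, region: str, instance_id: str) -> tuple[str, str]:
    """Build the URL and JSON body for a Cloud Data Fusion create-instance call.

    Sending the request is left out: in practice you would POST this with
    an OAuth 2.0 bearer token, then poll the returned long-running
    operation until the instance reaches the RUNNING state.
    """
    url = (
        f"{API_ROOT}/projects/{project_id}/locations/{region}"
        f"/instances?instanceId={instance_id}"
    )
    # "type" selects the edition; BASIC is used here as an example value.
    body = json.dumps({"type": "BASIC"})
    return url, body

url, body = create_instance_request("my-project", "us-central1", "quickstart-instance")
print(url)
```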
Navigate the Cloud Data Fusion web interface
When you use Cloud Data Fusion, you use both the Google Cloud console and the separate Cloud Data Fusion web interface.
In the Google Cloud console, you can do the following:
Create a Google Cloud console project
Create and delete Cloud Data Fusion instances
View the Cloud Data Fusion instance details
In the Cloud Data Fusion web interface, you can use various pages, such as Studio or Wrangler, to access Cloud Data Fusion functionality.
To navigate the Cloud Data Fusion interface, follow these steps:
In the Google Cloud console, open the Instances page.
In the instance's Actions column, click the View instance link.
In the Cloud Data Fusion web interface, use the left navigation panel to go to the page you need.
Deploy a sample pipeline
Sample pipelines are available through the Cloud Data Fusion Hub, which lets you share reusable Cloud Data Fusion pipelines, plugins, and solutions.
In the Cloud Data Fusion web interface, click Hub.
In the left panel, click Pipelines.
Click the Cloud Data Fusion Quickstart pipeline.
Click Create.
In the Cloud Data Fusion Quickstart configuration panel, click Finish.
Click Customize Pipeline.
A visual representation of your pipeline appears on the Studio page, which is a graphical interface for developing data integration pipelines.
Available pipeline plugins are listed on the left, and your pipeline is displayed on the main canvas area. To explore your pipeline, hold the pointer over each pipeline node and click Properties. The properties menu for each node lets you view the objects and operations associated with the node.
In the top-right menu, click Deploy. This step submits the pipeline to Cloud Data Fusion. You run the pipeline in the next section of this quickstart.
View your pipeline
The deployed pipeline appears in the pipeline details view, where you can do the following:
View the structure and configuration of the pipeline.
Run the pipeline manually or set up a schedule or a trigger.
View a summary of historical runs of the pipeline, including execution times, logs, and metrics.
Execute your pipeline
In the pipeline details view, click Run to execute your pipeline.
When executing a pipeline, Cloud Data Fusion does the following:
Provisions an ephemeral Dataproc cluster
Executes the pipeline on the cluster by using Apache Spark
Deletes the cluster
When the pipeline transitions to the Running state, you can monitor the Dataproc cluster's creation and deletion. The cluster exists only for the duration of the pipeline.
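Deployed pipelines can also be started and monitored programmatically, because each Cloud Data Fusion instance exposes the CDAP REST API at its API endpoint URL (shown on the instance details page). The sketch below only assembles the relevant URLs; real calls need that endpoint plus a bearer token, both omitted here. The endpoint value and pipeline name are placeholders, and the exact paths are assumptions to verify against the CDAP microservices documentation.

```python
def pipeline_urls(api_endpoint: str, pipeline_name: str) -> dict:
    """Build CDAP REST URLs for starting and monitoring a batch pipeline run.

    Batch pipelines run as the DataPipelineWorkflow program in the
    default namespace; api_endpoint and pipeline_name are placeholders.
    """
    base = (
        f"{api_endpoint}/v3/namespaces/default/apps/{pipeline_name}"
        "/workflows/DataPipelineWorkflow"
    )
    return {
        "start": f"{base}/start",  # POST with a Bearer token to trigger a run
        "runs": f"{base}/runs",    # GET to list runs and their status
    }

urls = pipeline_urls(
    "https://INSTANCE_API_ENDPOINT/api",  # placeholder for your instance's endpoint
    "DataFusion-Quickstart",              # placeholder pipeline name
)
print(urls["runs"])
```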
View the results
After a few minutes, the pipeline finishes. The pipeline status changes to Succeeded, and the number of records processed by each node is displayed.
Go to the BigQuery web interface.
To view a sample of the results, go to the DataFusionQuickstart dataset in your project, click the top_rated_inexpensive table, and then run a simple query. For example:
SELECT * FROM `PROJECT_ID.GCPQuickStart.top_rated_inexpensive` LIMIT 10
Replace PROJECT_ID with your project ID.
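The same check can be scripted with the google-cloud-bigquery client library (`pip install google-cloud-bigquery`). In this sketch, only the query-building part runs anywhere; executing the query needs credentials for your project, so that call sits in a separate function you invoke yourself. The project ID is a placeholder.

```python
def results_query(project_id: str, limit: int = 10) -> str:
    """Build the sample query against the quickstart's output table."""
    return (
        f"SELECT * FROM `{project_id}.GCPQuickStart.top_rated_inexpensive` "
        f"LIMIT {limit}"
    )

def run_query(project_id: str):
    """Execute the query; requires google-cloud-bigquery and valid credentials."""
    from google.cloud import bigquery
    client = bigquery.Client(project=project_id)
    return list(client.query(results_query(project_id)).result())

sql = results_query("my-project")  # placeholder project ID
print(sql)
```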
Clean up
To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.
Delete the BigQuery dataset that your pipeline wrote to in this quickstart.
Delete the Cloud Data Fusion instance. Deleting your instance does not delete any of your data in the project.
Optional: Delete the project. Caution: deleting a project deletes everything in it, and custom project IDs are lost.
What's next
Work through a Cloud Data Fusion tutorial.
Learn about Cloud Data Fusion concepts.
[[["Fácil de comprender","easyToUnderstand","thumb-up"],["Resolvió mi problema","solvedMyProblem","thumb-up"],["Otro","otherUp","thumb-up"]],[["Difícil de entender","hardToUnderstand","thumb-down"],["Información o código de muestra incorrectos","incorrectInformationOrSampleCode","thumb-down"],["Faltan la información o los ejemplos que necesito","missingTheInformationSamplesINeed","thumb-down"],["Problema de traducción","translationIssue","thumb-down"],["Otro","otherDown","thumb-down"]],["Última actualización: 2025-09-04 (UTC)"],[[["\u003cp\u003eThis guide demonstrates creating a Cloud Data Fusion instance, which can take up to 30 minutes to provision, and is accessible through both the Google Cloud console and a separate web interface.\u003c/p\u003e\n"],["\u003cp\u003eA sample pipeline is deployed from the Cloud Data Fusion Hub, which reads and transforms JSON data from Cloud Storage, then loads filtered data into BigQuery.\u003c/p\u003e\n"],["\u003cp\u003eThe deployed pipeline is managed in the pipeline details view, allowing users to view its configuration, run it manually, schedule runs, and check its execution history.\u003c/p\u003e\n"],["\u003cp\u003eExecuting the pipeline provisions a temporary Dataproc cluster to process the data using Apache Spark, which is then deleted after completion.\u003c/p\u003e\n"],["\u003cp\u003eAfter the pipeline runs successfully, the processed data can be reviewed by querying the designated BigQuery table, and users can clean up resources, including deleting the BigQuery dataset and the Cloud Data Fusion instance.\u003c/p\u003e\n"]]],[],null,["# Create a data pipeline by using Cloud Data Fusion\n\nCreate a data pipeline\n======================\n\nThis quickstart shows you how to do the following:\n\n1. Create a Cloud Data Fusion instance.\n2. Deploy a sample pipeline that's provided with your Cloud Data Fusion instance. The pipeline does the following:\n 1. Reads a JSON file containing NYT bestseller data from Cloud Storage.\n 2. 
Runs transformations on the file to parse and clean the data.\n 3. Loads the top-rated books added in the last week that cost less than $25 into BigQuery.\n\nBefore you begin\n----------------\n\n- Sign in to your Google Cloud account. If you're new to Google Cloud, [create an account](https://console.cloud.google.com/freetrial) to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n\n\n Enable the Cloud Data Fusion API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=datafusion.googleapis.com)\n\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n\n\n Enable the Cloud Data Fusion API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=datafusion.googleapis.com)\n\nCreate a Cloud Data Fusion instance\n-----------------------------------\n\n1. Click **Create an instance** .\n\n [Go to Instances](https://console.cloud.google.com/data-fusion/instance-create)\n2. Enter an **Instance name**.\n3. Enter a **Description** for your instance.\n4. 
Enter the **Region** in which to create the instance.\n5. Choose the Cloud Data Fusion **Version** to use.\n6. Choose the Cloud Data Fusion [**Edition**](/data-fusion/pricing).\n7. For Cloud Data Fusion versions 6.2.3 and later, in the **Authorization** field, choose the [**Dataproc service account**](/dataproc/docs/concepts/configuring-clusters/service-accounts) to use for running your Cloud Data Fusion pipeline in Dataproc. The default value, Compute Engine account, is pre-selected.\n8. Click **Create** . It takes up to 30 minutes for the instance creation process to complete. While Cloud Data Fusion creates your instance, a progress wheel displays next to the instance name on the **Instances** page. After completion, it turns into a green check mark and indicates that you can start using the instance.\n\nNavigate the Cloud Data Fusion web interface\n--------------------------------------------\n\nWhen using Cloud Data Fusion, you use both the Google Cloud console\nand the separate Cloud Data Fusion web interface.\n\n- In the Google Cloud console, you can do the following:\n\n - Create a Google Cloud console project\n - Create and delete Cloud Data Fusion instances\n - View the Cloud Data Fusion instance details\n- In the Cloud Data Fusion web interface, you can use various pages, such\n as **Studio** or **Wrangler**, to use Cloud Data Fusion functionality.\n\nTo navigate the Cloud Data Fusion interface, follow these steps:\n\n1. In the Google Cloud console, open the **Instances** page.\n\n [Go to Instances](https://console.cloud.google.com/data-fusion/locations/-/instances)\n2. In the instance **Actions** column, click the **View Instance** link.\n3. 
In the Cloud Data Fusion web interface, use the left navigation panel to navigate to the page you need.\n\nDeploy a sample pipeline\n------------------------\n\nSample pipelines are available through the Cloud Data Fusion **Hub**,\nwhich lets you share reusable Cloud Data Fusion pipelines, plugins,\nand solutions.\n\n1. In the Cloud Data Fusion web interface, click **Hub**.\n2. In the left panel, click **Pipelines**.\n3. Click the **Cloud Data Fusion Quickstart** pipeline.\n4. Click **Create**.\n5. In the Cloud Data Fusion Quickstart configuration panel, click **Finish**.\n6. Click **Customize Pipeline**.\n\n A visual representation of your pipeline appears on the **Studio** page,\n which is a graphical interface for developing data integration pipelines.\n Available pipeline plugins are listed on the left, and your pipeline is\n displayed on the main canvas area. You can explore your pipeline by holding\n the pointer over each pipeline *node* and clicking **Properties**. The\n properties menu for each node lets you view the objects and operations\n associated with the node.\n7. In the top-right menu, click **Deploy**. This step submits the pipeline to\n Cloud Data Fusion. You will execute the pipeline in the next section of\n this quickstart.\n\n### View your pipeline\n\nThe deployed pipeline appears in the pipeline details view, where you can do\nthe following:\n\n- View the structure and configuration of the pipeline.\n- Run the pipeline manually or set up a schedule or a trigger.\n- View a summary of historical runs of the pipeline, including execution times, logs, and metrics.\n\nExecute your pipeline\n---------------------\n\nIn the pipeline details view, click **Run** to execute your pipeline.\n\nWhen executing a pipeline, Cloud Data Fusion does the following:\n\n1. Provisions an ephemeral Dataproc cluster\n2. Executes the pipeline on the cluster using Apache Spark\n3. 
Deletes the cluster\n\n| **Note:** When the pipeline transitions to the *Running* state, you can [monitor the Dataproc cluster creation and deletion](https://console.cloud.google.com/dataproc/clusters). This cluster only exists for the duration of the pipeline.\n\nView the results\n----------------\n\nAfter a few minutes, the pipeline finishes. The pipeline status changes to\n**Succeeded** and the number of records processed by each node is displayed.\n\n1. Go to the [BigQuery web interface](https://console.cloud.google.com/bigquery).\n2. To view a sample of the results, go to the `DataFusionQuickstart` dataset\n in your project, click the\n `top_rated_inexpensive` table, then run a simple query. For example:\n\n SELECT * FROM \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e.GCPQuickStart.top_rated_inexpensive LIMIT 10\n\n Replace \u003cvar translate=\"no\"\u003ePROJECT_ID\u003c/var\u003e with your project ID.\n\nClean up\n--------\n\n\nTo avoid incurring charges to your Google Cloud account for\nthe resources used on this page, follow these steps.\n\n1. [Delete the BigQuery dataset](https://console.cloud.google.com/bigquery) that your pipeline wrote to in this quickstart.\n2. [Delete the Cloud Data Fusion instance](https://console.cloud.google.com/data-fusion/locations/-/instances).\n\n | **Note:** Deleting your instance does not delete any of your data in the project.\n3. Optional: Delete the project.\n\n\u003c!-- --\u003e\n\n| **Caution** : Deleting a project has the following effects:\n|\n| - **Everything in the project is deleted.** If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.\n| - **Custom project IDs are lost.** When you created this project, you might have created a custom project ID that you want to use in the future. 
To preserve the URLs that use the project ID, such as an `appspot.com` URL, delete selected resources inside the project instead of deleting the whole project.\n|\n|\n| If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects\n| can help you avoid exceeding project quota limits.\n1. In the Google Cloud console, go to the **Manage resources** page.\n\n [Go to Manage resources](https://console.cloud.google.com/iam-admin/projects)\n2. In the project list, select the project that you want to delete, and then click **Delete**.\n3. In the dialog, type the project ID, and then click **Shut down** to delete the project.\n\n\u003cbr /\u003e\n\nWhat's next\n-----------\n\n- Work through a Cloud Data Fusion [tutorial](/data-fusion/docs/tutorials/targeting-campaign-pipeline)\n- Learn about Cloud Data Fusion [concepts](/data-fusion/docs/concepts/overview)"]]