This page describes how to read multiple tables from a Microsoft SQL Server database by using the Multi Table source.
Use the Multi Table source when you want your pipeline to read from multiple tables. If you want your pipeline to read from a single table, see Reading from a SQL Server table.
The Multi Table source outputs data with multiple schemas and includes a table name field that indicates the table the data came from. When you use the Multi Table source, use one of the multi table sinks, BigQuery Multi Table or GCS Multi File.
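The role of that table name field can be sketched in a few lines; a minimal Python sketch, not part of Cloud Data Fusion, assuming each output record carries a tablename field (the field name here is an assumption) that a multi table sink uses to route records to their destination tables:

```python
from collections import defaultdict

def route_by_table(records, table_field="tablename"):
    """Group records from a multi-table source by originating table.

    Each record is a dict; `table_field` is the extra field the Multi
    Table source adds to say which table the record came from (the
    default field name used here is an assumption for illustration).
    """
    routed = defaultdict(list)
    for record in records:
        table = record[table_field]
        # Drop the routing field before writing to the destination table.
        payload = {k: v for k, v in record.items() if k != table_field}
        routed[table].append(payload)
    return dict(routed)

# Records with different schemas, tagged with their source table.
records = [
    {"tablename": "customers", "id": 1, "name": "Ada"},
    {"tablename": "orders", "id": 10, "total": 99.5},
    {"tablename": "customers", "id": 2, "name": "Grace"},
]
```

This is why a single-table sink is not enough here: each destination table must receive only its own records and schema.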
Before you begin
Sign in to your Google Cloud account. If you're new to
Google Cloud,
create an account to evaluate how our products perform in
real-world scenarios. New customers also get $300 in free credits to
run, test, and deploy workloads.
In the Google Cloud console, on the project selector page,
select or create a Google Cloud project.
Verify that billing is enabled for your Google Cloud project.
Enable the Cloud Data Fusion, Cloud Storage, BigQuery, and Dataproc APIs.
Create a Cloud Data Fusion instance.
Ensure that your SQL Server database can accept connections from Cloud Data Fusion. To do this securely, we recommend that you create a private Cloud Data Fusion instance.
View your Cloud Data Fusion instance
When you use Cloud Data Fusion, you use both the Google Cloud console and the separate Cloud Data Fusion UI. In the Google Cloud console, you can create a Google Cloud project, and create and delete Cloud Data Fusion instances. In the Cloud Data Fusion UI, you can use the various pages, such as Studio or Wrangler, to use Cloud Data Fusion features.
In the Google Cloud console, go to the Cloud Data Fusion page.
To open the instance in the Cloud Data Fusion Studio, click Instances, and then click View instance.
Store your SQL Server password as a secure key
Add your SQL Server password as a secure key to encrypt on your Cloud Data Fusion instance. Later in this guide, you ensure that your password is retrieved by using Cloud KMS.
In the top-right corner of any Cloud Data Fusion page, click System Admin.
Click the Configuration tab.
Click Make HTTP Calls.
In the dropdown menu, choose PUT.
In the Path field, enter namespaces/NAMESPACE_ID/securekeys/PASSWORD.
In the Body field, enter {"data":"SQL_SERVER_PASSWORD"}.
Click Send.
Ensure that the Response you get is status code 200.
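The same call can be prepared outside the UI; a minimal Python sketch that only builds the PUT request without sending it, assuming your instance exposes the CDAP REST API under a /v3 prefix (the endpoint below is a placeholder, not a real URL):

```python
import json
from urllib.request import Request

def build_securekey_request(endpoint, namespace_id, key_name, password):
    """Build the PUT request that stores a password as a secure key.

    `endpoint` is your instance's API endpoint (placeholder in the
    example call); the path and body mirror the Make HTTP Calls steps
    above. The /v3 prefix is an assumption about the REST API layout.
    """
    url = f"{endpoint}/v3/namespaces/{namespace_id}/securekeys/{key_name}"
    body = json.dumps({"data": password}).encode("utf-8")
    return Request(url, data=body, method="PUT",
                   headers={"Content-Type": "application/json"})

# Hypothetical endpoint and credentials, for illustration only.
req = build_securekey_request(
    "https://example-instance-endpoint", "default",
    "password", "my-sql-server-password")
```

Sending the request would also require an OAuth access token for your instance, which this sketch deliberately leaves out.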
Get the JDBC driver for SQL Server
Using the Hub
In the Cloud Data Fusion UI, click Hub.
In the search bar, enter Microsoft SQL Server JDBC Driver.
Click Microsoft SQL Server JDBC Driver.
Click Download. Follow the download steps shown.
Click Deploy. Upload the JAR file from the previous step.
Click Finish.
Using Studio
Visit Microsoft.com.
Choose your download and click Download.
In the Cloud Data Fusion UI, click Menu and navigate to the Studio page.
Click Add.
Under Driver, click Upload.
Upload the JAR file that you downloaded in step 2.
Click Next.
Configure the driver by entering a Name.
In the Class name field, enter com.microsoft.sqlserver.jdbc.SQLServerDriver.
Click Finish.
Deploy the Multiple Table Plugins
In the Cloud Data Fusion web UI, click Hub.
In the search bar, enter Multiple table plugins.
Click Multiple Table Plugins.
Click Deploy.
Click Finish.
Click Create a Pipeline.
Connect to SQL Server
In the Cloud Data Fusion UI, click Menu and navigate to the Studio page.
In Studio, expand the Source menu.
Click Multiple Database Tables.
Hold the pointer over the Multiple Database Tables node and click Properties.
In the Reference name field, specify a reference name that is used to identify your SQL Server source.
In the JDBC Connection String field, enter the JDBC connection string. For example, jdbc:sqlserver://mydbhost:1433. For more information, see Building the connection URL.
Enter the JDBC Plugin Name, Database User Name, and Database User Password.
Click Validate.
Click Close.
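The connection string entered above can also be assembled programmatically; a minimal Python sketch, assuming the standard jdbc:sqlserver://host:port layout with an optional databaseName property from Microsoft's connection URL format:

```python
def sqlserver_jdbc_url(host, port=1433, database=None):
    """Build a SQL Server JDBC connection string.

    Follows the jdbc:sqlserver://host:port[;databaseName=db] layout
    described in Microsoft's "Building the connection URL" docs.
    """
    url = f"jdbc:sqlserver://{host}:{port}"
    if database:
        url += f";databaseName={database}"
    return url
```

For example, sqlserver_jdbc_url("mydbhost") produces the jdbc:sqlserver://mydbhost:1433 string used in the step above; omitting databaseName connects to the login's default database.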
Connect to BigQuery or Cloud Storage
In the Cloud Data Fusion UI, click Menu and navigate to the Studio page.
Expand Sink.
Click BigQuery Multi Table or GCS Multi File.
Connect the Multiple Database Tables node with BigQuery Multi Table or GCS Multi File.
Hold the pointer over the BigQuery Multi Table or GCS Multi File node, click Properties, and configure the sink. For more information, see Google BigQuery Multi Table Sink and Google Cloud Storage Multi File Sink.
Click Validate.
Click Close.
Run a preview of the pipeline
In the Cloud Data Fusion UI, click Menu and navigate to the Studio page.
Click Preview.
Click Run. Wait for the preview to finish successfully.

Deploy the pipeline
In the Cloud Data Fusion UI, click Menu and navigate to the Studio page.
Click Deploy.

Run the pipeline
In the Cloud Data Fusion UI, click Menu.
Click List.
Click the pipeline.
On the pipeline details page, click Run.

What's next
Learn more about Cloud Data Fusion.
Follow one of the tutorials.

Last updated 2025-09-04 (UTC).