Using DataStream APIs

For businesses with many siloed data sources, getting access to enterprise data across the organization, especially in real time, is prohibitively difficult. The result is limited, slow data access that inhibits the organization's ability to introspect and innovate.

Datastream breaks down the barriers of siloed organizational data by providing near-real-time access to change data from a variety of on-premises and cloud-based data sources. Datastream provides a simple setup experience and a unified consumption API, which democratizes access to the freshest enterprise data across the organization and powers integrated near-real-time scenarios.

One such scenario is transferring data from a source database into a Cloud-based storage service and transforming this data into a language that's readable by other applications and services that communicate with this storage service.

In this tutorial, you learn how to use Datastream to transfer schemas, tables, and data from a source Oracle database into a folder in a Cloud Storage bucket. Cloud Storage is a web service for storing and accessing data on Google Cloud. The service combines the performance and scalability of Google's cloud with advanced security and sharing capabilities.

As part of transferring this information into a folder in the destination Cloud Storage bucket, Datastream translates this information into Avro. Avro is defined by a schema that's written in JavaScript Object Notation (JSON), and data serialized with this schema can be read in any language.
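
To make the Avro point concrete, here is a minimal, hypothetical Avro schema expressed as the JSON that defines it. The record and field names are illustrative only; Datastream generates its own schemas from the source database.

```python
import json

# A minimal, hypothetical Avro schema for a two-column table.
# Record and field names here are made up for illustration.
avro_schema = {
    "type": "record",
    "name": "EMPLOYEES",
    "namespace": "ROOT",
    "fields": [
        {"name": "EMPLOYEE_ID", "type": "long"},
        # A union with "null" marks the column as nullable.
        {"name": "FIRST_NAME", "type": ["null", "string"], "default": None},
    ],
}

# The schema is plain JSON, so any language with a JSON parser can read it.
schema_json = json.dumps(avro_schema, indent=2)
print(schema_json)
```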

Objectives

In this tutorial, you learn how to:

  • Set your variables. You'll use these variables when you make requests to Datastream to create and manage both connection profiles and a stream.
  • Create and manage connection profiles for a source database and a destination bucket in Cloud Storage. By creating these connection profiles, you're creating records that contain information about the source database and destination Cloud Storage bucket. The stream in Datastream uses the information in the connection profiles to transfer data from the source database into a folder in the destination bucket.
  • Create and manage a stream. Datastream uses this stream to transfer data, schemas, and tables from the source database into a folder in the destination bucket.
  • Verify that Datastream transfers the data and tables associated with a schema of the source Oracle database into a folder in the destination bucket, and translates this data into the Avro file format.
  • Clean up the resources that you created on Datastream so they won't take up quota and you won't be billed for them in the future.

Costs

This tutorial uses the following billable components of Google Cloud:

  • Datastream
  • Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Make sure that you have a source database that Datastream can access. For this tutorial, an Oracle database is used as the source.
  5. Make sure that you have configured a destination Cloud Storage bucket that Datastream can access by using either the IP allowlist or forward SSH tunnel network connectivity method.
  6. Make sure that you have data, tables, and schemas in the source database that Datastream can transfer into a folder in the destination Cloud Storage bucket.
  7. Download and install Google Cloud Shell. This client application provides you with command-line access to your cloud resources (including Datastream).
  8. Install and configure the jq utility. This utility is a lightweight and flexible command-line JSON processor. You'll use it to display the JSON responses of cURL commands in easy-to-read text.

Setting your variables

In this section, you'll set the following variables:

  • $PROJECT: This variable is associated with your Google Cloud project. Any Google Cloud resources that you allocate and use must belong to a project.
  • $TOKEN: This variable is associated with an access token. The access token provides a session that Google Cloud Shell uses to perform tasks in Datastream via REST APIs.
  • $ENV: This variable is associated with the environment that you're using to transfer data, schemas, and tables from a source database into a folder in a destination Cloud Storage bucket.
  1. Launch your Google Cloud Shell application.

  2. After authenticating into your application, enter gcloud auth login.

  3. At the Do you want to continue (Y/n)? prompt, enter Y.

  4. Open a web browser, and then copy the URL provided into the browser.

  5. Authenticate into Google Cloud SDK. A verification code appears on the Sign in page.

  6. Copy the verification code, paste it into the Enter verification code: prompt in your Google Cloud Shell application, and press Enter.

  7. At the prompt, enter PROJECT="[YOUR_PROJECT_NAME]" to set the $PROJECT environment variable to your Google Cloud project.

  8. At the prompt, enter gcloud config set project [YOUR_PROJECT_NAME] to set the project that you'd like to work on to your Google Cloud project.

    Your command prompt will be updated to reflect your currently active project and will use this format: [USERNAME]@cloudshell:~ ([YOUR_PROJECT_NAME])$

  9. At the prompt, enter TOKEN=$(gcloud auth print-access-token) to retrieve the access token and store it as a variable.

  10. At the prompt, enter ENV="[YOUR_ENVIRONMENT_NAME]" to set the environment variable to your environment.

  11. At the prompt, enter the following commands to ensure that your $PROJECT, $TOKEN, and $ENV variables are set correctly:

    • echo $PROJECT
    • echo $TOKEN
    • echo $ENV

Now that you've set your variables, you can make requests to Datastream to create and manage both connection profiles and a stream.
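
If you prefer scripting over hand-typed cURL, the same three variables can drive requests from any language. The following is a minimal Python sketch, with placeholder values, that builds the request headers and base URL used by every Datastream call in this tutorial:

```python
import os

# Placeholder values; in practice these come from `gcloud` as described above.
project = os.environ.get("PROJECT", "my-project")
token = os.environ.get("TOKEN", "ya29.example-token")
env = os.environ.get("ENV", "us-central1")

# Every Datastream REST call in this tutorial sends these two headers.
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}

# Base URL pattern used by the cURL commands in this tutorial; the
# Datastream version (for example, v1alpha1) is appended per request.
base_url = f"https://{env}-datastream.googleapis.com"

print(base_url)
```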

Creating and managing connection profiles

In this section, you create and manage connection profiles for a source Oracle database and a destination bucket in Cloud Storage.

By creating these connection profiles, you're creating records that contain information about the source database and destination Cloud Storage bucket. Datastream uses the information in the connection profiles to transfer data from the source database into a folder in the destination bucket.

Creating and managing connection profiles includes:

  • Creating connection profiles for a source Oracle database and a destination bucket in Cloud Storage
  • Retrieving high-level and detailed information about a connection profile
  • Modifying a connection profile
  • Performing a discover API call on the source Oracle connection profile. This call lets you look inside the database to see the objects associated with it, including the schemas and tables that contain the database's data. When you use Datastream to configure a stream, you might not want to pull all objects of the database, but rather a subset of the objects (for example, only certain tables and schemas of the database). Use the discover API to help you find (or discover) the subset of database objects that you want to pull.

Create connection profiles

In this section, you create two connection profiles to a source Oracle database and a destination bucket in Cloud Storage.

  1. Create a connection profile to a source Oracle database. At the prompt, enter the following command:

    ORACLE="{\"displayName\":\"[DISPLAY_NAME]\",\"oracle_profile\":{\"hostname\":\"[HOSTNAME]\",\"username\":\"[USERNAME]\",\"database_service\":\"[DATABASE_SERVICE]\",\"password\":\"[PASSWORD]\",\"port\":[PORT_NUMBER]},\"no_connectivity\":{}}"
    

    Use the following table to help you understand the parameter values for the source Oracle database:

    Parameter value         Replace with
    [DISPLAY_NAME]          The display name of the connection profile to the source database.
    [HOSTNAME]              The host name of the source database server.
    [USERNAME]              The username of the account for the source database (for example, ROOT).
    [DATABASE_SERVICE]      The service that ensures that the source database is protected and monitored. For Oracle databases, the database service is typically ORCL.
    [PASSWORD]              The password of the account for the source database.
    [PORT_NUMBER]           The port number reserved for the source database. For an Oracle database, the port number is typically 1521.
  2. At the prompt, enter the echo $ORACLE | jq command to see the source connection profile you created in easy-to-read text.

    {
      "displayName": "[DISPLAY_NAME]",
      "oracle_profile": {
        "hostname": "[HOSTNAME]",
        "username": "[USERNAME]",
        "database_service": "[DATABASE_SERVICE]",
        "password": "[PASSWORD]",
        "port": [PORT_NUMBER]
       },
      "no_connectivity": {}
    }
    
  3. Submit the Oracle connection profile so that it can be created. At the prompt, enter the following command:

    curl -X POST -d $ORACLE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles?connection_profile_id=[SOURCE_CONNECTION_PROFILE_ID]
    

    Use the following table to help you understand the parameter values for this command:

    Parameter value                     Replace with
    [DATASTREAM_VERSION]                The current version of Datastream (for example, v1alpha1).
    [PROJECT_PATH]                      The full path of your Google Cloud project (for example, projects/$PROJECT/locations/[YOUR_PROJECT_LOCATION]).
    [SOURCE_CONNECTION_PROFILE_ID]      The unique identifier reserved for this connection profile (for example, cp-1).
  4. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[SOURCE_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "verb": "create",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  5. Create a connection profile to a destination bucket in Cloud Storage. At the prompt, enter the following command:

    GOOGLECLOUDSTORAGE="{\"displayName\":\"[DISPLAY_NAME]\",\"gcs_profile\":{\"bucket_name\":\"[BUCKET_NAME]\",\"root_path\":\"/[FOLDER_PATH]\"},\"no_connectivity\":{}}"
    

    Use the following table to help you understand the parameter values for the destination bucket:

    Parameter value     Replace with
    [DISPLAY_NAME]      The display name of the connection profile to the destination bucket.
    [BUCKET_NAME]       The name of the destination bucket.
    [FOLDER_PATH]       The folder in the destination bucket into which Datastream will transfer data from the source database (for example, /root/path).
  6. At the prompt, enter the echo $GOOGLECLOUDSTORAGE | jq command to see the destination connection profile you created in easy-to-read text.

    {
      "displayName": "[DISPLAY_NAME]",
      "gcs_profile": {
        "bucket_name": "[BUCKET_NAME]",
        "root_path": "/[FOLDER_PATH]"
      },
      "no_connectivity": {}
    }
    
  7. Submit the Cloud Storage connection profile so that it can be created. At the prompt, enter the following command:

    curl -X POST -d $GOOGLECLOUDSTORAGE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles?connection_profile_id=[DESTINATION_CONNECTION_PROFILE_ID]
    
  8. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "verb": "create",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  9. Confirm that both connection profiles are created. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles
    
  10. Verify that you see returned results for both the source and destination connection profiles.

    {
      "connectionProfiles": [
        {
          "name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
          "createTime": "[DATE_AND_TIME_STAMP]",
          "updateTime": "[DATE_AND_TIME_STAMP]",
          "displayName": "[DISPLAY_NAME]",
          "gcsProfile": {
            "bucketName": "[BUCKET_NAME]",
            "rootPath": "[FOLDER_PATH]"
          },
          "noConnectivity": {}
        },
        {
          "name": "[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
          "createTime": "[DATE_AND_TIME_STAMP]",
          "updateTime": "[DATE_AND_TIME_STAMP]",
          "displayName": "[DISPLAY_NAME]",
          "oracleProfile": {
            "hostname": "[HOSTNAME]",
            "port": [PORT_NUMBER],
            "username": "[USERNAME]",
            "databaseService": "[DATABASE_SERVICE]"
          },
          "noConnectivity": {}
        }
      ]
    }
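
Hand-escaping quotes inside shell variables like $ORACLE and $GOOGLECLOUDSTORAGE is error-prone. As a sketch of an alternative, the same request bodies can be built as Python dictionaries and serialized with json.dumps; the field names mirror the bodies above, and the values are placeholders.

```python
import json

# Placeholder values; replace with your own connection details.
oracle_profile = {
    "displayName": "[DISPLAY_NAME]",
    "oracle_profile": {
        "hostname": "[HOSTNAME]",
        "username": "[USERNAME]",
        "database_service": "[DATABASE_SERVICE]",
        "password": "[PASSWORD]",
        "port": 1521,
    },
    "no_connectivity": {},
}

gcs_profile = {
    "displayName": "[DISPLAY_NAME]",
    "gcs_profile": {
        "bucket_name": "[BUCKET_NAME]",
        "root_path": "/[FOLDER_PATH]",
    },
    "no_connectivity": {},
}

# json.dumps handles all quoting, so no backslash escaping is needed.
oracle_body = json.dumps(oracle_profile)
gcs_body = json.dumps(gcs_profile)
print(oracle_body)
print(gcs_body)
```

The resulting strings can be passed to cURL with -d, or sent directly from an HTTP client of your choice.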
    

Manage connection profiles

In this section, you manage the connection profiles that you created for a source Oracle database and a destination bucket in Cloud Storage. This includes:

  • Retrieving high-level and detailed information about the destination Cloud Storage connection profile
  • Modifying this connection profile. For this tutorial, you'll change the folder of the destination Cloud Storage bucket to /root/tutorial. Datastream will transfer data from the source database into this folder.
  • Performing a discover API call on the source Oracle connection profile
  1. Retrieve high-level information about the destination Cloud Storage connection profile. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]
    
  2. Verify that you see high-level information about this connection profile.

    {
      "name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
      "createTime": "[DATE_AND_TIME_STAMP]",
      "updateTime": "[DATE_AND_TIME_STAMP]",
      "displayName": "[DISPLAY_NAME]",
      "gcsProfile": {
        "bucketName": "[BUCKET_NAME]",
        "rootPath": "[FOLDER_PATH]"
      },
      "noConnectivity": {}
    }
    
  3. Retrieve detailed information about this connection profile. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/operations/[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]
    
  4. Verify that you see detailed information about the connection profile.

    {
      "name": "[PROJECT_PATH]/operations/operation-[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "endTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "verb": "create",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": true,
      "response": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].ConnectionProfile",
        "name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "updateTime": "[DATE_AND_TIME_STAMP]",
        "displayName": "[DISPLAY_NAME]",
        "gcsProfile": {
          "bucketName": "[BUCKET_NAME]",
          "rootPath": "[FOLDER_PATH]"
        },
        "noConnectivity": {}
      }
    }
    
  5. Modify this connection profile. To do this, first set an UPDATE variable. This variable contains the values of the connection profile that you want to change. For this tutorial, you'll change the folder of the destination bucket to /root/tutorial.

    To set the variable, at the prompt, enter the following command:

    UPDATE="{\"gcsProfile\":{\"rootPath\":\"/root/tutorial\"}}"
  6. At the prompt, enter the following command:

    curl -X PATCH -d $UPDATE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]?update_mask=gcsProfile.rootPath
    
  7. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "verb": "update",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  8. Confirm that the connection profile is modified. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/operations/[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]
    
  9. Verify that the folder of the destination bucket of the Cloud Storage connection profile is now /root/tutorial.

    {
      "name": "[PROJECT_PATH]/operations/operation-[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "endTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "verb": "update",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": true,
      "response": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].ConnectionProfile",
        "name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "updateTime": "[DATE_AND_TIME_STAMP]",
        "displayName": "[DISPLAY_NAME]",
        "gcsProfile": {
          "bucketName": "[BUCKET_NAME]",
          "rootPath": "/root/tutorial"
        },
        "noConnectivity": {}
      }
    }
    
  10. Use the Datastream discover API to discover the schemas and tables of the source Oracle database. Datastream will access this database through the source connection profile.

    1. First, discover the schemas of the Oracle database. At the prompt, enter the following command:

      curl -X POST -d "{\"connection_profile_name\":\"projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]\"}" -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles:discover
        

    2. Verify that Datastream retrieves all schemas of your database.

    3. Next, retrieve the tables of a schema in your database. For this tutorial, you'll use the discover API to retrieve the tables of the ROOT schema. However, you can discover the tables of any schema in your database.

      At the prompt, enter the following command:

       
      curl -X POST -d "{\"connection_profile_name\":\"projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]\", \"oracle_rdbms\":{\"oracleSchemas\":[{\"schemaName\":\"ROOT\"}]}}" -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles:discover
      
    4. Verify that Datastream retrieves all tables of the schema that you specified (for this tutorial, the ROOT schema).
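
The two discover calls above differ only in their request body: the first sends just the connection profile name, while the second adds an oracle_rdbms filter for a single schema. As a sketch (the resource paths are placeholders), the bodies look like this when built programmatically:

```python
import json

profile_name = (
    "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]"
    "/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]"
)

# Body for discovering all schemas of the source database.
discover_schemas = {"connection_profile_name": profile_name}

# Body for discovering the tables of one schema (ROOT in this tutorial).
discover_tables = {
    "connection_profile_name": profile_name,
    "oracle_rdbms": {"oracleSchemas": [{"schemaName": "ROOT"}]},
}

print(json.dumps(discover_schemas))
print(json.dumps(discover_tables))
```

Both bodies are POSTed to the same connectionProfiles:discover endpoint shown in the cURL commands above.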

Now that you've created and managed connection profiles for a source Oracle database and a destination bucket in Cloud Storage, you're ready to create and manage a stream in Datastream.

Creating and managing a stream

In this section, you create and manage a stream. Datastream uses this stream to transfer data, schemas, and tables from the source database into a folder in the destination Cloud Storage bucket.

Creating and managing a stream includes:

  • Creating a stream with the following lists:
    • An allow list. This list specifies the tables and schemas in the source database that Datastream can transfer into a folder in the destination bucket in Cloud Storage. For this tutorial, this is the /root/tutorial folder.
    • A reject list. This list specifies the tables and schemas in the source database that Datastream is restricted from transferring into the folder in the Cloud Storage destination bucket.
  • Retrieving high-level and detailed information about the stream
  • Modifying the stream
  • Using the Fetch Errors API to detect any errors associated with the stream
  • Starting the stream so that Datastream can transfer data, schemas, and tables from the source database into a folder in the destination Cloud Storage bucket.
  • Pausing the stream. When a stream is paused, Datastream won't pull any new data from the source database into the destination bucket.
  • Resuming the paused stream so that Datastream can continue to transfer data into the destination bucket.

Create a stream

In this section, you create a stream from the source Oracle database into a folder in the destination Cloud Storage bucket. The stream you create will include both an allow list and a reject list.

  1. Set a SCHEMAS variable. This variable defines the schemas that contain the data and tables that you want Datastream to retrieve from the source database and transfer into the /root/tutorial folder of the Cloud Storage destination bucket. For this tutorial, you'll set the SCHEMAS variable to be associated with the ROOT schema.

    At the prompt, enter the following command:

    SCHEMAS="{\"oracleSchemas\":[{\"schemaName\":\"ROOT\"}]}"
    
  2. At the prompt, enter the echo $SCHEMAS | jq command to see the ROOT schema that you defined for this variable in easy-to-read text.

  3. Create a stream. At the prompt, enter the following command:

    STREAM="{\"display_name\":\"[DISPLAY_NAME]\",\"source_config\":{\"source_connection_profile_name\":\"[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]\",\"oracle_source_config\":{\"allowlist\":$SCHEMAS,\"rejectlist\":{}}},\"destination_config\":{\"destination_connection_profile_name\":\"[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]\",\"gcs_destination_config\":{\"file_rotation_mb\":5,\"file_rotation_interval\":{\"seconds\":15},\"gcs_file_format\":\"AVRO\"}}}"
    
  4. At the prompt, enter the echo $STREAM | jq command to see the stream you created in easy-to-read text.

    {
      "display_name": "[DISPLAY_NAME]",
      "source_config": {
        "source_connection_profile_name": "[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracle_source_config": {
          "allowlist": {
            "oracleSchemas": [
              {
                "schemaName": "ROOT"
              }
            ]
          },
          "rejectlist": {}
        }
      },
      "destination_config": {
        "destination_connection_profile_name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcs_destination_config": {
          "file_rotation_mb": 5,
          "file_rotation_interval": {
            "seconds": 15
          },
          "gcs_file_format": "AVRO"
        }
      }
    }
    

    Use this table to help you understand the following parameters of the stream:

    Parameter                   Description
    allowlist                   The schemas, containing tables and data, that will be transferred from the source database into a folder of the Cloud Storage destination bucket. For this tutorial, all tables and data from the ROOT schema (and only this schema) will be transferred into the /root/tutorial folder of the destination bucket.
    rejectlist                  Any schemas, containing tables and data, that won't be transferred into a folder of the Cloud Storage destination bucket. For this tutorial, the {} value signifies that no tables and data from the source database will be prevented from being transferred into the destination bucket.
    file_rotation_mb            The maximum size (in MB) of files containing data, tables, and schemas that can be transferred from the source database into a folder in the Cloud Storage destination bucket. For this tutorial, the maximum file size is 5 MB. If a file exceeds this size, then it's segmented into multiple 5-MB files.
    file_rotation_interval      The number of seconds that elapse before Datastream replaces an existing file in a folder of the Cloud Storage destination bucket with a newer file transferred from the source database. For this tutorial, the file rotation interval is set to 15 seconds.
    gcs_file_format             The format of the files that Datastream transfers from the source database into a folder of the Cloud Storage destination bucket. For this tutorial, Avro (AVRO) is the file format.
  5. Submit the stream so that it can be created. At the prompt, enter the following command:

    curl -X POST -d $STREAM -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams?stream_id=[STREAM_ID]
    
  6. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "create",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  7. Confirm that the stream is created. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams
    
  8. Verify that you receive a returned result for the stream that you created.

    {
      "streams": [
        {
          "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
          "createTime": "[DATE_AND_TIME_STAMP]",
          "updateTime": "[DATE_AND_TIME_STAMP]",
          "displayName": "[DISPLAY_NAME]",
          "sourceConfig": {
            "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
            "oracleSourceConfig": {
              "allowlist": {
                "oracleSchemas": [
                  {
                    "schemaName": "ROOT"
                  }
                ]
              },
              "rejectlist": {}
            }
          },
          "destinationConfig": {
            "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
            "gcsDestinationConfig": {
              "gcsFileFormat": "AVRO",
              "fileRotationMb": 5,
              "fileRotationInterval": "15s"
            }
          },
          "state": "CREATED"
        }
      ]
    }
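
The $STREAM shell variable above encodes a nested JSON document, which is easier to read and modify as a data structure. Here is a sketch of the same request body in Python, with the placeholders kept as in the cURL version:

```python
import json

# Allow list: only the ROOT schema is pulled from the source database.
schemas = {"oracleSchemas": [{"schemaName": "ROOT"}]}

stream = {
    "display_name": "[DISPLAY_NAME]",
    "source_config": {
        "source_connection_profile_name": "[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracle_source_config": {
            "allowlist": schemas,
            # An empty reject list: nothing is explicitly excluded.
            "rejectlist": {},
        },
    },
    "destination_config": {
        "destination_connection_profile_name": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcs_destination_config": {
            "file_rotation_mb": 5,                      # rotate files at 5 MB
            "file_rotation_interval": {"seconds": 15},  # or every 15 seconds
            "gcs_file_format": "AVRO",
        },
    },
}

print(json.dumps(stream, indent=2))
```

Serializing this dictionary with json.dumps produces the same body that the $STREAM variable builds with escaped quotes.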
    

Manage the stream

In this section, you manage the stream that you created to transfer data from a source Oracle database into a folder in a Cloud Storage destination bucket. This includes:

  • Retrieving high-level and detailed information about the stream
  • Modifying the stream
  • Using the Fetch Errors API to detect any errors associated with the stream
  • Starting, pausing, and resuming the stream
  1. Retrieve high-level information about the stream. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]
    
  2. Verify that you see high-level information about this stream.

    {
      "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
      "createTime": "[DATE_AND_TIME_STAMP]",
      "updateTime": "[DATE_AND_TIME_STAMP]",
      "displayName": "[DISPLAY_NAME]",
      "sourceConfig": {
        "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracleSourceConfig": {
          "allowlist": {
            "oracleSchemas": [
              {
                "schemaName": "ROOT"
              }
            ]
          },
          "rejectlist": {}
        }
      },
      "destinationConfig": {
        "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcsDestinationConfig": {
          "gcsFileFormat": "AVRO",
          "fileRotationMb": 5,
          "fileRotationInterval": "15s"
        }
      },
      "state": "CREATED"
    }
    
  3. Retrieve detailed information about this stream. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/operations/[STREAM_OPERATION_ID]
    
  4. Verify that you see detailed information about the stream.

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "endTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "create",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": true,
      "response": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].Stream",
        "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "updateTime": "[DATE_AND_TIME_STAMP]",
        "displayName": "[DISPLAY_NAME]",
        "sourceConfig": {
          "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
          "oracleSourceConfig": {
            "allowlist": {
              "oracleSchemas": [
                {
                  "schemaName": "ROOT"
                }
              ]
            },
            "rejectlist": {}
          }
        },
        "destinationConfig": {
          "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
          "gcsDestinationConfig": {
            "gcsFileFormat": "AVRO",
            "fileRotationMb": 5,
            "fileRotationInterval": "15s"
          }
        },
        "state": "CREATED"
      }
    }
    
  5. Modify this stream. To do so, first set an UPDATE variable. This variable contains the values of the stream that you want to change. For this tutorial, you change the maximum size of the files that can be transferred from the source database into a folder in the Cloud Storage destination bucket from 5 MB to 100 MB.

    To set the variable, at the prompt, enter the following command:

    UPDATE="{\"destination_config\":{\"gcs_destination_config\":{\"file_rotation_mb\":100}}}"
    
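    Because the UPDATE string is heavily escaped, it's easy to introduce a quoting mistake. As a quick local sanity check (a sketch, assuming python3 is available, as it is in Cloud Shell), you can pretty-print the payload before sending the PATCH request:

```shell
# Build the same escaped payload as in the step above.
UPDATE="{\"destination_config\":{\"gcs_destination_config\":{\"file_rotation_mb\":100}}}"

# Pretty-print it; python3 -m json.tool exits non-zero if the JSON is malformed.
echo "$UPDATE" | python3 -m json.tool
```

    If the command prints the nested JSON, the payload is valid; an error message means there's a quoting mistake to fix before sending the request.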
  6. At the prompt, enter the following command:

    curl -X PATCH -d "$UPDATE" -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]?update_mask=destination_config.gcs_destination_config.file_rotation_mb
    
  7. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "update",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
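    The "done": false in this response means that the update is a long-running operation that hasn't finished yet. You can wait for it by polling the operation until "done" becomes true. The sketch below simulates the operation response with a local variable, which stands in for the curl call in the next step:

```shell
# Simulated operation response; in practice this would come from the curl call
# against .../operations/[STREAM_OPERATION_ID].
RESPONSE='{"name":"operations/operation-123","done":false}'

# Check the "done" flag before moving on to the next step.
if echo "$RESPONSE" | grep -q '"done":true'; then
  echo "operation finished"
else
  echo "operation still running; poll again in a few seconds"
fi
```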
  8. Confirm that the stream is modified. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/operations/[STREAM_OPERATION_ID]
    
  9. Verify that the value of the fileRotationMb parameter for the Cloud Storage connection profile is now 100.

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "endTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "update",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": true,
      "response": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].Stream",
        "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "updateTime": "[DATE_AND_TIME_STAMP]",
        "displayName": "[DISPLAY_NAME]",
        "sourceConfig": {
          "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
          "oracleSourceConfig": {
            "allowlist": {
              "oracleSchemas": [
                {
                  "schemaName": "ROOT"
                }
              ]
            },
            "rejectlist": {}
          }
        },
        "destinationConfig": {
          "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
          "gcsDestinationConfig": {
            "gcsFileFormat": "AVRO",
            "fileRotationMb": 100,
            "fileRotationInterval": "15s"
          }
        },
        "state": "CREATED"
      }
    }
    
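    Rather than scanning the whole response by eye, you can pull out just the field that you changed. This sketch runs against a simulated fragment of the response (grep -o prints only the matching part of the line):

```shell
# Simulated fragment of the gcsDestinationConfig section of the response.
RESPONSE='{"gcsFileFormat":"AVRO","fileRotationMb":100,"fileRotationInterval":"15s"}'

# Print just the rotation size to confirm that the update took effect.
echo "$RESPONSE" | grep -o '"fileRotationMb":[0-9]*'   # prints "fileRotationMb":100
```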
  10. Use the Fetch Errors API to retrieve any errors associated with the stream.

    1. At the prompt, enter the following command:

      curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]:fetchErrors
        

    2. Verify that you see the following lines of code:

        {
          "name": "[PROJECT_PATH]/operations/operation-[FETCH_ERRORS_OPERATION_ID]",
          "metadata": {
            "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
            "createTime": "[DATE_AND_TIME_STAMP]",
            "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
            "verb": "fetchErrors",
            "requestedCancellation": false,
            "apiVersion": "[DATASTREAM_VERSION]"
          },
          "done": false
        }
        

    3. At the prompt, enter the following command:

      curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/operations/operation-[FETCH_ERRORS_OPERATION_ID]
        

    4. Verify that you see the following lines of code:

        {
          "name": "[PROJECT_PATH]/operations/operation-[FETCH_ERRORS_OPERATION_ID]",
          "metadata": {
            "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
            "createTime": "[DATE_AND_TIME_STAMP]",
            "endTime": "[DATE_AND_TIME_STAMP]",
            "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
            "verb": "fetchErrors",
            "requestedCancellation": false,
            "apiVersion": "[DATASTREAM_VERSION]"
          },
          "done": true,
          "response": {
            "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].FetchErrorsResponse"
          }
        }
        

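    A FetchErrorsResponse that contains only the @type field, like the one above, means that Datastream found no errors. As a sketch, you can turn that into an explicit check; the response string here is a simulated stand-in, and the API version in it is illustrative:

```shell
# Simulated FetchErrorsResponse with no errors reported.
RESPONSE='{"@type":"type.googleapis.com/google.cloud.datastream.v1alpha1.FetchErrorsResponse"}'

# If the response carried an "errors" field, flag it; otherwise the stream is clean.
if echo "$RESPONSE" | grep -q '"errors"'; then
  echo "stream has errors"
else
  echo "no errors found"
fi
```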
  11. Start the stream. At the prompt, enter the following command:

    curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]:start
    
  12. Verify that you see the following lines of code.

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "start",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  13. After a few seconds, retrieve information about the stream to confirm that it started.

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]
    
  14. Verify that the state of the stream has changed from CREATED to RUNNING.

    {
      "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
      "createTime": "[DATE_AND_TIME_STAMP]",
      "updateTime": "[DATE_AND_TIME_STAMP]",
      "displayName": "[DISPLAY_NAME]",
      "sourceConfig": {
        "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracleSourceConfig": {
          "allowlist": {
            "oracleSchemas": [
              {
                "schemaName": "ROOT"
              }
            ]
          },
          "rejectlist": {}
        }
      },
      "destinationConfig": {
        "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcsDestinationConfig": {
          "gcsFileFormat": "AVRO",
          "fileRotationMb": 100,
          "fileRotationInterval": "15s"
        }
      },
      "state": "RUNNING"
    }
    
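    You can also script the state check instead of reading the JSON by hand. In this sketch, get_stream is a hypothetical stand-in for the curl command above, returning a trimmed-down response:

```shell
# Hypothetical stand-in for the real curl call to .../streams/[STREAM_ID].
get_stream() {
  echo '{"name":"streams/my-stream","state":"RUNNING"}'
}

# Extract the value of the "state" field with sed.
STATE=$(get_stream | sed -n 's/.*"state":"\([A-Z]*\)".*/\1/p')
echo "stream state: $STATE"   # prints: stream state: RUNNING
```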
  15. Pause the stream. At the prompt, enter the following command:

    curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]:pause
    
  16. Verify that you see the following lines of code.

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "pause",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  17. Retrieve information about the stream to confirm that it is paused.

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]
    
  18. Verify that the state of the stream has changed from RUNNING to PAUSED.

    {
      "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
      "createTime": "[DATE_AND_TIME_STAMP]",
      "updateTime": "[DATE_AND_TIME_STAMP]",
      "displayName": "[DISPLAY_NAME]",
      "sourceConfig": {
        "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracleSourceConfig": {
          "allowlist": {
            "oracleSchemas": [
              {
                "schemaName": "ROOT"
              }
            ]
          },
          "rejectlist": {}
        }
      },
      "destinationConfig": {
        "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcsDestinationConfig": {
          "gcsFileFormat": "AVRO",
          "fileRotationMb": 100,
          "fileRotationInterval": "15s"
        }
      },
      "state": "PAUSED"
    }
    
  19. Resume the paused stream. At the prompt, enter the following command:

    curl -X POST -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]:resume
    
  20. Verify that you see the following lines of code.

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "resume",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  21. Retrieve information about the stream to confirm that it's running again.

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]
    
  22. Verify that the state of the stream has changed from PAUSED back to RUNNING.

    {
      "name": "[PROJECT_PATH]/streams/[STREAM_ID]",
      "createTime": "[DATE_AND_TIME_STAMP]",
      "updateTime": "[DATE_AND_TIME_STAMP]",
      "displayName": "[DISPLAY_NAME]",
      "sourceConfig": {
        "sourceConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "oracleSourceConfig": {
          "allowlist": {
            "oracleSchemas": [
              {
                "schemaName": "ROOT"
              }
            ]
          },
          "rejectlist": {}
        }
      },
      "destinationConfig": {
        "destinationConnectionProfileName": "projects/[YOUR_PROJECT_NUMBER]/locations/[YOUR_PROJECT_LOCATION]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "gcsDestinationConfig": {
          "gcsFileFormat": "AVRO",
          "fileRotationMb": 100,
          "fileRotationInterval": "15s"
        }
      },
      "state": "RUNNING"
    }
    

Now that you've created and managed a stream, confirmed that there are no errors associated with it, and verified that its state is RUNNING, you're ready to check that it can transfer data from the source database into a folder in the Cloud Storage destination bucket.

Verifying the stream

In this section, you confirm that Datastream:

  • Transfers the data and tables associated with the ROOT schema of your source Oracle database into the /root/tutorial folder in the Cloud Storage destination bucket.
  • Translates the data into the Avro file format.

  1. Go to the Storage browser page in Cloud Storage.

    Go to the Storage browser page

  2. Click the link that contains your bucket.

  3. If the OBJECTS tab isn't active, then click it.

  4. Click the root folder, and then click the tutorial folder.

  5. Verify that you see folders that represent tables of the ROOT schema of your source Oracle database.

  6. Click one of the table folders, and then drill down until you see data that's associated with the table.

  7. Click a file that represents the data, and then click DOWNLOAD.

  8. Open this file in an Avro tool (for example, Avro Viewer) to ensure that the content is readable. This confirms that Datastream also translated the data into the Avro file format.

Clean up

After you've finished this tutorial, you can clean up the resources that you created in Datastream so that they don't take up quota and you aren't billed for them in the future. The following sections describe how to delete or turn off these resources.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Cloud Console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete the Cloud Storage destination bucket

  1. In the left-hand Navigation Drawer of Cloud Storage, click the Browser item.

  2. Select the check box to the left of your bucket, and then click DELETE.

  3. In the Delete bucket? window, enter the name of your bucket in the text field, and then click CONFIRM.

Delete the stream

  1. Make sure that your Google Cloud Shell application is active.

  2. At the prompt, enter the following command:

    curl -X DELETE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams/[STREAM_ID]
    
  3. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[STREAM_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/streams/[STREAM_ID]",
        "verb": "delete",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  4. Confirm that the stream is deleted. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/streams
    
  5. Verify that an empty response ({}) is returned. This signifies that there aren't any streams in Datastream anymore, and that the stream you created is deleted.
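    The deletion check can be made explicit, too. This sketch compares a simulated response against the empty object that the streams list call returns once nothing is left:

```shell
# Simulated response from the streams list call after deletion.
RESPONSE='{}'

if [ "$RESPONSE" = "{}" ]; then
  echo "all streams deleted"
else
  echo "streams still exist"
fi
```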

Delete the connection profiles

  1. Delete the connection profile to the source Oracle database. At the prompt, enter the following command:

    curl -X DELETE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]
    
  2. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[SOURCE_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/connectionProfiles/[SOURCE_CONNECTION_PROFILE_ID]",
        "verb": "delete",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  3. Delete the connection profile to the destination bucket in Cloud Storage. At the prompt, enter the following command:

    curl -X DELETE -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]
    
  4. Verify that you see the following lines of code:

    {
      "name": "[PROJECT_PATH]/operations/operation-[DESTINATION_CONNECTION_PROFILE_OPERATION_ID]",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.datastream.[DATASTREAM_VERSION].OperationMetadata",
        "createTime": "[DATE_AND_TIME_STAMP]",
        "target": "[PROJECT_PATH]/connectionProfiles/[DESTINATION_CONNECTION_PROFILE_ID]",
        "verb": "delete",
        "requestedCancellation": false,
        "apiVersion": "[DATASTREAM_VERSION]"
      },
      "done": false
    }
    
  5. Confirm that both connection profiles are deleted. At the prompt, enter the following command:

    curl -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" https://$ENV-datastream.googleapis.com/[DATASTREAM_VERSION]/[PROJECT_PATH]/connectionProfiles
    
  6. Verify that an empty response ({}) is returned. This signifies that there aren't any connection profiles in Datastream anymore, and that the profiles you created are deleted.

What's next

  • Learn more about Datastream.
  • Try out other Google Cloud features for yourself. Have a look at our tutorials.