Quickstart using the gcloud command-line tool

This page explains how to use the Cloud Healthcare API with the gcloud command-line tool to complete the following tasks:

  1. Create a Cloud Healthcare API dataset.
  2. Create one of the following data stores inside the dataset:
    • Digital Imaging and Communications in Medicine (DICOM) store
    • Fast Healthcare Interoperability Resources (FHIR) store
    • Health Level Seven International Version 2 (HL7v2) store
  3. Store DICOM, FHIR, and HL7v2 data, and view DICOM metadata.

If you are only interested in working with one type of data store, you can skip directly to that section of the quickstart after completing the steps in Before you begin and Create a dataset.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the Cloud Healthcare API. You can enable it in the console by using the button below, or from the command line as shown in the sketch after this list.

    Enable the API

  5. Based on how you are using the gcloud command-line tool, complete one of the following steps:
    • If you are using Cloud Shell, go to the Google Cloud Console and then click the Activate Cloud Shell button at the top of the console window.

      Go to Google Cloud Console

      A Cloud Shell session opens inside a new frame at the bottom of the console and displays a command-line prompt. It can take a few seconds for the shell session to initialize.

    • If you are using a Compute Engine virtual machine, open the virtual machine's terminal window.
    • If you are using the gcloud tool on your machine, install and initialize the Cloud SDK.
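
If you prefer to use the command line for step 4, you can also enable the Cloud Healthcare API with the gcloud services enable command (a minimal sketch, assuming the gcloud tool is already authorized and configured to use your project):

gcloud services enable healthcare.googleapis.com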

Create a dataset

Datasets contain data stores, and data stores contain healthcare data. To use the Cloud Healthcare API, you must create at least one dataset.

Create a dataset using the gcloud healthcare datasets create command:

gcloud healthcare datasets create DATASET_ID \
    --location=LOCATION

Replace the following:

  • DATASET_ID: an identifier for the dataset. The dataset ID must have the following:
    • A unique ID in its location
    • A Unicode string of 1-256 characters consisting of the following:
      • Numbers
      • Letters
      • Underscores
      • Dashes
      • Periods
  • LOCATION: the location of the dataset. Use us-central1, us-west2, us-east4, europe-west2, europe-west3, europe-west4, europe-west6, northamerica-northeast1, southamerica-east1, asia-east2, asia-northeast1, asia-northeast3, asia-south1, asia-southeast1, australia-southeast1, or us.

The output is the following:

Created dataset [DATASET_ID].
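
To confirm that the dataset was created, you can list the datasets in the location with the gcloud healthcare datasets list command:

gcloud healthcare datasets list --location=LOCATION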

To complete this quickstart, choose one of the following sections:

Store and view a DICOM instance

This section shows how to complete the following tasks:

  1. Create a DICOM store.
  2. Import a DICOM instance from Cloud Storage into the DICOM store.
  3. View the DICOM instance's metadata.

The Cloud Healthcare API implements the DICOMweb standard to store and access medical imaging data.

Create a DICOM store

DICOM stores exist inside datasets and contain DICOM instances.

Create a DICOM store using the gcloud healthcare dicom-stores create command:

gcloud healthcare dicom-stores create DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION

Replace the following:

  • DICOM_STORE_ID: an identifier for the DICOM store. The DICOM store ID must have the following:
    • A unique ID in its dataset
    • A Unicode string of 1-256 characters consisting of the following:
      • Numbers
      • Letters
      • Underscores
      • Dashes
      • Periods
  • DATASET_ID: the dataset ID
  • LOCATION: the dataset location

The output is the following:

Created dicomStore [DICOM_STORE_ID].
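
To confirm that the DICOM store was created, you can describe it with the gcloud healthcare dicom-stores describe command:

gcloud healthcare dicom-stores describe DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION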

Import a DICOM instance

Sample DICOM data is available in the gs://gcs-public-data--healthcare-nih-chest-xray Cloud Storage bucket.

Import the gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm DICOM instance using the gcloud healthcare dicom-stores import command:

gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm

Replace the following:

  • DICOM_STORE_ID: the DICOM store ID
  • DATASET_ID: the dataset ID
  • LOCATION: the dataset location

The output is the following:

Request issued for: [DICOM_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID
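
The import runs as a long-running operation. If you want to check an operation's status later, you can describe it with the gcloud healthcare operations describe command, passing the OPERATION_ID shown in the output:

gcloud healthcare operations describe OPERATION_ID \
  --dataset=DATASET_ID \
  --location=LOCATION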

View DICOM instance metadata

The gcloud tool does not support DICOMweb transactions, such as viewing or retrieving instances. Instead, you can use the DICOMweb command-line tool from Google. The DICOMweb command-line tool runs using Python. For information on how to set up Python on Google Cloud, see Setting up a Python development environment.

View the DICOM instance metadata:

  1. After setting up Python, install the DICOMweb command-line tool using pip:

    pip install https://github.com/GoogleCloudPlatform/healthcare-api-dicomweb-cli/archive/v1.0.zip
    
  2. View the DICOM instance's metadata:

    dcmweb \
      https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb \
      search instances
    

    Replace the following:

    • PROJECT_ID: the ID of your Google Cloud project
    • DICOM_STORE_ID: the DICOM store ID
    • DATASET_ID: the dataset ID
    • LOCATION: the dataset location

    The output is the following:

    [
      {
        "00080016": {
          "Value": [
            "1.2.840.10008.5.1.4.1.1.7"
          ],
          "vr": "UI"
        },
        "00080018": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.153751009835107614666834563294684339746480"
          ],
          "vr": "UI"
        },
        "00080060": {
          "Value": [
            "DX"
          ],
          "vr": "CS"
        },
        "00100020": {
          "Value": [
            "1"
          ],
          "vr": "LO"
        },
        "00100040": {
          "Value": [
            "M"
          ],
          "vr": "CS"
        },
        "0020000D": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.111396399361969898205364400549799252857604"
          ],
          "vr": "UI"
        },
        "0020000E": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.195628213694300498946760767481291263511724"
          ],
          "vr": "UI"
        },
        "00280010": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280011": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280100": {
          "Value": [
            8
          ],
          "vr": "US"
        }
      }
    ]
    

Now that you've stored a DICOM instance in the Cloud Healthcare API and viewed its metadata, continue to What's next for information on next steps, such as how to search for or retrieve DICOM images.

Store FHIR resources

This section shows how to complete the following tasks:

  1. Create a FHIR store.
  2. Import FHIR resources from a Cloud Storage bucket into the FHIR store.

Create a FHIR store

FHIR stores exist inside datasets and contain FHIR resources.

Create a FHIR store using the gcloud healthcare fhir-stores create command:

gcloud healthcare fhir-stores create FHIR_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --version=R4

Replace the following:

  • FHIR_STORE_ID: an identifier for the FHIR store. The FHIR store ID must have the following:
    • A unique ID in its dataset
    • A Unicode string of 1-256 characters consisting of the following:
      • Numbers
      • Letters
      • Underscores
      • Dashes
      • Periods
  • DATASET_ID: the dataset ID
  • LOCATION: the dataset location

The output is the following:

Created fhirStore [FHIR_STORE_ID].
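
To confirm that the FHIR store was created with the expected FHIR version, you can describe it with the gcloud healthcare fhir-stores describe command:

gcloud healthcare fhir-stores describe FHIR_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION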

Import FHIR resources

Sample FHIR data is available in the gs://gcp-public-data--synthea-fhir-data-10-patients Cloud Storage bucket. The bucket contains a directory, fhir_r4_ndjson/, which contains several types of FHIR resources.
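
If you want to preview the files that the import reads, you can list the contents of the directory with the gsutil ls command:

gsutil ls gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/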

Import the FHIR resources from the bucket into your FHIR store using the gcloud healthcare fhir-stores import command:

gcloud healthcare fhir-stores import gcs FHIR_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson \
  --content-structure=RESOURCE

Replace the following:

  • FHIR_STORE_ID: the FHIR store ID
  • DATASET_ID: the dataset ID
  • LOCATION: the dataset location

The output is the following:

Request issued for: [FHIR_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID
version: R4
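
Imports of larger FHIR datasets can take several minutes. To see the status of all long-running operations in the dataset, you can use the gcloud healthcare operations list command:

gcloud healthcare operations list \
  --dataset=DATASET_ID \
  --location=LOCATION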

Now that you've stored FHIR resources in the Cloud Healthcare API, continue to What's next for information on next steps, such as how to view and search for FHIR resources in the FHIR store.

Store an HL7v2 message

This section shows how to complete the following tasks:

  1. Create an HL7v2 store.
  2. Create a Cloud Storage bucket and copy an HL7v2 message to the bucket.
  3. Import the HL7v2 message from the Cloud Storage bucket into the HL7v2 store.

The HL7v2 implementation in the Cloud Healthcare API aligns with the HL7v2 standard.

Create an HL7v2 store

HL7v2 stores exist inside datasets and contain HL7v2 messages.

Create an HL7v2 store using the gcloud healthcare hl7v2-stores create command:

gcloud healthcare hl7v2-stores create HL7V2_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION

Replace the following:

  • HL7V2_STORE_ID: an identifier for the HL7v2 store. The HL7v2 store ID must have the following:
    • A unique ID in its dataset
    • A Unicode string of 1-256 characters consisting of the following:
      • Numbers
      • Letters
      • Underscores
      • Dashes
      • Periods
  • DATASET_ID: the dataset ID
  • LOCATION: the dataset location

The output is the following:

Created hl7v2Store [HL7V2_STORE_ID].
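
To confirm that the HL7v2 store was created, you can list the HL7v2 stores in the dataset with the gcloud healthcare hl7v2-stores list command:

gcloud healthcare hl7v2-stores list \
  --dataset=DATASET_ID \
  --location=LOCATION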

Import an HL7v2 message

Store a sample HL7v2 message file in a Cloud Storage bucket and then import it into your HL7v2 store:

  1. Download the sample HL7v2 message file to your machine. The file contains the following message, base-64 encoded in its data field (a sketch showing how to construct such a file appears after this list):

    MSH|^~\&|A|SEND_FACILITY|A|A|20180101000000||TYPE^A|20180101000000|T|0.0|||AA||00|ASCII
    EVN|A00|20180101040000
    PID||14^111^^^^MRN|11111111^^^^MRN~1111111111^^^^ORGNMBR
    
  2. If you don't already have a Cloud Storage bucket that you want to use to store the sample HL7v2 message, create a new bucket using the gsutil mb command:

    gsutil mb gs://BUCKET
    

    Replace BUCKET with a globally unique bucket name.

    The output is the following:

    Creating gs://BUCKET/...
    

    If the bucket name you chose is already in use, either by you or someone else, the command returns the following:

    Creating gs://BUCKET/...
    ServiceException: 409 Bucket BUCKET already exists.
    

    If the bucket name is already in use, try again with a different bucket name.

  3. Copy the sample HL7v2 message to the bucket using the gsutil cp command:

    gsutil cp hl7v2-sample-import.ndjson gs://BUCKET
    

    The output is the following:

    Copying file://hl7v2-sample-import.ndjson [Content-Type=application/octet-stream]...
    / [1 files][  241.0 B/  241.0 B]
    Operation completed over 1 objects/241.0 B.
    
  4. After copying the HL7v2 file to the bucket, import the HL7v2 message using the gcloud beta healthcare hl7v2-stores import command:

    gcloud beta healthcare hl7v2-stores import gcs HL7V2_STORE_ID \
      --dataset=DATASET_ID \
      --location=LOCATION \
      --gcs-uri=gs://BUCKET/hl7v2-sample-import.ndjson
    

    Replace the following:

    • HL7V2_STORE_ID: the HL7v2 store ID
    • DATASET_ID: the dataset ID
    • LOCATION: the dataset location
    • BUCKET: the name of the Cloud Storage bucket that contains the HL7v2 file

    The output is the following:

    Request issued for: [HL7V2_STORE_ID]
    Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
    name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID
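
After the import completes, you can list the messages in the store to confirm that the sample message arrived. This uses the hl7v2-messages command group, which this sketch assumes is available in your gcloud beta component:

gcloud beta healthcare hl7v2-messages list \
  --hl7v2-store=HL7V2_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION

As noted in step 1, the import file stores each message base-64 encoded in its data field, one JSON object per line. The following sketch shows one way to build a similar import file; it assumes a Linux shell with the GNU base64 utility and a hypothetical local file message.hl7 containing a raw HL7v2 message with carriage-return segment separators:

# Base-64 encode the raw message (without line wrapping) and wrap it in
# the newline-delimited JSON structure used by the import, with the
# encoded payload in the "data" field.
echo "{\"data\": \"$(base64 -w 0 message.hl7)\"}" > my-import.ndjson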
    

Now that you've stored an HL7v2 message in the Cloud Healthcare API, continue to What's next for information on next steps, such as how to view the contents of the HL7v2 message in its store.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this quickstart, clean up the resources that you created.

If you created a new project for this quickstart, follow the steps in Delete the project. Otherwise, follow the steps in Delete the dataset.

Delete the project

  1. In the Cloud Console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete the dataset

If you no longer need the dataset created in this quickstart, you can delete it. Deleting a dataset permanently deletes the dataset and any FHIR, HL7v2, or DICOM stores it contains.

  1. To delete a dataset, use the gcloud healthcare datasets delete command:

    gcloud healthcare datasets delete DATASET_ID \
      --location=LOCATION \
      --project=PROJECT_ID
    
  2. To confirm, type Y.

The output is the following:

Deleted dataset [DATASET_ID].


What's next

See the following sections for general information on the Cloud Healthcare API and how to perform tasks using the Cloud Console or curl and Windows PowerShell.

DICOM

Continue to the DICOM guide to review additional topics.

See the DICOM conformance statement for information on how the Cloud Healthcare API implements the DICOMweb standard.

FHIR

Continue to the FHIR guide to review additional topics.

See the FHIR conformance statement for information on how the Cloud Healthcare API implements the FHIR standard.

HL7v2

Continue to the HL7v2 guide to review additional topics.