Quickstart using the gcloud command-line tool

This page explains how to use the Cloud Healthcare API with the gcloud command-line tool to complete the following tasks:

  1. Create a Cloud Healthcare API dataset.
  2. Create one of the following data stores inside the dataset:
    • Digital Imaging and Communications in Medicine (DICOM) store
    • Fast Healthcare Interoperability Resources (FHIR) store
    • Health Level Seven International Version 2 (HL7v2) store
  3. Store DICOM, FHIR, and HL7v2 data, and view DICOM metadata.

If you are only interested in working with one type of data store, you can skip directly to that section of the quickstart after completing the steps in Before you begin and Create a dataset.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the Cloud Healthcare API.

    Enable the API

Depending on how you use the gcloud command-line tool, complete one of the following steps:

  • If you are using Cloud Shell, go to the Google Cloud Console and then click the Activate Cloud Shell button in the console window.

    Go to the Google Cloud Console

    A Cloud Shell session opens inside a new frame in the console and displays a command-line prompt. The shell session might take a few minutes to initialize.

  • If you are using a Compute Engine virtual machine, open the virtual machine's terminal window.

  • If you are using the gcloud tool on your machine, install and initialize the Cloud SDK.
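Whichever environment you choose, the gcloud tool needs credentials, a default project, and the API enabled. A minimal sketch of that one-time setup (replace PROJECT_ID with your project ID; you can skip gcloud auth login in Cloud Shell, which is already authenticated):

```shell
# Authenticate the gcloud tool (opens a browser-based sign-in flow).
gcloud auth login

# Set the default project so later commands can omit --project.
gcloud config set project PROJECT_ID

# Enable the Cloud Healthcare API for the project.
gcloud services enable healthcare.googleapis.com
```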

Create a dataset

Datasets contain data stores, and data stores contain healthcare data. To use the Cloud Healthcare API, you must create at least one dataset.

Create a dataset using the gcloud healthcare datasets create command:

gcloud healthcare datasets create my-dataset \
    --location=us-central1 \
    --project=PROJECT_ID

Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

The output is the following:

Created dataset [my-dataset].
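To confirm that the dataset exists, you can describe it. This is an optional check, not a required step:

```shell
# Print the dataset's full resource name and properties.
gcloud healthcare datasets describe my-dataset \
  --location=us-central1
```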

To complete this quickstart, choose one of the following sections:

Store and view a DICOM instance

This section shows how to complete the following tasks:

  1. Create a DICOM store.
  2. Import a DICOM instance from Cloud Storage into the DICOM store.
  3. View the DICOM instance's metadata.

The Cloud Healthcare API implements the DICOMweb standard to store and access medical imaging data.

Create a DICOM store

DICOM stores exist inside datasets and contain DICOM instances.

Create a DICOM store using the gcloud healthcare dicom-stores create command:

gcloud healthcare dicom-stores create my-dicom-store \
  --dataset=my-dataset \
  --location=us-central1

The output is the following:

Created dicomStore [my-dicom-store].

Import a DICOM instance

Import the gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm DICOM instance using the gcloud healthcare dicom-stores import command:

gcloud healthcare dicom-stores import gcs my-dicom-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm

The output is the following:

Request issued for: [my-dicom-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset

In this output:

  • PROJECT_ID, us-central1, my-dataset: the values that you provided when running the command
  • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API
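Imports run as long-running operations, so you can also check the status yourself. A sketch, assuming OPERATION_ID is the identifier shown in the output above:

```shell
# Describe the long-running operation; "done: true" appears on completion.
gcloud healthcare operations describe OPERATION_ID \
  --dataset=my-dataset \
  --location=us-central1
```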

View DICOM instance metadata

The gcloud tool does not support DICOMweb transactions, such as viewing or retrieving instances. Instead, you can use the DICOMweb command-line tool from Google. The DICOMweb command-line tool runs using Python. For information on how to set up Python on Google Cloud, see Setting up a Python development environment.

View the DICOM instance metadata:

  1. After setting up Python, install the DICOMweb command-line tool using pip:

    pip install https://github.com/GoogleCloudPlatform/healthcare-api-dicomweb-cli/archive/v1.0.zip
    
  2. View the DICOM instance's metadata:

    dcmweb \
      https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/dicomStores/my-dicom-store/dicomWeb \
      search instances
    

    Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

    The output is the following:

    [
      {
        "00080016": {
          "Value": [
            "1.2.840.10008.5.1.4.1.1.7"
          ],
          "vr": "UI"
        },
        "00080018": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.153751009835107614666834563294684339746480"
          ],
          "vr": "UI"
        },
        "00080060": {
          "Value": [
            "DX"
          ],
          "vr": "CS"
        },
        "00100020": {
          "Value": [
            "1"
          ],
          "vr": "LO"
        },
        "00100040": {
          "Value": [
            "M"
          ],
          "vr": "CS"
        },
        "0020000D": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.111396399361969898205364400549799252857604"
          ],
          "vr": "UI"
        },
        "0020000E": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.195628213694300498946760767481291263511724"
          ],
          "vr": "UI"
        },
        "00280010": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280011": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280100": {
          "Value": [
            8
          ],
          "vr": "US"
        }
      }
    ]
    
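If you'd rather not install the Python tool, the same instance search can be issued directly against the DICOMweb endpoint with curl. A sketch, with PROJECT_ID replaced as before:

```shell
# Search for instances in the DICOM store over the DICOMweb REST API.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/dicomStores/my-dicom-store/dicomWeb/instances"
```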

Now that you've imported a DICOM instance into the Cloud Healthcare API and viewed its metadata, continue to What's next for information on next steps, such as how to search for or retrieve DICOM images.

Store FHIR resources

This section shows how to complete the following tasks:

  1. Create a FHIR store.
  2. Import FHIR resources from a Cloud Storage bucket into the FHIR store.

Create a FHIR store

FHIR stores exist inside datasets and contain FHIR resources.

Create a FHIR store using the gcloud healthcare fhir-stores create command:

gcloud healthcare fhir-stores create my-fhir-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --version=R4

The output is the following:

Created fhirStore [my-fhir-store].

Import FHIR resources

Import the FHIR resources from the gs://gcp-public-data--synthea-fhir-data-10-patients bucket into your FHIR store using the gcloud healthcare fhir-stores import command:

gcloud healthcare fhir-stores import gcs my-fhir-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson \
  --content-structure=RESOURCE

The output is the following:

Request issued for: [my-fhir-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset
version: R4

In this output:

  • PROJECT_ID, us-central1, my-dataset: the values that you provided when running the command
  • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API
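The gcloud tool doesn't read FHIR resources back, but the REST API does. As a quick check that the import worked, you could search the store for Patient resources with curl. A sketch, with PROJECT_ID replaced as before:

```shell
# Search for up to five Patient resources in the FHIR store.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/fhirStores/my-fhir-store/fhir/Patient?_count=5"
```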

Now that you've imported FHIR resources into the Cloud Healthcare API, continue to What's next for information on next steps, such as how to view and search for FHIR resources in the FHIR store.

Store an HL7v2 message

This section shows how to complete the following tasks:

  1. Create an HL7v2 store.
  2. Create a Cloud Storage bucket and copy an HL7v2 message to the bucket.
  3. Import the HL7v2 message from the Cloud Storage bucket into the HL7v2 store.

The HL7v2 implementation in the Cloud Healthcare API aligns with the HL7v2 standard.

Create an HL7v2 store

HL7v2 stores exist inside datasets and contain HL7v2 messages.

Create an HL7v2 store using the gcloud healthcare hl7v2-stores create command:

gcloud healthcare hl7v2-stores create my-hl7v2-store \
  --dataset=my-dataset \
  --location=us-central1

The output is the following:

Created hl7v2Store [my-hl7v2-store].

Import an HL7v2 message

Store an HL7v2 sample message file in a Cloud Storage bucket and then import the sample message file into your HL7v2 store:

  1. Download the sample HL7v2 message file to your machine. The message contains the following basic information, which is base64-encoded in the data field of the sample file:

    MSH|^~\&|A|SEND_FACILITY|A|A|20180101000000||TYPE^A|20180101000000|T|0.0|||AA||00|ASCII
    EVN|A00|20180101040000
    PID||14^111^^^^MRN|11111111^^^^MRN~1111111111^^^^ORGNMBR
    
  2. If you don't already have a Cloud Storage bucket that you want to use to store the sample HL7v2 message, create a new bucket using the gsutil mb command:

    gsutil mb gs://BUCKET
    

    Replace BUCKET with your own globally unique bucket name.

    The output is the following:

    Creating gs://BUCKET/...
    

    If the bucket name you chose is already in use, either by you or someone else, the command returns the following:

    Creating gs://BUCKET/...
    ServiceException: 409 Bucket BUCKET already exists.
    

    If the bucket name is already in use, try again with a different bucket name.

  3. Copy the sample HL7v2 message to the bucket using the gsutil cp command:

    gsutil cp hl7v2-sample-import.ndjson gs://BUCKET
    

    Replace BUCKET with the bucket you created or selected in the previous step.

    The output is the following:

    Copying file://hl7v2-sample-import.ndjson [Content-Type=application/octet-stream]...
    / [1 files][  241.0 B/  241.0 B]
    Operation completed over 1 objects/241.0 B.
    
  4. After copying the HL7v2 file to the bucket, import the HL7v2 message using the gcloud beta healthcare hl7v2-stores import command:

    gcloud beta healthcare hl7v2-stores import gcs my-hl7v2-store \
      --dataset=my-dataset \
      --location=us-central1 \
      --gcs-uri=gs://BUCKET/hl7v2-sample-import.ndjson
    

    Replace BUCKET with the name of the Cloud Storage bucket that contains the HL7v2 file.

    The output is the following:

    Request issued for: [my-hl7v2-store]
    Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
    name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset
    

    In this output:

    • PROJECT_ID, us-central1, my-dataset: the values that you provided when running the command
    • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API
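The base64 encoding described in step 1 is easy to reproduce locally. A sketch that builds one line of such an import file from two of the decoded segments shown above (illustrative only; the downloaded sample file already contains the encoded message):

```shell
# HL7v2 separates segments with carriage returns; printf expands \r.
msg="$(printf 'EVN|A00|20180101040000\rPID||14^111^^^^MRN|11111111^^^^MRN~1111111111^^^^ORGNMBR')"

# Base64-encode the raw message into the "data" field of one ndjson line.
data="$(printf '%s' "$msg" | base64 | tr -d '\n')"
printf '{"data": "%s"}\n' "$data"

# Decoding the field recovers the original message.
[ "$(printf '%s' "$data" | base64 --decode)" = "$msg" ] && echo round-trip-ok
```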

Now that you've imported an HL7v2 message into the Cloud Healthcare API, continue to What's next for information on next steps, such as how to view the contents of the HL7v2 message in its store.
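To confirm that the message arrived, you can list the messages in the store. The sketch below assumes a reasonably recent gcloud release that includes the hl7v2-messages command group:

```shell
# List messages in the HL7v2 store; each entry includes its resource name.
gcloud healthcare hl7v2-messages list \
  --hl7v2-store=my-hl7v2-store \
  --dataset=my-dataset \
  --location=us-central1
```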

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this page, follow these steps.

If you created a new project for this quickstart, follow the steps in Delete the project. Otherwise, follow the steps in Delete the dataset.

Delete the project

  1. In the Cloud Console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete the dataset

If you no longer need the dataset created in this quickstart, you can delete it. Deleting a dataset permanently deletes the dataset and any FHIR, HL7v2, or DICOM stores it contains.

  1. To delete a dataset, use the gcloud healthcare datasets delete command:

    gcloud healthcare datasets delete my-dataset \
      --location=us-central1 \
      --project=PROJECT_ID
    

    Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

  2. To confirm, type Y.

The output is the following:

Deleted dataset [my-dataset].

What's next

See the following sections for general information on the Cloud Healthcare API and how to perform tasks using the Cloud Console or curl and Windows PowerShell.

DICOM

Continue to the DICOM guide for additional topics.

See the DICOM conformance statement for information on how the Cloud Healthcare API implements the DICOMweb standard.

FHIR

Continue to the FHIR guide for additional topics.

See the FHIR conformance statement for information on how the Cloud Healthcare API implements the FHIR standard.

HL7v2

Continue to the HL7v2 guide for additional topics.