Store healthcare data with the Google Cloud CLI

This page shows you how to use the Cloud Healthcare API and the Google Cloud CLI to complete the following tasks:

  1. Create a Cloud Healthcare API dataset.
  2. Create one of the following data stores inside the dataset:
    • Digital Imaging and Communications in Medicine (DICOM) store
    • Fast Healthcare Interoperability Resources (FHIR) store
    • Health Level Seven International Version 2 (HL7v2) store
  3. Store DICOM, FHIR, and HL7v2 data, and view DICOM metadata.

If you are only interested in working with one type of data store, you can skip directly to that section of the quickstart after completing the steps in Before you begin and Create a dataset.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Cloud project. Learn how to check if billing is enabled on a project.

  4. Enable the Cloud Healthcare API.

    Enable the API

Based on how you are using the Google Cloud CLI, complete one of the following steps:

  • If you are using Cloud Shell, go to the Google Cloud console and then click the Activate Cloud Shell button in the console window.

    Go to the Google Cloud console

    A Cloud Shell session opens inside a new frame in the console and displays a command-line prompt. The shell session might take a few minutes to initialize.

  • If you are using a Compute Engine virtual machine, open the virtual machine's terminal window.

  • If you are using the gcloud CLI on your machine, install and initialize the gcloud CLI.

Create a dataset

Datasets contain data stores, and data stores contain healthcare data. To use the Cloud Healthcare API, you must create at least one dataset.

Create a dataset using the gcloud healthcare datasets create command:

gcloud healthcare datasets create my-dataset \
    --location=us-central1 \
    --project=PROJECT_ID

Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

The output is the following:

Created dataset [my-dataset].
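Every resource you create in this quickstart is addressed by a hierarchical resource name: stores nest inside datasets, which nest inside a project and location. The following Python sketch shows how those paths compose; the helper function names are illustrative, not part of any Google client library, and the path shapes match the resource names shown in the command output in this quickstart:

```python
# Illustrative helpers for composing Cloud Healthcare API resource names.
# The path format follows the names printed by the gcloud commands in this
# quickstart; the helper names themselves are hypothetical.

def dataset_name(project_id: str, location: str, dataset_id: str) -> str:
    return f"projects/{project_id}/locations/{location}/datasets/{dataset_id}"

def dicom_store_name(project_id: str, location: str,
                     dataset_id: str, store_id: str) -> str:
    return f"{dataset_name(project_id, location, dataset_id)}/dicomStores/{store_id}"

print(dataset_name("my-project", "us-central1", "my-dataset"))
# projects/my-project/locations/us-central1/datasets/my-dataset
```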

To complete this quickstart, continue to one of the following sections:

Store and view a DICOM instance

This section shows how to complete the following tasks:

  1. Create a DICOM store.
  2. Import a DICOM instance from Cloud Storage into the DICOM store.
  3. View the DICOM instance's metadata.

The Cloud Healthcare API implements the DICOMweb standard to store and access medical imaging data.

Create a DICOM store

DICOM stores exist inside datasets and contain DICOM instances.

Create a DICOM store using the gcloud healthcare dicom-stores create command:

gcloud healthcare dicom-stores create my-dicom-store \
  --dataset=my-dataset \
  --location=us-central1

The output is the following:

Created dicomStore [my-dicom-store].

Import a DICOM instance

Import the gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm DICOM instance using the gcloud healthcare dicom-stores import command:

gcloud healthcare dicom-stores import gcs my-dicom-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm

The output is the following:

Request issued for: [my-dicom-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset

In this output:

  • PROJECT_ID, us-central1, my-dataset: the values that you provided when running the command
  • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API
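The import runs as a long-running operation: the gcloud CLI polls it for you until it completes, but if you call the REST API directly you poll the operation resource yourself until its `done` field is true. The following is a hedged sketch of that polling pattern; the `fetch` callable stands in for an authenticated GET of the operation resource and is not a real client-library function:

```python
import time
from typing import Callable

def wait_for_operation(fetch: Callable[[], dict],
                       interval_s: float = 1.0,
                       timeout_s: float = 300.0) -> dict:
    """Poll a long-running operation until its 'done' field is true.

    `fetch` is a caller-supplied callable that retrieves the operation
    resource (e.g. projects/.../operations/OPERATION_ID) and returns the
    parsed JSON as a dict.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        op = fetch()
        if op.get("done"):
            if "error" in op:
                raise RuntimeError(f"operation failed: {op['error']}")
            return op
        if time.monotonic() > deadline:
            raise TimeoutError("operation did not complete in time")
        time.sleep(interval_s)

# Demonstration with a stubbed fetch that completes on the third poll:
responses = iter([{"done": False}, {"done": False}, {"done": True}])
result = wait_for_operation(lambda: next(responses), interval_s=0.0)
print(result["done"])  # True
```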

View DICOM instance metadata

The gcloud CLI does not support DICOMweb transactions, such as viewing or retrieving instances. Instead, you can use the DICOMweb command-line tool from Google. The DICOMweb command-line tool runs using Python. For information on how to set up Python on Google Cloud, see Setting up a Python development environment.

View the DICOM instance metadata:

  1. After setting up Python, install the DICOMweb command-line tool using pip:

    pip install https://github.com/GoogleCloudPlatform/healthcare-api-dicomweb-cli/archive/v1.0.zip
    
  2. Update the PATH variable to include the dcmweb install location:

    export PATH="$HOME/bin:$PATH"
    
  3. View the DICOM instance's metadata:

    dcmweb \
      https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/dicomStores/my-dicom-store/dicomWeb \
      search instances
    

    Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

    The output is the following:

    [
      {
        "00080016": {
          "Value": [
            "1.2.840.10008.5.1.4.1.1.7"
          ],
          "vr": "UI"
        },
        "00080018": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.153751009835107614666834563294684339746480"
          ],
          "vr": "UI"
        },
        "00080060": {
          "Value": [
            "DX"
          ],
          "vr": "CS"
        },
        "00100020": {
          "Value": [
            "1"
          ],
          "vr": "LO"
        },
        "00100040": {
          "Value": [
            "M"
          ],
          "vr": "CS"
        },
        "0020000D": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.111396399361969898205364400549799252857604"
          ],
          "vr": "UI"
        },
        "0020000E": {
          "Value": [
            "1.3.6.1.4.1.11129.5.5.195628213694300498946760767481291263511724"
          ],
          "vr": "UI"
        },
        "00280010": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280011": {
          "Value": [
            1024
          ],
          "vr": "US"
        },
        "00280100": {
          "Value": [
            8
          ],
          "vr": "US"
        }
      }
    ]
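The metadata above uses the DICOM JSON model: each key is an eight-digit group/element tag, and `vr` is the value representation. The following sketch maps the tags shown in the output to their standard names from the DICOM data dictionary and extracts their values; the two-entry sample below is a trimmed illustration of the output above, not a live API response:

```python
import json

# Standard DICOM data-dictionary names for the tags in the output above.
TAG_NAMES = {
    "00080016": "SOP Class UID",
    "00080018": "SOP Instance UID",
    "00080060": "Modality",
    "00100020": "Patient ID",
    "00100040": "Patient Sex",
    "0020000D": "Study Instance UID",
    "0020000E": "Series Instance UID",
    "00280010": "Rows",
    "00280011": "Columns",
    "00280100": "Bits Allocated",
}

# A trimmed sample of the DICOM JSON returned by the search above.
metadata = json.loads(
    '[{"00080060": {"Value": ["DX"], "vr": "CS"},'
    '  "00280010": {"Value": [1024], "vr": "US"}}]'
)

for instance in metadata:
    for tag, attr in instance.items():
        name = TAG_NAMES.get(tag, tag)
        print(f"{name}: {attr['Value'][0]} (VR={attr['vr']})")
# Modality: DX (VR=CS)
# Rows: 1024 (VR=US)
```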
    

Now that you've imported a DICOM instance into the Cloud Healthcare API and viewed its metadata, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.

For information on next steps, such as how to search for or retrieve DICOM images, see What's next.

Store FHIR resources

This section shows how to complete the following tasks:

  1. Create a FHIR store.
  2. Import FHIR resources from a Cloud Storage bucket into the FHIR store.

Create a FHIR store

FHIR stores exist inside datasets and contain FHIR resources.

Create a FHIR store using the gcloud healthcare fhir-stores create command:

gcloud healthcare fhir-stores create my-fhir-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --version=R4

The output is the following:

Created fhirStore [my-fhir-store].

Import FHIR resources

Import the FHIR resources from the gs://gcp-public-data--synthea-fhir-data-10-patients bucket into your FHIR store using the gcloud healthcare fhir-stores import command:

gcloud healthcare fhir-stores import gcs my-fhir-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson \
  --content-structure=RESOURCE

The output is the following:

Request issued for: [my-fhir-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset
version: R4

In this output:

  • PROJECT_ID, us-central1, my-dataset: the values that you provided when running the command
  • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API
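The `--content-structure=RESOURCE` flag tells the import that each line of the input file is one complete FHIR resource serialized as JSON (newline-delimited JSON). The following is a minimal sketch of producing a file in that shape; the Patient fields are illustrative FHIR R4 content, not data from the Synthea bucket:

```python
import json

# Two minimal FHIR R4 Patient resources (illustrative data only).
patients = [
    {"resourceType": "Patient", "id": "example-1", "gender": "female"},
    {"resourceType": "Patient", "id": "example-2", "gender": "male"},
]

# NDJSON: one JSON-serialized resource per line, no enclosing array.
ndjson = "\n".join(json.dumps(p) for p in patients)
print(ndjson)

# Round-trip check: every line parses back to a standalone resource.
for line in ndjson.splitlines():
    assert json.loads(line)["resourceType"] == "Patient"
```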

Now that you've imported FHIR resources into the Cloud Healthcare API, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.

For information on next steps, such as how to view and search for FHIR resources, see What's next.

Store an HL7v2 message

This section shows how to complete the following tasks:

  1. Create an HL7v2 store.
  2. Create a Cloud Storage bucket and copy an HL7v2 message to the bucket.
  3. Import the HL7v2 message from the Cloud Storage bucket into the HL7v2 store.

The HL7v2 implementation in the Cloud Healthcare API aligns with the HL7v2 standard.
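An HL7v2 message is a series of pipe-delimited segments, beginning with an MSH header segment that carries the message type and version. The following sketch splits one apart; the message content is illustrative, and the numeric indexes are list positions after splitting on the field separator, not formal HL7 field numbers:

```python
# Parse the segments and fields of a minimal, illustrative HL7v2 message.
# Segments are separated by carriage returns; fields by "|", the character
# immediately after "MSH".
message = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECV_APP|RECV_FAC|20240101120000||"
    "ADT^A01|MSG00001|P|2.3\r"
    "PID|1||12345||DOE^JOHN"
)

segments = [seg.split("|") for seg in message.split("\r") if seg]
msh = segments[0]
print(msh[8])   # message type: ADT^A01
print(msh[11])  # version: 2.3
```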

Create an HL7v2 store

HL7v2 stores exist inside datasets and contain HL7v2 messages.

Create an HL7v2 store using the gcloud healthcare hl7v2-stores create command:

gcloud healthcare hl7v2-stores create my-hl7v2-store \
  --dataset=my-dataset \
  --location=us-central1

The output is the following:

Created hl7v2Store [my-hl7v2-store].

Import HL7v2 messages

Import the HL7v2 messages in gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson into your HL7v2 store using the gcloud beta healthcare hl7v2-stores import command:

gcloud beta healthcare hl7v2-stores import gcs my-hl7v2-store \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson

The output is the following:

Request issued for: [my-hl7v2-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/hl7V2Stores/my-hl7v2-store

In this output:

  • PROJECT_ID, us-central1, my-dataset, my-hl7v2-store: the values that you provided when running the command
  • OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API

Now that you've imported an HL7v2 message into the Cloud Healthcare API, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.

For information on next steps, such as how to view the contents of an HL7v2 message, see What's next.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

If you created a new project for this quickstart, follow the steps in Delete the project. Otherwise, follow the steps in Delete the dataset.

Delete the project

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete the dataset

If you no longer need the dataset created in this quickstart, you can delete it. Deleting a dataset permanently deletes the dataset and any FHIR, HL7v2, or DICOM stores it contains.

  1. To delete a dataset, use the gcloud healthcare datasets delete command:

    gcloud healthcare datasets delete my-dataset \
      --location=us-central1 \
      --project=PROJECT_ID
    

    Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.

  2. To confirm, type Y.

The output is the following:

Deleted dataset [my-dataset].

What's next

See the following sections for general information on the Cloud Healthcare API and how to perform tasks using the Google Cloud console or curl and Windows PowerShell.

DICOM

Continue to the DICOM guide for more in-depth DICOM topics.

See the DICOM conformance statement for information on how the Cloud Healthcare API implements the DICOMweb standard.

FHIR

Continue to the FHIR guide for more in-depth FHIR topics.

See the FHIR conformance statement for information on how the Cloud Healthcare API implements the FHIR standard.

HL7v2

Continue to the HL7v2 guide for more in-depth HL7v2 topics.