Store healthcare data with the Google Cloud CLI
This page shows you how to use the Cloud Healthcare API and the Google Cloud CLI to complete the following tasks:
- Create a Cloud Healthcare API dataset.
- Create one of the following data stores inside the dataset:
  - Digital Imaging and Communications in Medicine (DICOM) store
  - Fast Healthcare Interoperability Resources (FHIR) store
  - Health Level Seven International Version 2 (HL7v2) store
- Store DICOM, FHIR, and HL7v2 data, and view DICOM metadata.
If you're only interested in working with one type of data store, you can skip directly to that section of the quickstart after completing the steps in Before you begin and Create a dataset.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- Install the Google Cloud CLI.
- To initialize the gcloud CLI, run the following command:
  gcloud init
- Create or select a Google Cloud project.
  - Create a Google Cloud project:
    gcloud projects create PROJECT_ID
    Replace PROJECT_ID with a name for the Google Cloud project you are creating.
  - Select the Google Cloud project that you created:
    gcloud config set project PROJECT_ID
    Replace PROJECT_ID with your Google Cloud project name.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the Cloud Healthcare API:
  gcloud services enable healthcare.googleapis.com
- Grant roles to your user account. Run the following command once for each of the following IAM roles: roles/healthcare.datasetAdmin, roles/healthcare.fhirStoreAdmin, roles/healthcare.dicomStoreAdmin, roles/healthcare.hl7V2StoreAdmin. A filled-in example follows this list.
  gcloud projects add-iam-policy-binding PROJECT_ID --member="USER_IDENTIFIER" --role=ROLE
  - Replace PROJECT_ID with your project ID.
  - Replace USER_IDENTIFIER with the identifier for your user account. For example, user:myemail@example.com.
  - Replace ROLE with each individual role.
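For example, assuming a hypothetical project ID of my-project and the account user:myemail@example.com, the binding for the dataset administrator role would look like the following (Linux, macOS, or Cloud Shell syntax); repeat it for the other three roles, changing only the --role value:

# Hypothetical values; substitute your own project ID and user account.
gcloud projects add-iam-policy-binding my-project \
  --member="user:myemail@example.com" \
  --role=roles/healthcare.datasetAdmin

You can confirm the resulting bindings with gcloud projects get-iam-policy my-project.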
Create a dataset
Datasets contain data stores, and data stores contain healthcare data. To use the Cloud Healthcare API, you must create at least one dataset.
The following sample shows how to create a dataset named my-dataset in the us-central1 region. You use the dataset throughout this quickstart to create DICOM stores, FHIR stores, and HL7v2 stores.
gcloud
Create a dataset using the gcloud healthcare datasets create command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare datasets create my-dataset \
  --project=PROJECT_ID \
  --location=us-central1
Windows (PowerShell)
gcloud healthcare datasets create my-dataset `
  --project=PROJECT_ID `
  --location=us-central1
Windows (cmd.exe)
gcloud healthcare datasets create my-dataset ^
  --project=PROJECT_ID ^
  --location=us-central1
You should receive a response similar to the following:
Response
Create request issued for: [my-dataset]
Created dataset [my-dataset].
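The dataset is now ready to hold data stores. If you want to confirm that it exists before continuing, one optional check is to describe it (Linux, macOS, or Cloud Shell syntax; replace PROJECT_ID as above):

# Optional: confirm the dataset exists. Not required by this quickstart.
gcloud healthcare datasets describe my-dataset \
  --project=PROJECT_ID \
  --location=us-central1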
To complete this quickstart, choose one of the following sections:
Store and view a DICOM instance
This section shows how to complete the following tasks:
- Create a DICOM store.
- Import a DICOM instance from a public Cloud Storage bucket into the DICOM store.
- View the DICOM instance's metadata.
The Cloud Healthcare API implements the DICOMweb standard to store and access medical imaging data.
Create a DICOM store
DICOM stores exist inside datasets and contain DICOM instances. The following sample shows how to create a DICOM store named my-dicom-store.
gcloud
Create a DICOM store using the gcloud healthcare dicom-stores create command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare dicom-stores create my-dicom-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1
Windows (PowerShell)
gcloud healthcare dicom-stores create my-dicom-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1
Windows (cmd.exe)
gcloud healthcare dicom-stores create my-dicom-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1
You should receive a response similar to the following:
Response
Created dicomStore [my-dicom-store].
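Optionally, you can confirm the store before importing data by listing the DICOM stores in the dataset (Linux, macOS, or Cloud Shell syntax; replace PROJECT_ID as above):

# Optional: list the DICOM stores in my-dataset.
gcloud healthcare dicom-stores list \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1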
Import a DICOM instance
Sample DICOM data is available in the gs://gcs-public-data--healthcare-nih-chest-xray Cloud Storage bucket.
gcloud
Import the gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm instance using the gcloud healthcare dicom-stores import command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare dicom-stores import gcs my-dicom-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm
Windows (PowerShell)
gcloud healthcare dicom-stores import gcs my-dicom-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1 `
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm
Windows (cmd.exe)
gcloud healthcare dicom-stores import gcs my-dicom-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1 ^
  --gcs-uri=gs://gcs-public-data--healthcare-nih-chest-xray/dicom/00000001_000.dcm
In this output:
- PROJECT_ID, us-central1, my-dataset, my-dicom-store: the values you provided when running the command
- OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API when you import a DICOM instance. Long-running operations are returned when method calls might take a long time to complete. Importing one DICOM instance is usually a quick operation, so the output is returned almost immediately.
Response
Request issued for: [my-dicom-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/dicomStores/my-dicom-store
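For larger imports that don't finish immediately, you can monitor the long-running operation yourself. A minimal sketch, using the OPERATION_ID reported in the output (Linux, macOS, or Cloud Shell syntax):

# List recent operations in the dataset, then inspect one by its ID.
gcloud healthcare operations list \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1

gcloud healthcare operations describe OPERATION_ID \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1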
View DICOM instance metadata
The gcloud CLI doesn't support DICOMweb transactions, such as viewing or retrieving instances. Instead, you can use the DICOMweb command-line tool from Google, which runs using Python. For information on how to set up Python on Google Cloud, see Setting up a Python development environment.
Complete the following steps to view the DICOM instance metadata using the DICOMweb command-line tool:
Install the DICOMweb command-line tool using Pip:
pip install https://github.com/GoogleCloudPlatform/healthcare-api-dicomweb-cli/archive/v1.0.zip
Update the PATH variable to include the dcmweb install location:
export PATH="$HOME/bin:$PATH"
View the DICOM instance's metadata:
dcmweb \
  https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/dicomStores/my-dicom-store/dicomWeb \
  search instances
Replace PROJECT_ID with the ID of the Google Cloud project that you created or selected in Before you begin.
The output is the following. See Attributes of the SOP Common Module for the fields in the output.
[ { "00080016": { "Value": [ "1.2.840.10008.5.1.4.1.1.7" ], "vr": "UI" }, "00080018": { "Value": [ "1.3.6.1.4.1.11129.5.5.153751009835107614666834563294684339746480" ], "vr": "UI" }, "00080060": { "Value": [ "DX" ], "vr": "CS" }, "00100020": { "Value": [ "1" ], "vr": "LO" }, "00100040": { "Value": [ "M" ], "vr": "CS" }, "0020000D": { "Value": [ "1.3.6.1.4.1.11129.5.5.111396399361969898205364400549799252857604" ], "vr": "UI" }, "0020000E": { "Value": [ "1.3.6.1.4.1.11129.5.5.195628213694300498946760767481291263511724" ], "vr": "UI" }, "00280010": { "Value": [ 1024 ], "vr": "US" }, "00280011": { "Value": [ 1024 ], "vr": "US" }, "00280100": { "Value": [ 8 ], "vr": "US" } } ]
After importing the DICOM instance into the Cloud Healthcare API and viewing its metadata, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.
For information on next steps, such as how to search for or retrieve DICOM images using the DICOMweb standard in the Cloud Healthcare API, see What's next.
Store FHIR resources
This section shows how to complete the following tasks:
- Create a FHIR store.
- Import FHIR resources from a public Cloud Storage bucket into the FHIR store.
Create a FHIR store
FHIR stores exist inside datasets and contain FHIR resources. The following sample shows how to create a FHIR store named my-fhir-store that uses FHIR version R4.
gcloud
Create a FHIR store using the gcloud healthcare fhir-stores create command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare fhir-stores create my-fhir-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1 \
  --version=R4
Windows (PowerShell)
gcloud healthcare fhir-stores create my-fhir-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1 `
  --version=R4
Windows (cmd.exe)
gcloud healthcare fhir-stores create my-fhir-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1 ^
  --version=R4
You should receive a response similar to the following:
Response
Created fhirStore [my-fhir-store].
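To double-check that the store uses the FHIR version you expect, you can optionally describe it; the output should include version: R4 (Linux, macOS, or Cloud Shell syntax; replace PROJECT_ID as above):

# Optional: confirm the FHIR store and its version.
gcloud healthcare fhir-stores describe my-fhir-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1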
Import FHIR resources
Sample FHIR data is available in the gs://gcp-public-data--synthea-fhir-data-10-patients Cloud Storage bucket.
gcloud
Import the FHIR resources in gs://gcp-public-data--synthea-fhir-data-10-patients using the gcloud healthcare fhir-stores import command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare fhir-stores import gcs my-fhir-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson \
  --content-structure=RESOURCE
Windows (PowerShell)
gcloud healthcare fhir-stores import gcs my-fhir-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1 `
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson `
  --content-structure=RESOURCE
Windows (cmd.exe)
gcloud healthcare fhir-stores import gcs my-fhir-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1 ^
  --gcs-uri=gs://gcp-public-data--synthea-fhir-data-10-patients/fhir_r4_ndjson/*.ndjson ^
  --content-structure=RESOURCE
In this output:
- PROJECT_ID, us-central1, my-dataset, my-fhir-store: the values you provided when running the command
- OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API when you import a FHIR resource. Long-running operations are returned when method calls might take a long time to complete. Importing the FHIR resources takes about one to two minutes.
- R4: the FHIR store version
Response
Request issued for: [my-fhir-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/fhirStores/my-fhir-store
version: R4
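The gcloud commands used in this quickstart don't search FHIR resources, but if you want a quick confirmation that the import worked, one option is to query the store's FHIR search endpoint with curl and a gcloud access token (assumes curl is available and your account can read the store; replace PROJECT_ID as before):

# Search for Patient resources imported into the FHIR store.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/fhirStores/my-fhir-store/fhir/Patient"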
After importing the FHIR resources into the Cloud Healthcare API, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.
For information on next steps, such as how to view and search for FHIR resources, see What's next.
Store an HL7v2 message
This section shows how to complete the following tasks:
- Create an HL7v2 store.
- Import an HL7v2 message from a public Cloud Storage bucket into the HL7v2 store.
The HL7v2 implementation in the Cloud Healthcare API aligns with the HL7v2 standard.
Create an HL7v2 store
HL7v2 stores exist inside datasets and contain HL7v2 messages. The following sample shows how to create an HL7v2 store named my-hl7v2-store.
gcloud
Create an HL7v2 store using the gcloud healthcare hl7v2-stores create command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare hl7v2-stores create my-hl7v2-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1
Windows (PowerShell)
gcloud healthcare hl7v2-stores create my-hl7v2-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1
Windows (cmd.exe)
gcloud healthcare hl7v2-stores create my-hl7v2-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1
You should receive a response similar to the following:
Response
Created hl7v2Store [my-hl7v2-store].
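As with the other stores, you can optionally confirm the HL7v2 store exists before importing messages (Linux, macOS, or Cloud Shell syntax; replace PROJECT_ID as above):

# Optional: list the HL7v2 stores in my-dataset.
gcloud healthcare hl7v2-stores list \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1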
Import HL7v2 messages
gcloud
Import the gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson HL7v2 message using the gcloud healthcare hl7v2-stores import command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare hl7v2-stores import gcs my-hl7v2-store \
  --project=PROJECT_ID \
  --dataset=my-dataset \
  --location=us-central1 \
  --gcs-uri=gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson
Windows (PowerShell)
gcloud healthcare hl7v2-stores import gcs my-hl7v2-store `
  --project=PROJECT_ID `
  --dataset=my-dataset `
  --location=us-central1 `
  --gcs-uri=gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson
Windows (cmd.exe)
gcloud healthcare hl7v2-stores import gcs my-hl7v2-store ^
  --project=PROJECT_ID ^
  --dataset=my-dataset ^
  --location=us-central1 ^
  --gcs-uri=gs://cloud-samples-data/healthcare/hl7v2/messages.ndjson
In this output:
- PROJECT_ID, us-central1, my-dataset, my-hl7v2-store: the values you provided when running the command
- OPERATION_ID: an identifier for the long-running operation provided by the Cloud Healthcare API when you import an HL7v2 message. Long-running operations are returned when method calls might take a long time to complete. Importing one HL7v2 message is usually a quick operation, so the output is returned almost immediately.
Response
Request issued for: [my-hl7v2-store]
Waiting for operation [projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/us-central1/datasets/my-dataset/hl7V2Stores/my-hl7v2-store
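If your gcloud release includes the hl7v2-messages command group, you can list the message IDs now stored in the store as a quick check (a sketch; availability and flag names assumed from current gcloud releases):

# List messages stored in my-hl7v2-store.
gcloud healthcare hl7v2-messages list \
  --project=PROJECT_ID \
  --hl7v2-store=my-hl7v2-store \
  --dataset=my-dataset \
  --location=us-central1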
After importing the HL7v2 message into the Cloud Healthcare API, continue to Clean up to avoid incurring charges to your Google Cloud account for the resources used in this page.
For information on next steps, such as how to view the contents of an HL7v2 message, see What's next.
Clean up
To avoid incurring charges to your Google Cloud account for the resources used on this page, delete the Google Cloud project with the resources.
If you created a new project for this quickstart, follow the steps in Delete the project. Otherwise, follow the steps in Delete the dataset.
Optional: Revoke credentials from the gcloud CLI.
gcloud auth revoke
Delete the project
Delete a Google Cloud project:
gcloud projects delete PROJECT_ID
Delete the dataset
If you no longer need the dataset created in this quickstart, you can delete it. Deleting a dataset permanently deletes the dataset and any FHIR, HL7v2, or DICOM stores it contains.
gcloud
Delete a dataset using the gcloud healthcare datasets delete command.
Before using any of the command data below, make the following replacements:
- PROJECT_ID: the ID of the Google Cloud project that you created or selected in Before you begin
Execute the following command:
Linux, macOS, or Cloud Shell
gcloud healthcare datasets delete my-dataset \
  --project=PROJECT_ID \
  --location=us-central1
Windows (PowerShell)
gcloud healthcare datasets delete my-dataset `
  --project=PROJECT_ID `
  --location=us-central1
Windows (cmd.exe)
gcloud healthcare datasets delete my-dataset ^
  --project=PROJECT_ID ^
  --location=us-central1
Response
You are about to delete dataset [my-dataset]
Do you want to continue (Y/n)?  Y
Deleted dataset [my-dataset].
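If you want to confirm the deletion, listing the datasets in the location should no longer show my-dataset (Linux, macOS, or Cloud Shell syntax; replace PROJECT_ID as above):

# Optional: verify that my-dataset is gone.
gcloud healthcare datasets list \
  --project=PROJECT_ID \
  --location=us-central1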
What's next
See the following sections for general information on the Cloud Healthcare API and how to perform the tasks in this quickstart using another interface:
- Read an overview of Cloud Healthcare API concepts
- Store healthcare data with curl or PowerShell
- Store healthcare data with client libraries
DICOM
- Create and manage DICOM stores
- Connect a PACS to the Cloud Healthcare API
- Use the DICOMweb standard
- Import and export DICOM data using Cloud Storage
See the DICOM conformance statement for information on how the Cloud Healthcare API implements the DICOMweb standard.
FHIR
- Create and manage FHIR stores
- Create and manage FHIR resources
- Import and export FHIR data using Cloud Storage
See the FHIR conformance statement for information on how the Cloud Healthcare API implements the FHIR standard.