This page explains how to export DICOM instances to and import DICOM objects from Cloud Storage. A DICOM instance is typically an image, but can be another type of persistent data such as a structured report. A Cloud Storage object is a DICOM instance that resides in Cloud Storage.
You can import and export bulk data between a Cloud Storage bucket and a DICOM store. For example, you might have many DICOM instance files that you want to import into a DICOM store. Rather than programmatically storing the data directly, you can store the data in a Cloud Storage bucket and then import the files into a DICOM store using a single import operation. For more information, see Cloud Storage.
To store a DICOM instance directly, such as from your local machine, you can store the DICOM data using the Store Transaction RESTful web service as implemented in the Cloud Healthcare API. To retrieve a single instance or study from a DICOM store, you can retrieve DICOM data using the Retrieve Transaction RESTful web service as implemented in the Cloud Healthcare API.
Setting Cloud Storage permissions
Before exporting and importing DICOM data to and from Cloud Storage, you must grant extra permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.
Importing DICOM objects
The following samples show how to import DICOM objects from a Cloud Storage bucket.
Console
To import DICOM objects from a Cloud Storage bucket, complete the following steps:
In the Cloud Console, go to the Datasets page.
Click the dataset for which you are importing DICOM objects.
In the list of DICOM stores, choose Import from the Actions list.
The Import to DICOM store page appears.
In the Project list, select a Cloud Storage project.
In the Location list, select a Cloud Storage bucket.
To set a specific location for importing files, do the following:
- Expand Advanced Options.
- Select Override Cloud Storage Path.
- To set a specific source for importing files, define the path using the following wildcards in the Location text box:
  - * matches zero or more non-separator characters.
  - ** matches zero or more characters, including separators. This can be used with a file name extension to match all files of the same type.
  - ? matches one character.
Click Import to import DICOM objects from the defined source.
gcloud
To import DICOM objects from a Cloud Storage bucket, use the
gcloud healthcare dicom-stores import gcs
command. Specify the name of the parent dataset, the name of the DICOM store,
and the location of the object in a Cloud Storage bucket.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
- When specifying the location of the DICOM objects in Cloud Storage, you
can use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). It must be used at the end of a path and with no other wildcards in the path. It can also be used with a filename extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
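These matching rules can be illustrated with a short sketch. The following is not part of the Cloud Healthcare API or any Google client library; it is a local illustration of the wildcard semantics described above, translating a pattern into a Python regular expression:

```python
import re

def gcs_wildcard_to_regex(pattern: str):
    """Translate the import wildcards into a regular expression:
    *  -> zero or more non-separator characters
    ** -> zero or more characters, including separators
    ?  -> exactly one character
    """
    parts = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            parts.append(".*")       # crosses '/' separators
            i += 2
        elif pattern[i] == "*":
            parts.append("[^/]*")    # stops at '/' separators
            i += 1
        elif pattern[i] == "?":
            parts.append(".")        # any single character
            i += 1
        else:
            parts.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(parts) + r"\Z")

# The examples from the list above:
star = gcs_wildcard_to_regex("gs://BUCKET/DIRECTORY/Example*.dcm")
print(bool(star.match("gs://BUCKET/DIRECTORY/Example22.dcm")))   # True

qmark = gcs_wildcard_to_regex("gs://BUCKET/DIRECTORY/Example?.dcm")
print(bool(qmark.match("gs://BUCKET/DIRECTORY/Example1.dcm")))   # True
print(bool(qmark.match("gs://BUCKET/DIRECTORY/Example01.dcm")))  # False
```

Note how `*` stops at `/`, which is why `**` is needed to pick up files in subdirectories.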
The following sample shows how to import DICOM objects from a Cloud Storage bucket.
gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command, providing the OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true.
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
API
To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import
method.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
- When specifying the location of the DICOM objects in Cloud Storage,
you can use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). It must be used at the end of a path and with no other wildcards in the path. It can also be used with a filename extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
curl
To import DICOM objects, make a POST
request and provide the name of the
parent dataset, the name of the DICOM store, the location of the object in a
Cloud Storage bucket, and an access token.
The following sample shows a POST
request using curl
.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL", "counter": { "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES }, }, "done": true, "response": { "@type": "..." } }
PowerShell
To import DICOM objects, make a POST
request and provide the name of the
parent dataset, the name of the DICOM store, the location of the object in a
Cloud Storage bucket, and an access token.
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES } }, "done": true, "response": { "@type": "..." } }
Troubleshooting DICOM import requests
If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
Exporting DICOM instances
The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.
Console
To export DICOM instances to Cloud Storage, complete the following steps:
In the Cloud Console, go to the Datasets page.
Click the dataset for which you are exporting DICOM instances.
In the list of DICOM stores, choose Export from the Actions list.
The Export DICOM Store page appears.
Select Google Cloud Storage Bucket.
In the Project list, select a Cloud Storage project.
In the Location list, select a Cloud Storage bucket.
In DICOM Export Settings, select the file type used to export the DICOM instances. The following types are available:
- DICOM file (.dcm)
- octet-stream
- Image (.jpg, .png)
To define additional transfer syntax, choose the syntax from the Transfer Syntax list.
Click Export to export DICOM instances to the defined location in Cloud Storage.
gcloud
To export DICOM instances to a Cloud Storage bucket, use the
gcloud healthcare dicom-stores export gcs
command.
- Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
- If the command specifies a directory that does not exist, the directory is created.
The following sample shows the gcloud healthcare dicom-stores export gcs
command.
gcloud healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command, providing the OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true.
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
API
To export DICOM instances to a Cloud Storage bucket, use the
projects.locations.datasets.dicomStores.export
method.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
- If the command specifies a directory that does not exist, the directory is created.
curl
To export DICOM instances, make a POST
request and provide the name of
the parent dataset, the name of the DICOM store, the destination
Cloud Storage bucket, and an access token.
The following sample shows a POST
request using curl
.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES } }, "done": true, "response": { "@type": "..." } }
PowerShell
To export DICOM instances, make a POST
request and provide the name of the
parent dataset, the name of the DICOM store, the destination
Cloud Storage bucket, and an access token.
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES }, }, "done": true, "response": { "@type": "..." } }
Troubleshooting DICOM export requests
If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.