This page explains how to export DICOM instances to, and import DICOM objects from, Cloud Storage. A DICOM instance is typically an image, but it can be another type of persistent data, such as a structured report. A DICOM object in Cloud Storage is a DICOM instance that resides in Cloud Storage. For more information, see Cloud Storage.
Setting Cloud Storage permissions
Before exporting and importing DICOM data to and from Cloud Storage, you must grant extra permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.
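For example, the following sketch grants the service agent access on a bucket; the member name follows the service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com pattern, and roles/storage.objectAdmin is one role that covers both the reads needed for import and the writes needed for export. The permissions page linked above is authoritative for the exact roles to grant:
gcloud storage buckets add-iam-policy-binding gs://BUCKET \
  --member=serviceAccount:service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com \
  --role=roles/storage.objectAdmin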
Importing DICOM objects
To import DICOM instance files to a DICOM store, you can use either of the following methods:
- Programmatically store the data directly in a DICOM store from your local machine using the Store Transaction RESTful web service as implemented in the Cloud Healthcare API (see the sketch after this list).
- Upload the DICOM data to a Cloud Storage bucket and then import the files into a DICOM store using a single import operation, as explained in this section.
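As a minimal sketch of the first method (assuming a local file named instance.dcm and this page's placeholder style), a single instance can be stored with a POST to the store's DICOMweb studies path:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/dicom" \
  --data-binary @instance.dcm \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies"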
The following samples show how to import DICOM objects from a Cloud Storage bucket.
Console
To import DICOM objects from a Cloud Storage bucket, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
Go to Datasets
- Click the dataset that contains the DICOM store to which you are importing DICOM objects.
- In the list of data stores, choose Import from the Actions list for the DICOM store.
The Import to DICOM store page appears.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- To set a specific location for importing files, do the following:
  - Expand Advanced Options.
  - Select Override Cloud Storage Path.
  - To set a specific source for importing files, define the path using the following wildcards in the Location text box:
    - * matches non-separator characters.
    - ** matches characters, including separators. This can be used with a file name extension to match all files of the same type.
    - ? matches 1 character.
- Click Import to import DICOM objects from the defined source.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To import DICOM objects from a Cloud Storage bucket, use the
gcloud healthcare dicom-stores import gcs
command. Specify the name of the parent dataset, the name of the DICOM store,
and the location of the object in a Cloud Storage bucket.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
- When specifying the location of the DICOM objects in Cloud Storage, you
can use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .dcm), which imports all files with the filename extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
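For example, to import every file with the .dcm extension in DIRECTORY and all of its subdirectories, you could pass a wildcard URI such as the following to the import command shown in the next sample:
--gcs-uri=gs://BUCKET/DIRECTORY/**.dcm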
The following sample shows how to import DICOM objects from a Cloud Storage bucket.
gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true:
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
API
To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import
method.
- The location of the files within the bucket can vary and doesn't have to match the format specified in the following samples.
- When specifying the location of the DICOM objects in Cloud Storage,
use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .dcm), which imports all files with the filename extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
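For example, to import an entire directory tree in a single request, the gcsSource URI in the request body shown in the following REST sample could use the ** wildcard:
{
  "gcsSource": {
    "uri": "gs://BUCKET/DIRECTORY/**.dcm"
  }
}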
REST
Import the DICOM object.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- BUCKET/PATH/TO/FILE: the path to the DICOM object in Cloud Storage
Request JSON body:
{ "gcsSource": { "uri": "gs://BUCKET/PATH/TO/FILE.dcm" } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:
cat > request.json << 'EOF'
{
  "gcsSource": {
    "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:
@'
{
  "gcsSource": {
    "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand ContentGet the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
To retrieve a single instance or study from a DICOM store, use the Retrieve Transaction RESTful web service as implemented in the Cloud Healthcare API.
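For example, the following request (a minimal sketch using this page's placeholder style; STUDY_UID, SERIES_UID, and INSTANCE_UID are the instance's DICOM UIDs) retrieves a single instance as a .dcm file. The Accept header asks for the instance in its stored transfer syntax:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Accept: application/dicom; transfer-syntax=*" \
  --output instance.dcm \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_UID/series/SERIES_UID/instances/INSTANCE_UID"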
Specify a storage class to import DICOM objects (Preview)
By default, the projects.locations.datasets.dicomStores.import
method imports
a DICOM object to a DICOM store with a standard storage class. You can set the
storage class when you import DICOM objects from Cloud Storage.
For more information, see
Change DICOM storage class.
The following samples show how to specify the storage class when you import DICOM objects from Cloud Storage.
REST
Use the projects.locations.datasets.dicomStores.import
method.
Import the DICOM object.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- BUCKET/PATH/TO/FILE: the path to the DICOM object in Cloud Storage
- STORAGE_CLASS: the storage class for the DICOM object in the DICOM store, one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE
Request JSON body:
{ "gcsSource": { "uri": "gs://BUCKET/PATH/TO/FILE.dcm" }, "blob_storage_settings": { "blob_storage_class": "STORAGE_CLASS" } }
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:
cat > request.json << 'EOF'
{
  "gcsSource": {
    "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
  },
  "blob_storage_settings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
"https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:
@'
{
  "gcsSource": {
    "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
  },
  "blob_storage_settings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand ContentGet the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
Troubleshooting DICOM import requests
If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
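For example, you can list recent errors from the command line with gcloud logging read. This is a sketch, and the resource.type value is an assumption about how Cloud Healthcare API logs are labeled in your project; the linked page describes the exact log filters to use:
gcloud logging read 'resource.type="healthcare_dataset" AND severity>=ERROR' --limit=10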
Exporting DICOM instances
The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.
Console
To export DICOM instances to Cloud Storage, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
Go to Datasets
- Click the dataset that contains the DICOM store from which you are exporting DICOM instances.
- In the list of data stores, choose Export from the Actions list for the DICOM store.
- On the Export DICOM Store page that appears, select Google Cloud Storage Bucket.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- In DICOM Export Settings, select the file type used to export the DICOM instances. The following types are available:
  - DICOM file (.dcm)
  - octet-stream
  - Image (.jpg, .png)
- To define an additional transfer syntax, choose the syntax from the Transfer Syntax list.
- Click Export to export DICOM instances to the defined location in Cloud Storage.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To export DICOM instances to a Cloud Storage bucket, use the
gcloud healthcare dicom-stores export gcs
command.
- Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
- If the command specifies a directory that does not exist, the directory is created.
The following sample shows the gcloud healthcare dicom-stores export gcs
command.
gcloud healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true:
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
API
To export DICOM instances to a Cloud Storage bucket, use the
projects.locations.datasets.dicomStores.export
method.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
- If the command specifies a directory that does not exist, the directory is created.
curl
To export DICOM instances, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
The following sample shows a POST
request using curl
.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{ 'gcsDestination': { 'uriPrefix': 'gs://BUCKET/DIRECTORY' } }" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
use the
Operation get
method:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES } }, "done": true, "response": { "@type": "..." } }
PowerShell
To export DICOM instances, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{ 'gcsDestination': { 'uriPrefix': 'gs://BUCKET/DIRECTORY' } }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation,
use the
Operation get
method:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES }, }, "done": true, "response": { "@type": "..." } }
Exporting DICOM instances using filters
By default, when you export DICOM files to Cloud Storage, all the DICOM files in the DICOM store are exported. Similarly, when you export DICOM metadata to BigQuery, the metadata for all of the DICOM data in the DICOM store is exported.
You can export a subset of DICOM data or metadata using a filter file.
Configure a filter file
- Each line in the filter file defines the study, series, or instance and uses the format /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID.
- You can truncate a line to specify the level at which the filter works. For example, you can select an entire study by specifying /studies/STUDY_INSTANCE_UID, or you can select an entire series by specifying /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID.
Consider the following filter file:
/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333
This example filter file applies to the following:
- The entire study with the study instance UID 1.123.456.789
- Two separate series with the series instance UIDs 123.456 and 567.890 in the study 1.666.333.111
- Three individual instances with the instance UIDs 111, 222, and 333 in the study 1.888.999.222 and series 123.456
Create a filter file using BigQuery
To create a filter file using BigQuery, you must first export the metadata of your DICOM store to BigQuery. The exported metadata shows you the study, series, and instance UIDs of the DICOM data in your DICOM store.
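For example, you can export the store's metadata with the gcloud healthcare dicom-stores export bq command (a sketch using this page's placeholder style; BIGQUERY_DATASET and BIGQUERY_TABLE are the destination dataset and table):
gcloud healthcare dicom-stores export bq DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --bq-table=bq://PROJECT_ID.BIGQUERY_DATASET.BIGQUERY_TABLE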
After exporting the metadata, complete the following steps:
Run a query to return the UIDs of the study, series, and instances you want to add to the filter file.
For example, the following query shows how to concatenate the study, series, and instance UIDs to match the filter file format requirements:
SELECT CONCAT
  ('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
FROM
  [PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE]
Optional: If the query returns a large result set that exceeds the maximum response size, save the query results to a new destination table in BigQuery.
Save the query results to a file and export it to Cloud Storage. If you saved your query results to a new destination table in the previous step, see Exporting table data to export the table's contents to Cloud Storage (a sketch using the bq tool follows these steps).
Edit the exported file as necessary, and include its location in your export request.
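For example, the bq command-line tool can extract the destination table to Cloud Storage as a headerless CSV, which matches the one-path-per-line filter format. This is a sketch, and --print_header=false is an assumption about suppressing the CSV header row:
bq extract --destination_format=CSV --print_header=false \
  'PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE' \
  gs://BUCKET/DIRECTORY/FILTER_FILE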
Create a filter file manually
To create a filter file manually, do the following:
- Create a filter file containing the DICOM objects you're filtering on.
- Upload the filter file to Cloud Storage. For instructions, see Upload objects from a file system.
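For example (a sketch; FILTER_FILE is the local file you created), you can upload the file with the gcloud storage cp command:
gcloud storage cp FILTER_FILE gs://BUCKET/DIRECTORY/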
Passing in the filter file
After you create a filter file, call the DICOM export operation and pass in the filter file. The following samples show how to export DICOM data using a filter.
gcloud
To export DICOM metadata to Cloud Storage using a filter, use the
gcloud beta healthcare dicom-stores export gcs
command:
gcloud beta healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://DESTINATION_BUCKET/DIRECTORY \
  --filter-config-gcs-uri=gs://BUCKET/DIRECTORY/FILTER_FILE
Replace the following:
- DICOM_STORE_ID: the identifier for the DICOM store
- DATASET_ID: the name of the DICOM store's parent dataset
- LOCATION: the location of the DICOM store's parent dataset
- DESTINATION_BUCKET/DIRECTORY: the destination Cloud Storage bucket
- BUCKET/DIRECTORY/FILTER_FILE: the location of the filter file in a Cloud Storage bucket
The output is the following:
Request issued for: [DICOM_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
Replace the following:
- OPERATION_ID: the ID number returned from the previous response
- DATASET_ID: the name of the DICOM store's parent dataset
- LOCATION: the location of the DICOM store's parent dataset
The output is the following:
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: 'CREATE_TIME'
  endTime: 'END_TIME'
  logsUrl: 'https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL'
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': '...'
API
To export DICOM data using a filter, use the projects.locations.datasets.dicomStores.export
method.
curl
To export DICOM data using a filter file, make a POST
request and provide the
following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
- The location of the filter file in a Cloud Storage bucket
The following sample shows a POST
request using curl
.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{ 'gcsDestination': { 'uriPrefix': 'gs://BUCKET/DIRECTORY' }, 'filterConfig': { 'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE' } }" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"
If the request is successful, the server returns the following response in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. Use the
Operation get
method
to track the status of the operation:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns the following response in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME" }, "done": true, "response": { "@type": "..." } }
PowerShell
To export DICOM data using a filter file, make a POST
request and provide the
following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
- The location of the filter file in a Cloud Storage bucket
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{ 'gcsDestination': { 'uriPrefix': 'gs://BUCKET/DIRECTORY' }, 'filterConfig': { 'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE' } }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content
If the request is successful, the server returns the following response in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. Use the
Operation get
method
to track the status of the operation:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns the following response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME" }, "done": true, "response": { "@type": "..." } }
Troubleshooting DICOM export requests
If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
If the entire operation returns an error, see Troubleshooting long-running operations.