This page explains how to export DICOM instances to and import DICOM objects from Cloud Storage. A DICOM instance is typically an image, but can be another type of persistent data such as a structured report. A Cloud Storage object is a DICOM instance that resides in Cloud Storage.
You can import and export bulk data between a Cloud Storage bucket and a DICOM store. For example, you might have many DICOM instance files that you want to import into a DICOM store. Rather than programmatically storing the data directly, store the data in a Cloud Storage bucket and then import the files into a DICOM store using a single import operation. For more information, see Cloud Storage.
To store a DICOM instance directly, such as from your local machine, store the DICOM data using the Store Transaction RESTful web service as implemented in the Cloud Healthcare API. To retrieve a single instance or study from a DICOM store, retrieve DICOM data using the Retrieve Transaction RESTful web service as implemented in the Cloud Healthcare API.
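For example, if your DICOM files are on a local machine, you could stage them in a Cloud Storage bucket before running the import. The following is a minimal sketch using the gsutil cp command; BUCKET and DIRECTORY are placeholders for your own bucket and path:

# Copy all local .dcm files into a Cloud Storage bucket. The -m flag
# parallelizes the copy, which helps when staging many instance files.
gsutil -m cp *.dcm gs://BUCKET/DIRECTORY/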
Setting Cloud Storage permissions
Before exporting and importing DICOM data to and from Cloud Storage, you must grant extra permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.
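As a rough sketch, granting the service agent access to a bucket could look like the following gsutil iam command. The service account name format and the role shown here (roles/storage.objectAdmin) are assumptions; confirm both against the DICOM store Cloud Storage permissions page before running the command.

# Hypothetical example: grant the Cloud Healthcare Service Agent access to a bucket.
# PROJECT_NUMBER and BUCKET are placeholders, and the role to grant depends on
# whether the bucket is an import source or an export destination.
gsutil iam ch \
  serviceAccount:service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com:roles/storage.objectAdmin \
  gs://BUCKET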
Importing DICOM objects
The following samples show how to import DICOM objects from a Cloud Storage bucket.
Console
To import DICOM objects from a Cloud Storage bucket, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
- Click the dataset that contains the DICOM store to which you are importing DICOM objects.
- In the list of data stores, choose Import from the Actions list for the DICOM store. The Import to DICOM store page appears.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- To set a specific location for importing files, do the following:
- Expand Advanced Options.
- Select Override Cloud Storage Path.
- To set a specific source for importing files, define the path using the following variables in the Location text box:
  - * matches non-separator characters.
  - ** matches characters, including separators. This can be used with a file name extension to match all files of the same type.
  - ? matches 1 character.
- Click Import to import DICOM objects from the defined source.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To import DICOM objects from a Cloud Storage bucket, use the
gcloud healthcare dicom-stores import gcs
command. Specify the name of the parent dataset, the name of the DICOM store,
and the location of the object in a Cloud Storage bucket.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
- When specifying the location of the DICOM objects in Cloud Storage, you
can use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .dcm), which imports all files with the filename extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
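For example, a wildcard-based import of every .dcm file in DIRECTORY and its subdirectories might look like the following sketch; BUCKET and DIRECTORY are placeholders:

# Import all .dcm files in DIRECTORY and its subdirectories in one operation.
gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/**.dcm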
The following sample shows how to import DICOM objects from a Cloud Storage bucket.
gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true.
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
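If you want a script to wait for the import to finish instead of re-running the describe command manually, a simple polling loop such as the following sketch could work. It assumes that --format="value(done)" prints True once the operation completes; adjust the check and the sleep interval to your environment.

# Poll the long-running operation until it reports done.
while true; do
  DONE=$(gcloud healthcare operations describe OPERATION_ID \
    --location=LOCATION \
    --dataset=DATASET_ID \
    --format="value(done)")
  if [ "$DONE" = "True" ]; then
    echo "Operation finished."
    break
  fi
  sleep 10
done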
API
To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import
method.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
- When specifying the location of the DICOM objects in Cloud Storage,
use wildcards to import multiple files from one or more directories.
The following wildcards are supported:
- Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
- Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .dcm), which imports all files with the filename extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
- Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
curl
To import DICOM objects, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The location of the objects in a Cloud Storage bucket
The following sample shows a POST request using curl.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation, use the Operation get method:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter": { "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES }, }, "done": true, "response": { "@type": "..." } }
PowerShell
To import DICOM objects, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The location of the objects in a Cloud Storage bucket
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation, use the Operation get method:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES } }, "done": true, "response": { "@type": "..." } }
Go
Java
Node.js
Python
Troubleshooting DICOM import requests
If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
Exporting DICOM instances
The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.
Console
To export DICOM instances to Cloud Storage, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
- Click the dataset that contains the DICOM store from which you are exporting DICOM instances.
- In the list of data stores, choose Export from the Actions list for the DICOM store.
- On the Export DICOM Store page that appears, select Google Cloud Storage Bucket.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- In DICOM Export Settings, select the file type used to export
the DICOM instances. The following types are available:
- DICOM file (.dcm)
- octet-stream
- Image (.jpg, .png)
- To define additional transfer syntax, choose the syntax from the Transfer Syntax list.
- Click Export to export DICOM instances to the defined location in Cloud Storage.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To export DICOM instances to a Cloud Storage bucket, use the
gcloud healthcare dicom-stores export gcs
command.
- Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
- If the command specifies a directory that does not exist, the directory is created.
The following sample shows the gcloud healthcare dicom-stores export gcs
command.
gcloud healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY
The command line displays the operation ID:
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
After the command completes, the response includes done: true.
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
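After the export completes, you can optionally confirm that the .dcm files were written to the destination by listing the bucket path, for example:

# List the exported objects to verify the export wrote the expected files.
gsutil ls gs://BUCKET/DIRECTORY/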
API
To export DICOM instances to a Cloud Storage bucket, use the
projects.locations.datasets.dicomStores.export
method.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
- If the command specifies a directory that does not exist, the directory is created.
curl
To export DICOM instances, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
The following sample shows a POST request using curl.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation, use the Operation get method:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES } }, "done": true, "response": { "@type": "..." } }
PowerShell
To export DICOM instances, make a POST
request and provide the following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content
If the request is successful, the server returns the response in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. To track the status of the operation, use the Operation get method:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
If the request is successful, the server returns a response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME", "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL", "counter":{ "success": SUCCESSFUL_INSTANCES "failure": FAILED_INSTANCES }, }, "done": true, "response": { "@type": "..." } }
Go
Java
Node.js
Python
Exporting DICOM instances using filters
By default, when you export DICOM files to Cloud Storage, all of the DICOM files in the specified DICOM store are exported. Similarly, when you export DICOM metadata to BigQuery, the metadata for all of the DICOM data in the specified DICOM store is exported.
You can export a subset of DICOM data or metadata using a filter. You define the filter in a filter file.
Configuring filter files
A filter file defines which DICOM files to export to Cloud Storage or BigQuery. You can configure filter files at the following levels:
- At the study level
- At the series level
- At the instance level
The filter file is made up of multiple lines with each line defining the
study, series, or instance you want
to export. Each line uses the format /studies/STUDY_UID[/series/SERIES_UID[/instances/INSTANCE_UID]].
If a study, series, or instance is not specified in the filter file, that study, series, or instance is not exported.
Only the /studies/STUDY_UID portion of the path is required. You can export an entire study by specifying /studies/STUDY_UID, or you can export an entire series by specifying /studies/STUDY_UID/series/SERIES_UID.
Consider the following filter file. The filter file will result in one study, two series, and three individual instances being exported:
/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333
Creating a filter file using BigQuery
You typically create a filter file by first exporting the metadata from a DICOM store to BigQuery. This lets you use BigQuery to view the study, series, and instance UIDs of the DICOM data in your DICOM store. You can then complete the following steps:
- Query for the study, series, and instance UIDs you are interested in.
For example, after exporting DICOM metadata to BigQuery, you
could run the following query to concatenate the study, series, and
instance UIDs to a format that's compatible with the filter file
requirements:
SELECT CONCAT
  ('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
FROM
  [PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE]
- If the query returns a large result set, you can materialize a new table by saving the query results to a destination table in BigQuery.
- If you saved the query results to a destination table, you can save the contents of the destination table to a file and export it to Cloud Storage. For steps on how to do so, see Exporting table data. The exported file is your filter file. You use the location of the filter file in Cloud Storage when specifying the filter in the export operation.
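For example, assuming you saved the query results to a destination table, a bq extract command along the following lines could write that table to Cloud Storage as your filter file. The table and bucket names are placeholders, and --print_header=false is intended to keep the BigQuery column header out of the filter file; confirm the exact flags against the BigQuery export documentation.

# Export the destination table containing the filter paths to Cloud Storage.
bq extract \
  --destination_format=CSV \
  --print_header=false \
  PROJECT_ID:BIGQUERY_DATASET.DESTINATION_TABLE \
  gs://BUCKET/DIRECTORY/FILTER_FILE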
Creating a filter file manually
You can create a filter file with custom content and upload it to a Cloud Storage bucket. You use the location of the filter file in Cloud Storage when specifying the filter in the export operation. The following sample shows how to upload a filter file to a Cloud Storage bucket using the gsutil cp command:
gsutil cp PATH/TO/FILTER_FILE gs://BUCKET/DIRECTORY
Passing in the filter file
After you create a filter file, call the DICOM export operation and pass in the filter file. The following samples show how to export DICOM data using a filter.
gcloud
To export DICOM metadata to Cloud Storage using a filter, use the
gcloud beta healthcare dicom-stores export gcs
command:
gcloud beta healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://DESTINATION_BUCKET/DIRECTORY \
  --filter-config-gcs-uri=gs://BUCKET/DIRECTORY/FILTER_FILE
Replace the following:
- DICOM_STORE_ID: the identifier for the DICOM store
- DATASET_ID: the name of the DICOM store's parent dataset
- LOCATION: the location of the DICOM store's parent dataset
- DESTINATION_BUCKET/DIRECTORY: the destination Cloud Storage bucket
- BUCKET/DIRECTORY/FILTER_FILE: the location of the filter file in a Cloud Storage bucket
The output is the following:
Request issued for: [DICOM_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID
To view the status of the operation, run the
gcloud healthcare operations describe
command and provide OPERATION_ID from the response:
gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID
Replace the following:
- OPERATION_ID: the ID number returned from the previous response
- DATASET_ID: the name of the DICOM store's parent dataset
- LOCATION: the location of the DICOM store's parent dataset
The output is the following:
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: 'CREATE_TIME'
  endTime: 'END_TIME'
  logsUrl: 'https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL'
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': '...'
API
To export DICOM data using a filter, use the projects.locations.datasets.dicomStores.export
method.
curl
To export DICOM data using a filter file, make a POST
request and provide the
following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
- The location of the filter file in a Cloud Storage bucket
The following sample shows a POST request using curl.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    },
    'filterConfig': {
      'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
    }
  }" "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"
If the request is successful, the server returns the following response in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. Use the Operation get method to track the status of the operation:
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"
If the request is successful, the server returns the following response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME" }, "done": true, "response": { "@type": "..." } }
PowerShell
To export DICOM data using a filter file, make a POST
request and provide the
following information:
- The name and location of the parent dataset
- The name of the DICOM store
- The destination Cloud Storage bucket
- The location of the filter file in a Cloud Storage bucket
The following sample shows a POST
request using Windows PowerShell.
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    },
    'filterConfig': {
      'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content
If the request is successful, the server returns the following response in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID" }
The response contains an operation name. Use the Operation get method to track the status of the operation:
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME" | Select-Object -Expand Content
If the request is successful, the server returns the following response with the status of the operation in JSON format:
{ "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID", "metadata": { "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata", "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData", "createTime": "CREATE_TIME", "endTime": "END_TIME" }, "done": true, "response": { "@type": "..." } }
Troubleshooting DICOM export requests
If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
If the entire operation returns an error, see Troubleshooting long-running operations.