This page describes how to manage DICOM data in the Cloud Healthcare API using different storage classes. Choosing the right storage class can help you reduce costs and meet regulatory requirements for data retention.
This page is intended for technical users already familiar with DICOM and the Cloud Healthcare API.
Overview
DICOM storage classes function similarly to Cloud Storage storage classes, offering different cost and performance characteristics based on how frequently you access your data and how long you need to store it. For more information about each storage class, see Class descriptions.
You might want to change the storage class of DICOM objects depending on how often you access them or how long they need to be kept. For example:
- You can move rarely accessed DICOM images from Standard storage to Nearline or Coldline storage to reduce storage costs.
- You can move patient data that must be retained for legal reasons to Archive storage, which is the lowest-cost storage class.
Available DICOM storage classes
You can use the following storage classes for your DICOM objects:
- Standard (Default)
- Nearline
- Coldline
- Archive
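When setting a storage class programmatically, it helps to validate the value before sending a request. The following is a minimal sketch in Python; the constant and function names are illustrative, not part of the Cloud Healthcare API:

```python
# The four storage class values accepted for DICOM objects, as listed above.
VALID_STORAGE_CLASSES = {"STANDARD", "NEARLINE", "COLDLINE", "ARCHIVE"}

def validate_storage_class(value: str) -> str:
    """Normalize a storage class name and raise on unknown values."""
    normalized = value.strip().upper()
    if normalized not in VALID_STORAGE_CLASSES:
        raise ValueError(f"Unknown storage class: {value!r}")
    return normalized
```

Validating early gives a clear local error instead of a rejected API request.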
Storage class pricing
Each storage class has its own pricing structure. Changing the storage class of your DICOM objects might affect your billing costs. For more information, see Cloud Healthcare API pricing.
Change storage class for individual DICOM objects
You can change the storage class of DICOM objects at the study, series, or instance level.
The following samples show how to change the storage class of a DICOM instance.
REST
Change the storage class of the DICOM instance using the
projects.locations.datasets.dicomStores.studies.series.instances.setBlobStorageSettings
method.

Before using any of the request data, make the following replacements:

- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- SERIES_INSTANCE_UID: the series instance unique identifier
- INSTANCE_UID: the instance unique identifier
- STORAGE_CLASS: the storage class for the DICOM instance. One of STANDARD, NEARLINE, COLDLINE, or ARCHIVE.
Request JSON body:
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  }
}
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:setBlobStorageSettings" | Select-Object -Expand Content

APIs Explorer
Copy the request body and open the method reference page. The APIs Explorer panel opens on the right side of the page. You can interact with this tool to send requests. Paste the request body in this tool, complete any other required fields, and click Execute.
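As a sketch, the request URL and body from the steps above can be assembled in Python before being sent with any HTTP client. The helper function name is our own; authentication and the actual POST are omitted:

```python
BASE = "https://healthcare.googleapis.com/v1"

def build_set_storage_class_request(project, location, dataset, store,
                                    study_uid, series_uid, instance_uid,
                                    storage_class):
    """Return the (url, body) pair for an instance-level
    setBlobStorageSettings call, mirroring the REST sample above."""
    url = (
        f"{BASE}/projects/{project}/locations/{location}"
        f"/datasets/{dataset}/dicomStores/{store}"
        f"/dicomWeb/studies/{study_uid}/series/{series_uid}"
        f"/instances/{instance_uid}:setBlobStorageSettings"
    )
    body = {"blobStorageSettings": {"blobStorageClass": storage_class}}
    return url, body
```

You would then POST the body as JSON to the URL with an OAuth 2.0 bearer token, exactly as the curl sample does.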
The response contains an identifier for a long-running operation (LRO), shown as OPERATION_ID. You need this value in the next step.

Get the status of the long-running operation using the
projects.locations.datasets.operations.get
method.

Before using any of the request data, make the following replacements:

- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the response contains "done": true, the LRO has finished.
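The polling step above can be sketched generically in Python. Here `fetch_operation` is a stand-in for the authenticated operations.get request shown above; it is injected as a callable so the loop itself stays self-contained:

```python
import time

def wait_for_operation(fetch_operation, poll_interval=1.0, max_polls=60):
    """Poll a long-running operation until its "done" field is true.

    fetch_operation: a zero-argument callable returning the decoded JSON
    response of operations.get (a dict). Raises TimeoutError if the LRO
    does not finish within max_polls polls.
    """
    for _ in range(max_polls):
        op = fetch_operation()
        if op.get("done"):
            # A finished LRO carries either a result or an error.
            if "error" in op:
                raise RuntimeError(f"Operation failed: {op['error']}")
            return op
        time.sleep(poll_interval)
    raise TimeoutError("Operation did not complete in time")
```

In practice you would pass a function that performs the GET request with a fresh access token on each call.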
Change storage class for multiple objects using a filter file
The following sections show how to create and use a filter file to change the storage class of DICOM objects based on filter criteria.
Filter file requirements
- Each line in the filter file defines the study, series, or instance and uses the format /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID.
- You can truncate a line to specify the level at which the filter works. For example, you can select an entire study by specifying /studies/STUDY_INSTANCE_UID, or you can select an entire series by specifying /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID.
Consider the following filter file:
/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333
This example filter file applies to the following:

- The entire study with the study instance UID 1.123.456.789
- Two separate series with series instance UIDs 123.456 and 567.890 in the study 1.666.333.111
- Three individual instances with instance UIDs 111, 222, and 333 in the study 1.888.999.222 and series 123.456
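A small parser makes the three levels a filter line can target explicit. This is a sketch of the format rules above, not an official validator:

```python
import re

# Matches /studies/UID, optionally /series/UID, optionally /instances/UID.
_FILTER_LINE = re.compile(
    r"^/studies/(?P<study>[0-9.]+)"
    r"(?:/series/(?P<series>[0-9.]+)"
    r"(?:/instances/(?P<instance>[0-9.]+))?)?$"
)

def parse_filter_line(line):
    """Return (study, series, instance) UIDs, with None for truncated levels."""
    m = _FILTER_LINE.match(line.strip())
    if not m:
        raise ValueError(f"Invalid filter line: {line!r}")
    return m.group("study"), m.group("series"), m.group("instance")
```

Running each line of a filter file through such a check before uploading it can catch formatting mistakes early.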
Create a filter file using BigQuery
To create a filter file using BigQuery, you must first export the metadata of your DICOM store to BigQuery. The exported metadata shows you the study, series, and instance UIDs of the DICOM data in your DICOM store.
After exporting the metadata, complete the following steps:
Run a query to return the UIDs of the study, series, and instances you want to add to the filter file.
For example, the following query shows how to concatenate the study, series, and instance UIDs to match the filter file format requirements:
SELECT CONCAT('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
FROM `PROJECT_ID.BIGQUERY_DATASET.BIGQUERY_TABLE`
Optional: If the query returns a large result set that exceeds the maximum response size, save the query results to a new destination table in BigQuery.
Save the query results to a file and export it to Cloud Storage. If you saved your query results to a new destination table in the previous step, see Exporting table data to export the table's contents to Cloud Storage.
Edit the exported file as necessary, and include it in your request to change the storage class of multiple DICOM objects.
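The CONCAT query above does the formatting in SQL; the same step can be done client-side if you fetch the raw UID columns instead. The following is a sketch, assuming rows of (StudyInstanceUID, SeriesInstanceUID, SOPInstanceUID) tuples; the function names are our own:

```python
def rows_to_filter_lines(rows):
    """Format (study_uid, series_uid, sop_instance_uid) tuples as
    instance-level filter-file lines."""
    return [
        f"/studies/{study}/series/{series}/instances/{instance}"
        for study, series, instance in rows
    ]

def write_filter_file(path, rows):
    """Write one filter line per row. Upload the resulting file to
    Cloud Storage separately (for example, with the gcloud CLI)."""
    with open(path, "w") as f:
        f.write("\n".join(rows_to_filter_lines(rows)) + "\n")
```

This mirrors the filter file format requirements described earlier, one resource path per line.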
Create a filter file manually
To create a filter file manually, do the following:
- Create a filter file containing the DICOM objects you're filtering on.
- Upload the filter file to Cloud Storage. For instructions, see Upload objects from a file system.
Use a filter file
The following samples show how to apply a filter file when changing the storage class of DICOM objects.
REST
Change the storage class of the DICOM objects specified in the filter file using the
projects.locations.datasets.dicomStores.setBlobStorageSettings
method.

Before using any of the request data, make the following replacements:

- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STORAGE_CLASS: the storage class for the DICOM objects. One of STANDARD, NEARLINE, COLDLINE, or ARCHIVE.
- CLOUD_STORAGE_BUCKET: the name of the Cloud Storage bucket containing the filter file
- FILTER_FILE_PATH: the path to the filter file in the Cloud Storage bucket
Request JSON body:
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  },
  "filterConfig": {
    "resourcePathsGcsUri": "gs://CLOUD_STORAGE_BUCKET/FILTER_FILE_PATH"
  }
}
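As with the single-instance case, this request body can be built programmatically before sending. A sketch in Python; the function name is illustrative:

```python
def build_filtered_request_body(storage_class, bucket, filter_path):
    """Build a setBlobStorageSettings body that pairs a storage class
    with a filter file stored in Cloud Storage."""
    return {
        "blobStorageSettings": {"blobStorageClass": storage_class},
        "filterConfig": {
            "resourcePathsGcsUri": f"gs://{bucket}/{filter_path}"
        },
    }
```

The body is then POSTed to the store-level :setBlobStorageSettings URL shown in the curl sample below.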
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  },
  "filterConfig": {
    "resourcePathsGcsUri": "gs://CLOUD_STORAGE_BUCKET/FILTER_FILE_PATH"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "blobStorageSettings": {
    "blobStorageClass": "STORAGE_CLASS"
  },
  "filterConfig": {
    "resourcePathsGcsUri": "gs://CLOUD_STORAGE_BUCKET/FILTER_FILE_PATH"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:setBlobStorageSettings" | Select-Object -Expand Content

The response contains an identifier for a long-running operation (LRO), shown as OPERATION_ID. You need this value in the next step.

Get the status of the long-running operation using the
projects.locations.datasets.operations.get
method.

Before using any of the request data, make the following replacements:

- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the response contains "done": true, the LRO has finished.
View a DICOM object's storage class
You can view the storage class of DICOM objects at the study, series, or instance level.
The following sections describe how to view the storage class of a DICOM instance.
Get storage class information for a DICOM object
The following samples show how to use
the instances.getStorageInfo
method to view the storage class of DICOM objects.
REST
Before using any of the request data, make the following replacements:

- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- SERIES_INSTANCE_UID: the series instance unique identifier
- INSTANCE_UID: the instance unique identifier
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:getStorageInfo"
PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:getStorageInfo" | Select-Object -Expand Content
APIs Explorer
Open the method reference page. The APIs Explorer panel opens on the right side of the page. You can interact with this tool to send requests. Complete any required fields and click Execute.
You should receive a JSON response containing the instance's storage information.
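The GET request above can likewise be composed programmatically. The following sketch builds only the URL; sending it requires an access token, as in the curl sample:

```python
BASE = "https://healthcare.googleapis.com/v1"

def get_storage_info_url(project, location, dataset, store,
                         study_uid, series_uid, instance_uid):
    """Build the instance-level getStorageInfo URL shown above."""
    return (
        f"{BASE}/projects/{project}/locations/{location}"
        f"/datasets/{dataset}/dicomStores/{store}"
        f"/dicomWeb/studies/{study_uid}/series/{series_uid}"
        f"/instances/{instance_uid}:getStorageInfo"
    )
```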
Query exported DICOM metadata in BigQuery
You can export DICOM metadata to BigQuery and then run queries to view the storage classes of your exported DICOM objects.
The following query shows how to retrieve the study instance UID, series instance UID, instance UID, storage size, and storage class of up to 1,000 DICOM instances from a BigQuery dataset:
SELECT StudyInstanceUID, SeriesInstanceUID, SOPInstanceUID, BlobStorageSize, StorageClass
FROM `PROJECT_ID.BIGQUERY_DATASET.BIGQUERY_TABLE`
LIMIT 1000
Replace the following:

- PROJECT_ID: the ID of your Google Cloud project
- BIGQUERY_DATASET: the parent BigQuery dataset of the table containing the exported DICOM metadata
- BIGQUERY_TABLE: the BigQuery table containing the exported DICOM metadata
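Once the metadata rows are fetched, you can summarize storage usage per class client-side. A sketch, assuming each row is a dict keyed by the column names selected in the query above:

```python
from collections import defaultdict

def bytes_per_storage_class(rows):
    """Sum BlobStorageSize per StorageClass over query result rows."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["StorageClass"]] += row["BlobStorageSize"]
    return dict(totals)
```

A summary like this makes it easy to spot, for example, large volumes of rarely accessed data still sitting in Standard storage.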