This page describes how to change the storage class of DICOM data in the Cloud Healthcare API.
Overview
Cloud Healthcare API offers the following storage classes:
- Standard
- Nearline
- Coldline
- Archive
These storage classes are similar to the ones in Cloud Storage.
You can change the storage class of your DICOM object as a cost-saving measure in scenarios such as the following:
- Moving an infrequently accessed series of DICOM images from standard storage to nearline or coldline storage. This frees up the standard storage buckets to store images that are more frequently accessed.
- Moving patient data from standard storage to archive storage to archive data that needs to be retained for legal reasons. Archive storage is the lowest-cost storage class with high durability.
Methods to change DICOM storage class
By default, a DICOM instance has a standard storage class. You can change the storage class using the following methods:
- The import method: set the storage class when you import a DICOM instance from a Cloud Storage bucket with the blobStorageSettings field.
- The storeInstances method: set the storage class when you store a DICOM instance from a local path.
- The setBlobStorageSettings method: change the storage class of a DICOM instance in a DICOM store at the instance, series, or study level.
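The setBlobStorageSettings method targets a different resource path depending on whether you change the storage class at the store, study, series, or instance level. The following sketch builds those paths; the helper name and structure are illustrative, not part of the API client libraries.

```python
def set_storage_path(project, location, dataset, store,
                     study=None, series=None, instance=None):
    """Build the resource path targeted by setBlobStorageSettings.

    With no study UID the path targets the whole DICOM store; adding
    study, series, and instance UIDs narrows the scope step by step.
    """
    path = (f"projects/{project}/locations/{location}/datasets/{dataset}"
            f"/dicomStores/{store}")
    if study:
        # Study, series, and instance paths go through the DICOMweb subpath.
        path += f"/dicomWeb/studies/{study}"
        if series:
            path += f"/series/{series}"
            if instance:
                path += f"/instances/{instance}"
    return path + ":setBlobStorageSettings"
```

For example, passing only a study UID produces the study-level path used in the study-level samples below.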
Before you begin
Before you change the storage class of your DICOM instances in the Cloud Healthcare API, review the pricing details for each storage class.
Change the storage class of a DICOM instance
The following samples show how to change the storage class of a DICOM instance at an instance, series, or study level.
At an instance level
To change the storage class of a DICOM instance at an instance level, complete these steps:
REST
Use the projects.locations.datasets.dicomStores.studies.series.instances.setBlobStorageSettings method.
- Change the storage class of the DICOM instance at the instance level.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- SERIES_INSTANCE_UID: the series instance unique identifier
- INSTANCE_UID: the instance unique identifier
- STORAGE_CLASS: the storage class for the DICOM object in the DICOM store: one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE
Request JSON body:
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:setBlobStorageSettings" | Select-Object -Expand Content

- Get the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
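setBlobStorageSettings returns a long-running operation, so the status request above usually needs to be repeated until the operation reports done. A minimal polling sketch; the wait_for_operation helper and its parameters are illustrative, and get_operation stands in for any function that performs the GET request shown above and returns the decoded JSON.

```python
import time

def wait_for_operation(get_operation, poll_interval=1.0, max_polls=60):
    """Poll a get_operation callable until the long-running operation is done.

    Returns the operation's response payload, or raises if the operation
    reports an error or does not finish within max_polls attempts.
    """
    for _ in range(max_polls):
        op = get_operation()
        if op.get("done"):
            if "error" in op:
                raise RuntimeError(op["error"].get("message", "operation failed"))
            return op.get("response", {})
        time.sleep(poll_interval)
    raise TimeoutError("operation did not finish in time")
```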
At a series level
To change the storage class of a DICOM instance at a series level, complete these steps:
REST
Use the projects.locations.datasets.dicomStores.studies.series.setBlobStorageSettings method.
- Change the storage class of the DICOM instance at the series level.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- SERIES_INSTANCE_UID: the series instance unique identifier
- STORAGE_CLASS: the storage class for the DICOM object in the DICOM store: one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE
Request JSON body:
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID:setBlobStorageSettings" | Select-Object -Expand Content

- Get the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
At a study level
To change the storage class of a DICOM instance at a study level, complete these steps:
REST
Use the projects.locations.datasets.dicomStores.studies.setBlobStorageSettings method.
- Change the storage class of the DICOM instance at the study level.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- STORAGE_CLASS: the storage class for the DICOM object in the DICOM store: one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE
Request JSON body:
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID:setBlobStorageSettings" | Select-Object -Expand Content

- Get the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
Use filters
The following section shows how to filter specific instances and change their storage class at the study, series, or instance level using a filter file.
You must grant the Storage Object Viewer role to the Cloud Healthcare Service Agent service account so that it can access the bucket where the filter file is stored.
Configure a filter file
A filter file defines the list of DICOM files whose storage class you want to change.
Each line in the filter file defines the study, series, or instance and uses the format /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID.
You can truncate a line to specify the level at which the filter works. For example, you can change the storage class of an entire study by specifying /studies/STUDY_INSTANCE_UID, or of an entire series by specifying /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID.
Consider the following filter file:
/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333
This filter file results in filtering the following:
- The entire study with the study instance UID 1.123.456.789
- Two separate series with series instance UIDs 123.456 and 567.890 in the study 1.666.333.111
- Three individual instances with instance UIDs 111, 222, and 333 in the study 1.888.999.222 and series 123.456
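Because a malformed line makes the filter file unusable, it can help to validate each line before uploading. The following sketch classifies a filter-file line by the level it targets; the filter_level helper and its regular expression are illustrative, not part of the Cloud Healthcare API.

```python
import re

# Matches /studies/UID, optionally followed by /series/UID and /instances/UID.
_FILTER_RE = re.compile(
    r"^/studies/(?P<study>[0-9.]+)"
    r"(?:/series/(?P<series>[0-9.]+)"
    r"(?:/instances/(?P<instance>[0-9.]+))?)?$"
)

def filter_level(line):
    """Return 'study', 'series', or 'instance' for one filter-file line."""
    m = _FILTER_RE.match(line.strip())
    if not m:
        raise ValueError(f"invalid filter line: {line!r}")
    if m.group("instance"):
        return "instance"
    if m.group("series"):
        return "series"
    return "study"
```

Running this over the example file above classifies its first line as a study-level filter and its last three lines as instance-level filters.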
Create a filter file using BigQuery
To create a filter file using BigQuery, you must first export the metadata of your DICOM store to BigQuery. The exported metadata shows you the study, series, and instance UIDs of the DICOM data in your DICOM store.
After exporting, complete the following steps:
1. Query for the required study, series, and instance UIDs. For example, run the following query to concatenate the study, series, and instance UIDs to match the filter file format requirements:

   SELECT CONCAT('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
   FROM [PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE]

2. Optional: If the query returns a large result set that exceeds the maximum response size, save the query results to a new destination table in BigQuery.
3. Save the query results to a file and export it to Cloud Storage. If you saved your query results to a new destination table in Step 2, see Exporting table data to export the table's contents to Cloud Storage.
4. Edit the exported file as necessary, and use it as the filter file. The location of the filter file in Cloud Storage is required in the setBlobStorageSettings method.
Create a filter file manually
To create a filter file with custom content and upload it to a Cloud Storage bucket, complete these steps:
On your local machine, create a filter file containing a list of instances whose storage class needs to be changed. Use the format described in the Configure a filter file section.
Upload the filter text file to a Cloud Storage location.
gsutil cp PATH_TO_FILTER_FILE/FILTER_FILE_NAME.txt gs://BUCKET/DIRECTORY
Replace the following:
- PATH_TO_FILTER_FILE: the path to the filter file on your local machine
- FILTER_FILE_NAME: the name of the filter file
- BUCKET/DIRECTORY: the path to the Cloud Storage location
For example:
gsutil cp my-local-folder/archive-filters.txt gs://my-bucket/my-directory
Pass the filter file
REST
Use the projects.locations.datasets.dicomStores.setBlobStorageSettings method to change the storage class of all the instances in the filter file at the DICOM store level.

Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STORAGE_CLASS: the storage class for the DICOM object in the DICOM store: one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE
- BUCKET/DIRECTORY: the location of the filter file in Cloud Storage
Request JSON body:
{
  "filter_config": {
    "resource_paths_gcs_uri": "gs://BUCKET/DIRECTORY"
  },
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
To send your request, choose one of these options:
curl
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

cat > request.json << 'EOF'
{
  "filter_config": {
    "resource_paths_gcs_uri": "gs://BUCKET/DIRECTORY"
  },
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
EOF
Then execute the following command to send your REST request:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d @request.json \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:setBlobStorageSettings"

PowerShell
Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

@'
{
  "filter_config": {
    "resource_paths_gcs_uri": "gs://BUCKET/DIRECTORY"
  },
  "blobStorageSettings": {
    "blob_storage_class": "STORAGE_CLASS"
  }
}
'@ | Out-File -FilePath request.json -Encoding utf8
Then execute the following command to send your REST request:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json" `
-InFile request.json `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:setBlobStorageSettings" | Select-Object -Expand Content

- Get the status of the long-running operation.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- OPERATION_ID: the ID returned from the long-running operation
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
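The filter request body combines the Cloud Storage URI of the filter file with the target storage class. A small sketch that builds and sanity-checks that body before sending it; the filter_request_body helper is illustrative, while the JSON field names come from the request body shown above.

```python
import json

ALLOWED_CLASSES = {"STANDARD", "NEARLINE", "COLDLINE", "ARCHIVE"}

def filter_request_body(gcs_uri, storage_class):
    """Return the JSON body for a filter-based setBlobStorageSettings request."""
    if storage_class not in ALLOWED_CLASSES:
        raise ValueError(f"unsupported storage class: {storage_class}")
    if not gcs_uri.startswith("gs://"):
        raise ValueError(f"expected a gs:// URI, got: {gcs_uri!r}")
    return json.dumps({
        "filter_config": {"resource_paths_gcs_uri": gcs_uri},
        "blobStorageSettings": {"blob_storage_class": storage_class},
    }, indent=2)
```

For example, filter_request_body("gs://my-bucket/my-directory/archive-filters.txt", "ARCHIVE") produces a body you could save as request.json for the curl or PowerShell commands above.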
Check the storage class of a DICOM instance
Use the getStorageInfo method
The following samples show how to view the storage class of a DICOM instance.
REST
Use the projects.locations.datasets.dicomStores.dicomWeb.studies.series.instances.getStorageInfo method.
Before using any of the request data, make the following replacements:
- PROJECT_ID: the ID of your Google Cloud project
- LOCATION: the dataset location
- DATASET_ID: the DICOM store's parent dataset
- DICOM_STORE_ID: the DICOM store ID
- STUDY_INSTANCE_UID: the study instance unique identifier
- SERIES_INSTANCE_UID: the series instance unique identifier
- INSTANCE_UID: the instance unique identifier
To send your request, choose one of these options:
curl
Execute the following command:
curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:getStorageInfo"
PowerShell
Execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID:getStorageInfo" | Select-Object -Expand Content
You should receive a JSON response that contains the storage information for the instance, including its storage class.
Query exported DICOM metadata in BigQuery
You can also export DICOM metadata to BigQuery and query the BigQuery dataset to view the storage classes of DICOM instances.
For example, you can run the following query to view the study instance UID, the series instance UID, the instance UID, the blob storage size, and the blob storage class of all the instances in your BigQuery dataset:
SELECT StudyInstanceUID, SeriesInstanceUID, SOPInstanceUID, BlobStorageSize, StorageClass
FROM PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE
LIMIT 1000