Importing and exporting DICOM data using Cloud Storage

This page explains how to import DICOM objects from, and export DICOM instances to, Cloud Storage.

Using Cloud Storage, you can import and export large amounts of DICOM data. To store a single instance, use the STOW-RS RESTful web service. To retrieve a single instance or study, use the WADO-RS RESTful web service as implemented in the Cloud Healthcare API.
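
For example, the following is a minimal sketch of retrieving a single instance with WADO-RS, assuming the requests library is installed and the gcloud CLI supplies the access token; every uppercase path segment is a placeholder you must replace:

# Minimal WADO-RS retrieval sketch. All uppercase path segments are
# placeholders; the token is fetched from the gcloud CLI for brevity.
import subprocess

import requests

token = subprocess.run(
    ["gcloud", "auth", "print-access-token"],
    capture_output=True, text=True, check=True,
).stdout.strip()

url = (
    "https://healthcare.googleapis.com/v1beta1/"
    "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/"
    "dicomStores/DICOM_STORE_ID/dicomWeb/"
    "studies/STUDY_UID/series/SERIES_UID/instances/INSTANCE_UID"
)
response = requests.get(
    url,
    headers={
        "Authorization": f"Bearer {token}",
        # WADO-RS returns the instance as multipart/related DICOM data.
        "Accept": 'multipart/related; type="application/dicom"; transfer-syntax=*',
    },
)
response.raise_for_status()
print(f"Retrieved {len(response.content)} bytes")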

A DICOM instance is typically an image, but can be another type of persistent data such as a structured report. A Cloud Storage object is a DICOM instance that resides in Cloud Storage.

Setting Cloud Storage permissions

Before importing DICOM data from or exporting DICOM data to Cloud Storage, you must grant additional permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.
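
If you manage bucket IAM programmatically, the following is a hedged sketch using the google-cloud-storage client. The service-agent address format and the role names in the comments are typical values, not taken from this page; confirm them on the permissions page linked above.

# Sketch of granting the Cloud Healthcare Service Agent access to a bucket.
from google.cloud import storage

def grant_bucket_access(bucket_name, project_number, role):
    """Adds an IAM binding for the Healthcare Service Agent on a bucket."""
    # The Healthcare Service Agent typically has this address form; verify it
    # on your project's IAM page.
    member = ("serviceAccount:service-" + project_number
              + "@gcp-sa-healthcare.iam.gserviceaccount.com")
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({"role": role, "members": {member}})
    bucket.set_iam_policy(policy)

# Typical roles (verify on the permissions page): roles/storage.objectViewer
# for import, roles/storage.objectAdmin for export.
# grant_bucket_access('BUCKET', 'PROJECT_NUMBER', 'roles/storage.objectViewer')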

Importing DICOM objects

The following samples show how to import DICOM objects from a Cloud Storage bucket.

gcloud

The following sample works with the v1alpha2 version of the Cloud Healthcare API.

To import DICOM objects from a Cloud Storage bucket, use the gcloud alpha healthcare dicom-stores import command. Specify the name of the parent dataset, the name of the DICOM store, and the location of the object in a Cloud Storage bucket.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported (a sketch after this list illustrates their semantics):
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a file name extension (such as .dcm), which imports all files with the file name extension in the specified directory and its sub-directories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its sub-directories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
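
To illustrate these rules, the following sketch translates a pattern into a regular expression so you can check locally which object names it would select. It mirrors the documented semantics only; it is not the service's implementation, and gcs_wildcard_to_regex is a hypothetical helper name.

import re

def gcs_wildcard_to_regex(pattern):
    """Translates the documented import wildcards into a regular expression."""
    parts = []
    i = 0
    while i < len(pattern):
        if pattern[i:i + 2] == '**':
            parts.append('.*')      # ** matches any characters, including '/'
            i += 2
        elif pattern[i] == '*':
            parts.append('[^/]*')   # * matches any characters except '/'
            i += 1
        elif pattern[i] == '?':
            parts.append('[^/]')    # ? matches exactly one character
            i += 1
        else:
            parts.append(re.escape(pattern[i]))
            i += 1
    return re.compile(''.join(parts) + r'\Z')

# Example checks from the rules above:
m = gcs_wildcard_to_regex('DIRECTORY/Example?.dcm')
print(bool(m.match('DIRECTORY/Example1.dcm')))    # True
print(bool(m.match('DIRECTORY/Example01.dcm')))   # False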

The following sample shows how to import DICOM objects from a Cloud Storage bucket.

gcloud alpha healthcare dicom-stores import DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=REGION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud alpha healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud alpha healthcare operations describe OPERATION_ID \
  --dataset=DATASET_ID

After the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1alpha2.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1alpha2.dicom.DicomService.ImportDicomData
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."

API

To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import method.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a file name extension (such as .dcm), which imports all files with the file name extension in the specified directory and its sub-directories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its sub-directories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.

curl command

To import DICOM objects, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the location of the object in a Cloud Storage bucket, and an access token.

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer "$(gcloud auth print-access-token) \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsSource': {
        'uri': 'gs://BUCKET/*.dcm'
      }
    }" "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"

If the request is successful, the server returns a 200 OK HTTP status code and the response in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"
}

The response contains an operation name. You can use the Operation get method to track the status of the operation:

curl -X GET \
    -H "Authorization: Bearer "$(gcloud auth print-access-token) \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"

If the request is successful, the server returns a 200 OK HTTP status code and a response with the status of the operation in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To import DICOM objects, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the location of the object in a Cloud Storage bucket, and an access token.

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content

If the request is successful, the server returns a 200 OK HTTP status code and the response in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"
}

The response contains an operation name. You can use the Operation get method to track the status of the operation:

$cred = gcloud auth print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME" | Select-Object -Expand Content

If the request is successful, the server returns a 200 OK HTTP status code and a response with the status of the operation in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1beta1"
)

// importDICOMInstance imports DICOM objects from GCS.
func importDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, contentURI string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ImportDicomDataRequest{
		GcsSource: &healthcare.GoogleCloudHealthcareV1beta1DicomGcsSource{
			Uri: contentURI,
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Import(name, req).Do()
	if err != nil {
		return fmt.Errorf("Import: %v", err)
	}

	fmt.Fprintf(w, "Import to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.HealthcareQuickstart;
import com.google.api.services.healthcare.v1beta1.model.GoogleCloudHealthcareV1beta1DicomGcsSource;
import com.google.api.services.healthcare.v1beta1.model.ImportDicomDataRequest;
import com.google.api.services.healthcare.v1beta1.model.Operation;

import java.io.IOException;

public class DicomStoreImport {
  public static void importDicomStoreInstance(String dicomStoreName, String uri)
      throws IOException {
    GoogleCloudHealthcareV1beta1DicomGcsSource gcsSource =
        new GoogleCloudHealthcareV1beta1DicomGcsSource();
    gcsSource.setUri("gs://" + uri);
    ImportDicomDataRequest importRequest = new ImportDicomDataRequest();
    importRequest.setGcsSource(gcsSource);
    Operation importOperation = HealthcareQuickstart.getCloudHealthcareClient()
        .projects()
        .locations()
        .datasets()
        .dicomStores()
        .healthcareImport(dicomStoreName, importRequest)
        .execute();
    System.out.println("Importing Dicom store op name: " + importOperation.getName());
  }
}

Node.js

function importDicomObject(
  client,
  projectId,
  cloudRegion,
  datasetId,
  dicomStoreId,
  gcsUri
) {
  // Token retrieved in callback
  // getToken(serviceAccountJson, function(cb) {...});
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket'
  const dicomStoreName = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;

  const request = {
    name: dicomStoreName,
    resource: {
      gcsSource: {
        uri: `gs://${gcsUri}`,
      },
    },
  };

  client.projects.locations.datasets.dicomStores
    .import(request)
    .then(() => {
      console.log(`Imported DICOM objects from bucket ${gcsUri}`);
    })
    .catch(err => {
      console.error(err);
    });
}

Python

from googleapiclient.errors import HttpError

# get_client(service_account_json) builds the authorized discovery client
# and is defined elsewhere in the sample set.

def import_dicom_instance(
        service_account_json,
        project_id,
        cloud_region,
        dataset_id,
        dicom_store_id,
        content_uri):
    """Import data into the DICOM store by copying it from the specified
    source.
    """
    client = get_client(service_account_json)
    dicom_store_parent = 'projects/{}/locations/{}/datasets/{}'.format(
        project_id, cloud_region, dataset_id)
    dicom_store_name = '{}/dicomStores/{}'.format(
        dicom_store_parent, dicom_store_id)

    body = {
        "gcsSource": {
            "uri": 'gs://{}'.format(content_uri)
        }
    }

    # Escape "import()" method keyword because "import"
    # is a reserved keyword in Python
    request = client.projects().locations().datasets().dicomStores().import_(
        name=dicom_store_name, body=body)

    try:
        response = request.execute()
        print('Imported DICOM instance: {}'.format(content_uri))
        return response
    except HttpError as e:
        print('Error, DICOM instance not imported: {}'.format(e))
        return ""

Exporting DICOM instances

The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.

gcloud

The following sample works with the v1alpha2 version of the Cloud Healthcare API.

To export DICOM instances to a Cloud Storage bucket, use the gcloud alpha healthcare dicom-stores export command.

  • Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
  • If the command specifies a directory that does not exist, the directory is created.

The following sample shows the gcloud alpha healthcare dicom-stores export command.

gcloud alpha healthcare dicom-stores export DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=REGION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud alpha healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud alpha healthcare operations describe OPERATION_ID \
  --dataset=DATASET_ID

After the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1alpha2.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1alpha2.dicom.DicomService.ExportDicomData
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."

API

To export DICOM instances to a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.export method.

  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
  • If the command specifies a directory that does not exist, the directory is created.

curl command

To export DICOM instances, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the destination Cloud Storage bucket, and an access token.

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer "$(gcloud auth print-access-token) \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      }
    }" "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request is successful, the server returns a 200 OK HTTP status code and the response in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"
}

The response contains an operation name. You can use the Operation get method to track the status of the operation:

curl -X GET \
    -H "Authorization: Bearer "$(gcloud auth print-access-token) \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"

If the request is successful, the server returns a 200 OK HTTP status code and a response with the status of the operation in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME"
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM instances, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the destination Cloud Storage bucket, and an access token.

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request is successful, the server returns a 200 OK HTTP status code and the response in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"
}

The response contains an operation name. You can use the Operation get method to track the status of the operation:

$cred = gcloud auth print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME" | Select-Object -Expand Content

If the request is successful, the server returns a 200 OK HTTP status code and a response with the status of the operation in JSON format:

200 OK
{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1beta1"
)

// exportDICOMInstance exports DICOM objects to GCS.
func exportDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, destination string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ExportDicomDataRequest{
		GcsDestination: &healthcare.GoogleCloudHealthcareV1beta1DicomGcsDestination{
			UriPrefix: destination, // "gs://my-bucket/path/to/prefix/"
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Export(name, req).Do()
	if err != nil {
		return fmt.Errorf("Export: %v", err)
	}

	fmt.Fprintf(w, "Export to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.HealthcareQuickstart;
import com.google.api.services.healthcare.v1beta1.model.ExportDicomDataRequest;
import com.google.api.services.healthcare.v1beta1.model.GoogleCloudHealthcareV1beta1DicomGcsDestination;
import com.google.api.services.healthcare.v1beta1.model.Operation;

import java.io.IOException;

public class DicomStoreExport {
  public static void exportDicomStoreInstance(String dicomStoreName, String uriPrefix)
      throws IOException {
    ExportDicomDataRequest exportRequest = new ExportDicomDataRequest();
    GoogleCloudHealthcareV1beta1DicomGcsDestination gcdDestination =
        new GoogleCloudHealthcareV1beta1DicomGcsDestination();
    gcdDestination.setUriPrefix("gs://" + uriPrefix);
    exportRequest.setGcsDestination(gcdDestination);
    Operation exportOperation = HealthcareQuickstart.getCloudHealthcareClient()
        .projects()
        .locations()
        .datasets()
        .dicomStores()
        .export(dicomStoreName, exportRequest)
        .execute();
    System.out.println("Exporting Dicom store op name: " + exportOperation.getName());
  }
}

Node.js

function exportDicomInstanceGcs(
  client,
  projectId,
  cloudRegion,
  datasetId,
  dicomStoreId,
  uriPrefix
) {
  // Token retrieved in callback
  // getToken(serviceAccountJson, function(cb) {...});
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const uriPrefix = 'my-bucket'
  const dicomStoreName = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;

  const request = {
    name: dicomStoreName,
    resource: {
      gcsDestination: {
        uriPrefix: `gs://${uriPrefix}`,
      },
    },
  };

  client.projects.locations.datasets.dicomStores
    .export(request)
    .then(() => {
      console.log(`Exported DICOM instances to bucket ${uriPrefix}`);
    })
    .catch(err => {
      console.error(err);
    });
}

Python

from googleapiclient.errors import HttpError

def export_dicom_instance(
        service_account_json,
        project_id,
        cloud_region,
        dataset_id,
        dicom_store_id,
        uri_prefix):
    """Export data to a Google Cloud Storage bucket by copying
    it from the DICOM store."""
    client = get_client(service_account_json)
    dicom_store_parent = 'projects/{}/locations/{}/datasets/{}'.format(
        project_id, cloud_region, dataset_id)
    dicom_store_name = '{}/dicomStores/{}'.format(
        dicom_store_parent, dicom_store_id)

    body = {
        "gcsDestination": {
            "uriPrefix": 'gs://{}'.format(uri_prefix)
        }
    }

    request = client.projects().locations().datasets().dicomStores().export(
        name=dicom_store_name, body=body)

    try:
        response = request.execute()
        print('Exported DICOM instances to bucket: gs://{}'.format(uri_prefix))
        return response
    except HttpError as e:
        print('Error, DICOM instances not exported: {}'.format(e))
        return ""
