Importing and exporting DICOM data using Cloud Storage

This page explains how to export DICOM instances to and import DICOM objects from Cloud Storage. A DICOM instance is typically an image, but it can be another type of persistent data, such as a structured report. A Cloud Storage object is a DICOM instance that resides in Cloud Storage.

You can import and export bulk data between a Cloud Storage bucket and a DICOM store. For example, you might have many DICOM instance files that you want to import into a DICOM store. Rather than storing the data directly with programmatic calls, you can store the data in a Cloud Storage bucket and then import the files into a DICOM store using a single import operation. For more information, see Cloud Storage.

To store a DICOM instance directly, such as from your local machine, you can store DICOM data using the Store Transaction RESTful web service implemented in the Cloud Healthcare API. To retrieve a single instance or study from a DICOM store, you can retrieve DICOM data using the Retrieve Transaction RESTful web service implemented in the Cloud Healthcare API.
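
For orientation, the following is a minimal sketch of a direct store request against the Store Transaction (STOW-RS) endpoint using curl. It assumes a local file named instance.dcm and the same placeholder resource path used throughout this page; see the Store Transaction documentation for the authoritative request format.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/dicom" \
    --data-binary @instance.dcm \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID/dicomWeb/studies"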

Setting Cloud Storage permissions

Before exporting and importing DICOM data to and from Cloud Storage, you must grant additional permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.
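
For example, you can grant the needed roles on the bucket with gsutil. The following is only a sketch: it assumes the service agent uses the default service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com address and that roles/storage.objectViewer (for import) and roles/storage.objectAdmin (for export) are sufficient; the DICOM store Cloud Storage permissions page is authoritative.

# Grant read access for imports (assumed role; confirm against the permissions page).
gsutil iam ch \
  serviceAccount:service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://BUCKET

# Grant write access for exports (assumed role; confirm against the permissions page).
gsutil iam ch \
  serviceAccount:service-PROJECT_NUMBER@gcp-sa-healthcare.iam.gserviceaccount.com:roles/storage.objectAdmin \
  gs://BUCKET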

Importing DICOM objects

The following samples show how to import DICOM objects from a Cloud Storage bucket.

Console

To import DICOM objects from a Cloud Storage bucket, complete the following steps:

  1. In the Cloud Console, go to the Datasets page.

    Go to the Datasets page

  2. Click the dataset that you want to import DICOM objects into.

  3. In the list of DICOM stores, choose Import from the Actions list.

    The Import to DICOM store page is displayed.

  4. In the Project list, select a Cloud Storage project.

  5. In the Location list, select a Cloud Storage bucket.

  6. To set a specific location for importing files, do the following:

    1. Expand Advanced Options.
    2. Select Override Cloud Storage Path.
    3. To set a specific source for importing files, define the path in the Location text box using the following variables:
      • * - matches non-separator characters.
      • ** - matches characters, including separators. This can be used with a file name extension to match all files of the same type.
      • ? - matches 1 character.
  7. Click Import to import DICOM objects from the defined source.

gcloud

The following samples apply to the v1beta1 version of the Cloud Healthcare API.

To import DICOM objects from a Cloud Storage bucket, use the gcloud beta healthcare dicom-stores import gcs command. Specify the name of the parent dataset, the name of the DICOM store, and the location of the objects in the Cloud Storage bucket.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported (a sketch after this list shows how to preview which objects a pattern matches):
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). It must be used at the end of a path and with no other wildcards in the path. It can also be used with a file name extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
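
Before starting an import, you can preview which objects a pattern would pick up, since gsutil supports similar *, **, and ? wildcards. A quick sketch:

# List the objects the pattern matches before importing them.
gsutil ls gs://BUCKET/DIRECTORY/**.dcm

# Count them, for comparison with the operation's success counter afterwards.
gsutil ls gs://BUCKET/DIRECTORY/**.dcm | wc -l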

The following sample shows how to import DICOM objects from a Cloud Storage bucket.

gcloud beta healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud beta healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud beta healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID 

When the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ImportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."

API

To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import method.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). It must be used at the end of a path and with no other wildcards in the path. It can also be used with a file name extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.

curl

To import DICOM objects, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the location of the objects in a Cloud Storage bucket, and an access token.

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsSource': {
        'uri': 'gs://BUCKET/*.dcm'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"

If the request succeeds, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, you can use the Operation get method:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

If the request succeeds, the server returns a response containing the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}
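
If you need a script to block until the import finishes, a simple polling loop over the Operation get method works. The sketch below is a convenience under stated assumptions (the placeholder operation name above, and grep against pretty-printed JSON instead of a JSON parser), not a documented pattern.

OPERATION="projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

# Poll every 10 seconds until the long-running operation reports "done": true.
until curl -s \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/${OPERATION}" | grep -q '"done": true'; do
  sleep 10
done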

PowerShell

To import DICOM objects, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the location of the objects in a Cloud Storage bucket, and an access token.

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content

If the request succeeds, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, you can use the Operation get method:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the request succeeds, the server returns a response containing the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// importDICOMInstance imports DICOM objects from GCS.
func importDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, contentURI string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ImportDicomDataRequest{
		GcsSource: &healthcare.GoogleCloudHealthcareV1DicomGcsSource{
			Uri: contentURI,
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Import(name, req).Do()
	if err != nil {
		return fmt.Errorf("Import: %v", err)
	}

	fmt.Fprintf(w, "Import to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpHeaders;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsSource;
import com.google.api.services.healthcare.v1.model.ImportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.Operation;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreImport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new JacksonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreImport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/source/dir/*.dcm"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store should be imported from.
    GoogleCloudHealthcareV1DicomGcsSource gcsSource =
        new GoogleCloudHealthcareV1DicomGcsSource().setUri(gcsUri);
    ImportDicomDataRequest importRequest = new ImportDicomDataRequest().setGcsSource(gcsSource);

    // Create request and configure any parameters.
    DicomStores.CloudHealthcareImport request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .healthcareImport(dicomStoreName, importRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store import complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredential credential =
        GoogleCredential.getApplicationDefault(HTTP_TRANSPORT, JSON_FACTORY)
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          credential.initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const {google} = require('googleapis');
const healthcare = google.healthcare('v1');
// Local helper that returns a promise resolving after the given number of milliseconds.
const sleep = require('../sleep');

const importDicomInstance = async () => {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  google.options({auth});

  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory/*.dcm'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      // The location of the DICOM instances in Cloud Storage
      gcsSource: {
        uri: `gs://${gcsUri}`,
      },
    },
  };

  const operation = await healthcare.projects.locations.datasets.dicomStores.import(
    request
  );
  const operationName = operation.data.name;

  const operationRequest = {name: operationName};

  // Wait fifteen seconds for the LRO to finish.
  await sleep(15000);

  // Check the LRO's status
  const operationStatus = await healthcare.projects.locations.datasets.operations.get(
    operationRequest
  );

  const {data} = operationStatus;

  if (data.error === undefined) {
    console.log('Successfully imported DICOM instances');
  } else {
    console.log('Encountered errors. Sample error:');
    console.log(
      'Resource on which error occurred:',
      data.error.details[0]['sampleErrors'][0]['resource']
    );
    console.log(
      'Error code:',
      data.error.details[0]['sampleErrors'][0]['error']['code']
    );
    console.log(
      'Error message:',
      data.error.details[0]['sampleErrors'][0]['error']['message']
    );
  }
};

importDicomInstance();

Python

def import_dicom_instance(
    project_id, cloud_region, dataset_id, dicom_store_id, content_uri
):
    """Import data into the DICOM store by copying it from the specified
    source.
    """
    # get_client() is a helper defined elsewhere in the sample; it typically
    # returns googleapiclient.discovery.build("healthcare", "v1").
    client = get_client()
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, cloud_region, dataset_id
    )
    dicom_store_name = "{}/dicomStores/{}".format(dicom_store_parent, dicom_store_id)

    body = {"gcsSource": {"uri": "gs://{}".format(content_uri)}}
    # Escape "import()" method keyword because "import"
    # is a reserved keyword in Python
    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .import_(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print("Imported DICOM instance: {}".format(content_uri))

    return response

Troubleshooting DICOM import requests

If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
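
As a rough starting point, you can pull recent Healthcare API errors with gcloud. The filter below is an assumption (the logsUrl field in the operation metadata links directly to the relevant entries), so adjust it to match what you see in the Logs Explorer.

# The resource type and severity filter are assumptions; the operation's logsUrl is authoritative.
gcloud logging read 'resource.type="healthcare_dataset" AND severity>=ERROR' \
    --project=PROJECT_ID \
    --limit=20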

Exporting DICOM instances

The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.

Console

To export DICOM instances to Cloud Storage, complete the following steps:

  1. In the Cloud Console, go to the Datasets page.

    Go to the Datasets page

  2. Click the dataset that you want to export DICOM instances from.

  3. In the list of DICOM stores, choose Export from the Actions list.

    The Export DICOM store page is displayed.

  4. Select Google Cloud Storage Bucket.

  5. In the Project list, select a Cloud Storage project.

  6. In the Location list, select a Cloud Storage bucket.

  7. In DICOM Export Settings, select the file type used to export the DICOM instances. The following types are available:

    • DICOM file (.dcm)
    • octet-stream
    • Image (.jpg, .png)
  8. To define an additional transfer syntax, choose the syntax from the Transfer Syntax list.

  9. Click Export to export DICOM instances to the defined location in Cloud Storage.

gcloud

The following samples apply to the v1beta1 version of the Cloud Healthcare API.

To export DICOM instances to a Cloud Storage bucket, use the gcloud beta healthcare dicom-stores export gcs command.

  • Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
  • If the command specifies a directory that does not exist, the directory is created.

The following sample shows the gcloud beta healthcare dicom-stores export gcs command.

gcloud beta healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY 

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud beta healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud beta healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID 

When the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."
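
After the operation completes, one way to sanity-check the result is to list the exported objects, assuming the export wrote one .dcm file per instance under the prefix you specified:

# Count the exported .dcm objects and compare against the success counter above.
gsutil ls gs://BUCKET/DIRECTORY/**.dcm | wc -l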

API

To export DICOM instances to a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.export method.

  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
  • If the command specifies a directory that does not exist, the directory is created.

curl

To export DICOM instances, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the destination Cloud Storage bucket, and an access token.

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request succeeds, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, you can use the Operation get method:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

If the request succeeds, the server returns a response containing the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM instances, make a POST request and provide the name of the parent dataset, the name of the DICOM store, the destination Cloud Storage bucket, and an access token.

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request succeeds, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, you can use the Operation get method:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the request succeeds, the server returns a response containing the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// exportDICOMInstance exports DICOM objects to GCS.
func exportDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, destination string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ExportDicomDataRequest{
		GcsDestination: &healthcare.GoogleCloudHealthcareV1DicomGcsDestination{
			UriPrefix: destination, // "gs://my-bucket/path/to/prefix/"
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Export(name, req).Do()
	if err != nil {
		return fmt.Errorf("Export: %v", err)
	}

	fmt.Fprintf(w, "Export to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpHeaders;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.ExportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsDestination;
import com.google.api.services.healthcare.v1.model.Operation;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreExport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new JacksonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreExport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/destination/dir"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store will be exported to.
    GoogleCloudHealthcareV1DicomGcsDestination gcsDestination =
        new GoogleCloudHealthcareV1DicomGcsDestination().setUriPrefix(gcsUri);
    ExportDicomDataRequest exportRequest =
        new ExportDicomDataRequest().setGcsDestination(gcsDestination);

    // Create request and configure any parameters.
    DicomStores.Export request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .export(dicomStoreName, exportRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store export complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredential credential =
        GoogleCredential.getApplicationDefault(HTTP_TRANSPORT, JSON_FACTORY)
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          credential.initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const {google} = require('googleapis');
const healthcare = google.healthcare('v1');

const exportDicomInstanceGcs = async () => {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  google.options({auth});

  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      gcsDestination: {
        // The destination location of the DICOM instances in Cloud Storage
        uriPrefix: `gs://${gcsUri}`,
        // The format to use for the output files, per the MIME types supported in the DICOM spec
        mimeType: 'application/dicom',
      },
    },
  };

  await healthcare.projects.locations.datasets.dicomStores.export(request);
  console.log(`Exported DICOM instances to ${gcsUri}`);
};

exportDicomInstanceGcs();

Python

def export_dicom_instance(
    project_id, cloud_region, dataset_id, dicom_store_id, uri_prefix
):
    """Export data to a Google Cloud Storage bucket by copying
    it from the DICOM store."""
    client = get_client()
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, cloud_region, dataset_id
    )
    dicom_store_name = "{}/dicomStores/{}".format(dicom_store_parent, dicom_store_id)

    body = {"gcsDestination": {"uriPrefix": "gs://{}".format(uri_prefix)}}
    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .export(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print("Exported DICOM instances to bucket: gs://{}".format(uri_prefix))

    return response

Troubleshooting DICOM export requests

If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.