Importing and exporting DICOM data using Cloud Storage

This page explains how to export DICOM instances to and import DICOM objects from Cloud Storage. A DICOM instance is typically an image, but can be another type of persistent data, such as a structured report. A Cloud Storage object is a DICOM instance that resides in Cloud Storage.

You can import and export bulk data between a Cloud Storage bucket and a DICOM store. For example, you might have many DICOM instance files that you want to import into a DICOM store. Rather than programmatically storing the data directly, you can store the data in a Cloud Storage bucket and then import the files into a DICOM store using a single import operation. For more information, see Cloud Storage.

To store a DICOM instance directly, such as from your local machine, store the DICOM data using the Store Transaction RESTful web service implemented in the Cloud Healthcare API. To retrieve a single instance or study from a DICOM store, retrieve the DICOM data using the Retrieve Transaction RESTful web service implemented in the Cloud Healthcare API.

Setting Cloud Storage permissions

Before importing and exporting DICOM data to and from Cloud Storage, you must grant extra permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.

Importing DICOM objects

The following samples show how to import DICOM objects from a Cloud Storage bucket.

Console

To import DICOM objects from a Cloud Storage bucket, complete the following steps:

  1. In the Cloud Console, go to the Datasets page.
    Go to Datasets
  2. Click the dataset that contains the DICOM store to which you are importing DICOM objects.
  3. In the list of data stores, choose Import from the Actions list for the DICOM store.

    The Import to DICOM store page appears.
  4. In the Project list, select a Cloud Storage project.
  5. In the Location list, select a Cloud Storage bucket.
  6. To set a specific location for importing files, do the following:
    1. Expand Advanced Options.
    2. Select Override Cloud Storage Path.
    3. To set a specific source for importing files, define the path using the following variables in the Location text box:
      • * - matches non-separator characters.
      • ** - matches characters, including separators. This can be used with a file name extension to match all files of the same type.
      • ? - matches 1 character.
  7. Click Import to import DICOM objects from the defined source.
  8. To track the status of the operation, click the Operations tab. After the operation completes, the following indications appear:
    • The Long-running operation status section has a green check mark under the OK heading.
    • The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
    If you encounter any errors, click Actions, and then click View details in Cloud Logging.

gcloud

To import DICOM objects from a Cloud Storage bucket, use the gcloud healthcare dicom-stores import gcs command. Specify the name of the parent dataset, the name of the DICOM store, and the location of the object in a Cloud Storage bucket.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a file name extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
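As a rough sketch of the wildcard semantics above, the patterns can be translated to regular expressions. This helper is purely illustrative (the name gcs_wildcard_to_regex is not part of any Google library), but it is useful for reasoning about which object names a pattern matches:

```python
import re

def gcs_wildcard_to_regex(pattern: str) -> re.Pattern:
    """Translate the import wildcards to a regular expression:
    ** matches any characters (including /), * matches 0 or more
    non-separator characters, and ? matches exactly 1 non-separator
    character."""
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            out.append(".*")
            i += 2
        elif pattern[i] == "*":
            out.append("[^/]*")
            i += 1
        elif pattern[i] == "?":
            out.append("[^/]")
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(out) + r"\Z")

# Examples mirroring the rules above:
pat = gcs_wildcard_to_regex("gs://my-bucket/dir/Example*.dcm")
print(bool(pat.match("gs://my-bucket/dir/Example22.dcm")))    # True
print(bool(pat.match("gs://my-bucket/dir/sub/Example.dcm")))  # False: * stops at /
```
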

The following sample shows how to import DICOM objects from a Cloud Storage bucket.

gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

After the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."

API

To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import method.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a file name extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm file name extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.

curl

To import the DICOM objects, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The location of the object in a Cloud Storage bucket

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsSource': {
        'uri': 'gs://BUCKET/*.dcm'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

If the request is successful, the server returns a response with the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To import the DICOM objects, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The location of the object in a Cloud Storage bucket

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsSource': {
      'uri': 'gs://BUCKET/*.dcm'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the request is successful, the server returns a response with the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// importDICOMInstance imports DICOM objects from GCS.
func importDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, contentURI string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ImportDicomDataRequest{
		GcsSource: &healthcare.GoogleCloudHealthcareV1DicomGcsSource{
			Uri: contentURI,
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Import(name, req).Do()
	if err != nil {
		return fmt.Errorf("Import: %v", err)
	}

	fmt.Fprintf(w, "Import to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsSource;
import com.google.api.services.healthcare.v1.model.ImportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.Operation;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreImport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new JacksonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreImport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/destination/dir"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store should be imported from.
    GoogleCloudHealthcareV1DicomGcsSource gcsSource =
        new GoogleCloudHealthcareV1DicomGcsSource().setUri(gcsUri);
    ImportDicomDataRequest importRequest = new ImportDicomDataRequest().setGcsSource(gcsSource);

    // Create request and configure any parameters.
    DicomStores.CloudHealthcareImport request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .healthcareImport(dicomStoreName, importRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store import complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredentials credential =
        GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          new HttpCredentialsAdapter(credential).initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const {google} = require('googleapis');
const healthcare = google.healthcare('v1');
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

const importDicomInstance = async () => {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  google.options({auth});

  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory/*.dcm'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      // The location of the DICOM instances in Cloud Storage
      gcsSource: {
        uri: `gs://${gcsUri}`,
      },
    },
  };

  const operation = await healthcare.projects.locations.datasets.dicomStores.import(
    request
  );
  const operationName = operation.data.name;

  const operationRequest = {name: operationName};

  // Wait fifteen seconds for the LRO to finish.
  await sleep(15000);

  // Check the LRO's status
  const operationStatus = await healthcare.projects.locations.datasets.operations.get(
    operationRequest
  );

  const {data} = operationStatus;

  if (data.error === undefined) {
    console.log('Successfully imported DICOM instances');
  } else {
    console.log('Encountered errors. Sample error:');
    console.log(
      'Resource on which error occured:',
      data.error.details[0]['sampleErrors'][0]['resource']
    );
    console.log(
      'Error code:',
      data.error.details[0]['sampleErrors'][0]['error']['code']
    );
    console.log(
      'Error message:',
      data.error.details[0]['sampleErrors'][0]['error']['message']
    );
  }
};

importDicomInstance();

Python

def import_dicom_instance(
    project_id, cloud_region, dataset_id, dicom_store_id, content_uri
):
    """Import data into the DICOM store by copying it from the specified
    source.
    """
    client = get_client()
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, cloud_region, dataset_id
    )
    dicom_store_name = "{}/dicomStores/{}".format(dicom_store_parent, dicom_store_id)

    body = {"gcsSource": {"uri": "gs://{}".format(content_uri)}}

    # Escape "import()" method keyword because "import"
    # is a reserved keyword in Python
    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .import_(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print("Imported DICOM instance: {}".format(content_uri))

    return response
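The Python sample above starts the import but does not wait for the long-running operation to finish. A minimal polling sketch is shown below; wait_for_operation and fetch_operation are hypothetical names, where fetch_operation(name) stands in for whatever returns the operation resource as a dict (for example, a wrapper around the operations get method shown in the curl samples):

```python
import time

def wait_for_operation(fetch_operation, name, interval=1.0, timeout=300.0):
    """Poll a long-running operation until it reports done=True.

    fetch_operation(name) must return the operation resource as a dict,
    mirroring the JSON responses shown in the samples above."""
    deadline = time.monotonic() + timeout
    while True:
        op = fetch_operation(name)
        if op.get("done"):
            if "error" in op:
                raise RuntimeError(f"Operation {name} failed: {op['error']}")
            return op
        if time.monotonic() > deadline:
            raise TimeoutError(f"Operation {name} did not finish in {timeout}s")
        time.sleep(interval)
```
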

Troubleshooting DICOM import requests

If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.

Exporting DICOM instances

The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.

Console

To export DICOM instances to Cloud Storage, complete the following steps:

  1. In the Cloud Console, go to the Datasets page.
    Go to Datasets
  2. Click the dataset that contains the DICOM store from which you are exporting DICOM instances.
  3. In the list of data stores, choose Export from the Actions list for the DICOM store.
  4. On the Export DICOM store page that appears, select Google Cloud Storage Bucket.
  5. In the Project list, select a Cloud Storage project.
  6. In the Location list, select a Cloud Storage bucket.
  7. In DICOM Export Settings, select the file type used to export the DICOM instances. The following types are available:
    • DICOM file (.dcm)
    • Octet stream
    • Image (.jpg, .png)
  8. To define an additional transfer syntax, choose the syntax from the Transfer Syntax list.
  9. Click Export to export DICOM instances to the defined location in Cloud Storage.
  10. To track the status of the operation, click the Operations tab. After the operation completes, the following indications appear:
    • The Long-running operation status section has a green check mark under the OK heading.
    • The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
    If you encounter any errors, click Actions, and then click View details in Cloud Logging.

gcloud

To export DICOM instances to a Cloud Storage bucket, use the gcloud healthcare dicom-stores export gcs command.

  • Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
  • If the command specifies a directory that does not exist, the directory is created.

The following sample shows the gcloud healthcare dicom-stores export gcs command.

gcloud healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

After the command completes, the response includes done: true.

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': "..."

API

To export DICOM instances to a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.export method.

  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
  • If the command specifies a directory that does not exist, the directory is created.

curl

To export DICOM instances, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

If the request is successful, the server returns a response with the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM instances, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the request is successful, the server returns a response with the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL",
    "counter": {
       "success": SUCCESSFUL_INSTANCES,
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// exportDICOMInstance exports DICOM objects to GCS.
func exportDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, destination string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %v", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ExportDicomDataRequest{
		GcsDestination: &healthcare.GoogleCloudHealthcareV1DicomGcsDestination{
			UriPrefix: destination, // "gs://my-bucket/path/to/prefix/"
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Export(name, req).Do()
	if err != nil {
		return fmt.Errorf("Export: %v", err)
	}

	fmt.Fprintf(w, "Export to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.ExportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsDestination;
import com.google.api.services.healthcare.v1.model.Operation;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreExport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new JacksonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreExport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/destination/dir"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store will be exported to.
    GoogleCloudHealthcareV1DicomGcsDestination gcsDestination =
        new GoogleCloudHealthcareV1DicomGcsDestination().setUriPrefix(gcsUri);
    ExportDicomDataRequest exportRequest =
        new ExportDicomDataRequest().setGcsDestination(gcsDestination);

    // Create request and configure any parameters.
    DicomStores.Export request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .export(dicomStoreName, exportRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store export complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredentials credential =
        GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          new HttpCredentialsAdapter(credential).initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const {google} = require('googleapis');
const healthcare = google.healthcare('v1');

const exportDicomInstanceGcs = async () => {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  google.options({auth});

  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      gcsDestination: {
        // The destination location of the DICOM instances in Cloud Storage
        uriPrefix: `gs://${gcsUri}`,
        // The format to use for the output files, per the MIME types supported in the DICOM spec
        mimeType: 'application/dicom',
      },
    },
  };

  await healthcare.projects.locations.datasets.dicomStores.export(request);
  console.log(`Exported DICOM instances to ${gcsUri}`);
};

exportDicomInstanceGcs();

Python

def export_dicom_instance(
    project_id, cloud_region, dataset_id, dicom_store_id, uri_prefix
):
    """Export data to a Google Cloud Storage bucket by copying
    it from the DICOM store."""
    client = get_client()
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, cloud_region, dataset_id
    )
    dicom_store_name = "{}/dicomStores/{}".format(dicom_store_parent, dicom_store_id)

    body = {"gcsDestination": {"uriPrefix": "gs://{}".format(uri_prefix)}}

    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .export(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print("Exported DICOM instances to bucket: gs://{}".format(uri_prefix))

    return response

Exporting DICOM instances using filters

By default, when you export DICOM data to Cloud Storage, all DICOM files in the specified DICOM store are exported. Similarly, when you export DICOM metadata to BigQuery, the metadata for all of the DICOM data in the specified DICOM store is exported.

You can export a subset of DICOM data or metadata using a filter. You define the filter in a filter file.

Configuring filter files

A filter file specifies which DICOM files to export to Cloud Storage or BigQuery. You can configure filter files at the following levels:

  • At the study level
  • At the series level
  • At the instance level

The filter file is made up of multiple lines, with each line defining the study, series, or instance you want to export. Each line uses the format /studies/STUDY_UID[/series/SERIES_UID[/instances/INSTANCE_UID]].

If a study, series, or instance is not specified in the filter file when you pass in the filter file, that study, series, or instance will not be exported.

Only the /studies/STUDY_UID portion of the path is required. You can export an entire study by specifying /studies/STUDY_UID, or you can export an entire series by specifying /studies/STUDY_UID/series/SERIES_UID.

Consider the following filter file. The filter file will result in one study, two series, and three individual instances being exported:

/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333
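A small parser sketch for this line format is shown below (a hypothetical helper for validating filter files before uploading them; the Cloud Healthcare API itself only requires that the uploaded file contain lines in this format):

```python
import re

# Matches /studies/STUDY_UID[/series/SERIES_UID[/instances/INSTANCE_UID]]
_FILTER_LINE = re.compile(
    r"^/studies/(?P<study>[0-9.]+)"
    r"(?:/series/(?P<series>[0-9.]+)"
    r"(?:/instances/(?P<instance>[0-9.]+))?)?$"
)

def parse_filter_line(line: str):
    """Return a (study_uid, series_uid, instance_uid) tuple; missing
    levels are None. Raises ValueError for malformed lines."""
    match = _FILTER_LINE.match(line.strip())
    if not match:
        raise ValueError(f"Invalid filter line: {line!r}")
    return (match["study"], match["series"], match["instance"])

print(parse_filter_line("/studies/1.123.456.789"))
# ('1.123.456.789', None, None)
print(parse_filter_line("/studies/1.888.999.222/series/123.456/instances/111"))
# ('1.888.999.222', '123.456', '111')
```
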

Creating a filter file using BigQuery

To create a filter file, you typically begin by exporting the metadata from a DICOM store to BigQuery. This lets you use BigQuery to view the study, series, and instance UIDs of the DICOM data in your DICOM store. You can then complete the following steps:

  1. Query for the study, series, and instance UIDs you are interested in. For example, after exporting the DICOM metadata to BigQuery, you could run the following query to concatenate the study, series, and instance UIDs into a format compatible with the filter file requirements:
    SELECT CONCAT
        ('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
    FROM
        [PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE]
    
  2. If the query returns a large result set, you can materialize a new table by saving the query results to a destination table in BigQuery.
  3. If you saved the query results to a destination table, you can save the contents of the destination table to a file and export it to Cloud Storage. For information on how to do so, see Exporting table data. The exported file is your filter file. You use the location of the filter file in Cloud Storage when specifying the filter in the export operation.
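If you are assembling filter lines in Python rather than SQL, the same concatenation can be sketched as follows (an illustrative helper; the function name is not part of any client library):

```python
def format_filter_lines(rows):
    """Build filter file lines from (study, series, instance) UID tuples,
    matching the /studies/.../series/.../instances/... format above."""
    lines = []
    for study_uid, series_uid, instance_uid in rows:
        lines.append(
            f"/studies/{study_uid}/series/{series_uid}/instances/{instance_uid}"
        )
    return "\n".join(lines)

rows = [("1.888.999.222", "123.456", "111"),
        ("1.888.999.222", "123.456", "222")]
print(format_filter_lines(rows))
```
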

Creating a filter file manually

You can create a filter file with custom content and upload it to a Cloud Storage bucket. You use the location of the filter file in Cloud Storage when specifying the filter in the export operation. The following sample shows how to upload a filter file to a Cloud Storage bucket using the gsutil cp command:
gsutil cp PATH/TO/FILTER_FILE gs://BUCKET/DIRECTORY

Passing in the filter file

After you create a filter file, call the DICOM export operation and pass in the filter file using the REST API. The following samples show how to export DICOM data using a filter.

gcloud

To export DICOM data to Cloud Storage using a filter, use the gcloud beta healthcare dicom-stores export gcs command:

gcloud beta healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://DESTINATION_BUCKET/DIRECTORY \
  --filter-config-gcs-uri=gs://BUCKET/DIRECTORY/FILTER_FILE

Replace the following:

  • DICOM_STORE_ID: the identifier for the DICOM store
  • DATASET_ID: the name of the DICOM store's parent dataset
  • LOCATION: the location of the DICOM store's parent dataset
  • DESTINATION_BUCKET/DIRECTORY: the destination Cloud Storage bucket
  • BUCKET/DIRECTORY/FILTER_FILE: the location of the filter file in a Cloud Storage bucket

The output is the following:

Request issued for: [DICOM_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID

To view the status of the operation, run the gcloud healthcare operations describe command, providing the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

Replace the following:

  • OPERATION_ID: the ID number returned from the previous response
  • DATASET_ID: the name of the DICOM store's parent dataset
  • LOCATION: the location of the DICOM store's parent dataset

The output is the following:

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: 'CREATE_TIME'
  endTime: 'END_TIME'
  logsUrl: 'https://console.cloud.google.com/logs/viewer/CLOUD_LOGGING_URL'
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': '...'

API

To export DICOM data using a filter, use the projects.locations.datasets.dicomStores.export method.

curl

To export DICOM data using a filter file, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket
  • The location of the filter file in a Cloud Storage bucket

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      },
      'filterConfig': {
        'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
      }
    }" "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains the name of the operation. Use the Operation get method to track the status of the operation:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM data using a filter file, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket
  • The location of the filter file in a Cloud Storage bucket

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    },
    'filterConfig': {
      'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains the name of the operation. Use the Operation get method to track the status of the operation:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME" | Select-Object -Expand Content

If the request is successful, the server returns the following response with the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Troubleshooting DICOM export requests

If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.