Import and export DICOM data using Cloud Storage

This page explains how to export DICOM instances to Cloud Storage and how to import DICOM objects from Cloud Storage. A DICOM instance is typically an image, but it can be another type of persistent data, such as a structured report. A DICOM object in Cloud Storage is a DICOM instance that resides in Cloud Storage. For more information, see Cloud Storage.

Set Cloud Storage permissions

Before you can export and import DICOM data using Cloud Storage, you must grant additional permissions to the Cloud Healthcare Service Agent service account. For more information, see DICOM store Cloud Storage permissions.

Import DICOM objects

You can import multiple DICOM instance files into a DICOM store using the Google Cloud console, the gcloud CLI, or the Cloud Healthcare API.

The following samples show how to import DICOM objects from a Cloud Storage bucket.

Console

To import DICOM objects from a Cloud Storage bucket, complete the following steps:

  1. In the Google Cloud console, go to the Datasets page.
    Go to Datasets
  2. Click the dataset that contains the DICOM store to which you are importing DICOM objects.
  3. In the list of data stores, choose Import from the Actions list for the DICOM store.

    The Import to DICOM store page appears.
  4. In the Project list, select a Cloud Storage project.
  5. In the Location list, select a Cloud Storage bucket.
  6. To set a specific location for importing files, do the following:
    1. Expand Advanced Options.
    2. Select Override Cloud Storage Path.
    3. To set a specific source for importing files, define the path using the following variables in the Location text box:
      • *: matches non-separator characters.
      • **: matches characters, including separators. This can be used with a filename extension to match all files of the same type.
      • ?: matches 1 character.
  7. Click Import to import DICOM objects from the defined source.
  8. To track the status of the operation, click the Operations tab. After the operation completes, the following indications appear:
    • The Long-running operation status section has a green check mark under the OK heading.
    • The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
    If you encounter any errors, click Actions, and then click View details in Cloud Logging.

gcloud

To import DICOM objects from a Cloud Storage bucket, use the gcloud healthcare dicom-stores import gcs command. Specify the name of the parent dataset, the name of the DICOM store, and the location of the objects in the Cloud Storage bucket.

  • The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). It must be used at the end of a path with no other wildcards in the path. It can also be used with a filename extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.
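The wildcard rules above can be sketched as a small matcher. This is an illustrative translation into a regular expression, assuming a POSIX-style `/` separator; the function name is not part of any API:

```python
import re

def gcs_wildcard_to_regex(pattern: str) -> re.Pattern:
    """Translate the import wildcards described above into a regex.

    * matches 0 or more non-separator characters,
    ** matches 0 or more characters, including separators,
    ? matches exactly 1 character.
    """
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            out.append(".*")       # separators allowed
            i += 2
        elif pattern[i] == "*":
            out.append("[^/]*")    # stops at separators
            i += 1
        elif pattern[i] == "?":
            out.append(".")        # exactly one character
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(out) + r"\Z")

# Checks mirroring the examples above:
star = gcs_wildcard_to_regex("gs://BUCKET/DIRECTORY/Example*.dcm")
print(bool(star.match("gs://BUCKET/DIRECTORY/Example.dcm")))    # True
print(bool(star.match("gs://BUCKET/DIRECTORY/Example22.dcm")))  # True

deep = gcs_wildcard_to_regex("gs://BUCKET/DIRECTORY/**.dcm")
print(bool(deep.match("gs://BUCKET/DIRECTORY/sub/Image.dcm")))  # True
```

Note how `*` compiles to `[^/]*` while `**` compiles to `.*`, which is what makes `**` descend into subdirectories.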

The following sample shows how to import DICOM objects from a Cloud Storage bucket.

gcloud healthcare dicom-stores import gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/DICOM_INSTANCE.dcm

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud healthcare operations describe command and provide the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

When the command finishes, the response includes done: true.

done: true
metadata:
'@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ImportDicomData
counter:
  success: SUCCESSFUL_INSTANCES
  failure: FAILED_INSTANCES
createTime: "CREATE_TIME"
endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
'@type': "..."
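The OPERATION_ID needed by gcloud healthcare operations describe is the final segment of the operation name shown above. A minimal sketch of extracting it (the helper name is illustrative):

```python
def operation_id_from_name(name: str) -> str:
    """Extract OPERATION_ID from a fully qualified operation name,
    for example projects/.../datasets/.../operations/1234567890."""
    prefix = "/operations/"
    if prefix not in name:
        raise ValueError(f"not an operation name: {name!r}")
    return name.split(prefix, 1)[1]

name = "projects/my-project/locations/us-central1/datasets/my-dataset/operations/1234567890"
print(operation_id_from_name(name))  # 1234567890
```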

API

To import DICOM objects from a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.import method.

  • The location of the files within the bucket can vary and does not have to match the format specified in the following samples.
  • When specifying the location of the DICOM objects in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
    • Use * to match 0 or more non-separator characters. For example, gs://BUCKET/DIRECTORY/Example*.dcm matches Example.dcm and Example22.dcm in DIRECTORY.
    • Use ** to match 0 or more characters (including separators). It must be used at the end of a path with no other wildcards in the path. It can also be used with a filename extension (such as .dcm), which imports all files with that extension in the specified directory and its subdirectories. For example, gs://BUCKET/DIRECTORY/**.dcm imports all files with the .dcm filename extension in DIRECTORY and its subdirectories.
    • Use ? to match 1 character. For example, gs://BUCKET/DIRECTORY/Example?.dcm matches Example1.dcm but does not match Example.dcm or Example01.dcm.

REST

  1. Import the DICOM objects.

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: your Google Cloud project ID
    • LOCATION: the dataset location
    • DATASET_ID: the DICOM store's parent dataset
    • DICOM_STORE_ID: the DICOM store ID
    • BUCKET/PATH/TO/FILE: the path to the DICOM object in Cloud Storage

    Request JSON body:

    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      }
    }
    

    To send your request, choose one of these options:

    curl

    Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

    cat > request.json << 'EOF'
    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      }
    }
    EOF

    Then run the following command to send your REST request:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"

    PowerShell

    Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

    @'
    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      }
    }
    '@  | Out-File -FilePath request.json -Encoding utf8

    Then run the following command to send your REST request:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json" `
    -InFile request.json `
    -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content

    The output is the following. The response contains an identifier for a long-running operation (LRO). Long-running operations are returned when method calls might take a substantial amount of time to complete. Note the value of OPERATION_ID. You need this value in the next step.

  2. Get the status of the long-running operation.

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: your Google Cloud project ID
    • LOCATION: the dataset location
    • DATASET_ID: the DICOM store's parent dataset
    • OPERATION_ID: the ID returned from the long-running operation

    To send your request, choose one of these options:

    curl

    Run the following command:

    curl -X GET \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

    PowerShell

    Run the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method GET `
    -Headers $headers `
    -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

    If the long-running operation is still running, the server returns a response that contains the number of DICOM instances pending import. When the LRO finishes successfully, the server returns the status of the operation in JSON format:
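The two REST steps above, starting the import and then polling the operation until it reports done, can be sketched as a generic polling loop. Here fetch_operation stands in for whichever client call retrieves the operation; the helper itself is illustrative and not part of the Cloud Healthcare API:

```python
import time

def wait_for_operation(fetch_operation, poll_interval=1.0, timeout=300.0):
    """Poll a long-running operation until it reports done: true.

    fetch_operation: a no-argument callable returning the operation as a dict.
    Returns the final operation dict, or raises on failure or timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        op = fetch_operation()
        if op.get("done"):
            if "error" in op:
                raise RuntimeError(f"operation failed: {op['error']}")
            return op
        time.sleep(poll_interval)
    raise TimeoutError("operation did not complete in time")

# Usage with a stubbed fetch that finishes on the third poll:
states = iter([{"done": False}, {"done": False}, {"done": True, "response": {}}])
result = wait_for_operation(lambda: next(states), poll_interval=0.01)
print(result["done"])  # True
```

In practice fetch_operation would issue the GET request shown above (or the operations().get() call from the client-library samples below); production code should also back off between polls rather than use a fixed interval.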

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// importDICOMInstance imports DICOM objects from GCS.
func importDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, contentURI string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %w", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ImportDicomDataRequest{
		GcsSource: &healthcare.GoogleCloudHealthcareV1DicomGcsSource{
			Uri: contentURI,
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Import(name, req).Do()
	if err != nil {
		return fmt.Errorf("Import: %w", err)
	}

	fmt.Fprintf(w, "Import to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsSource;
import com.google.api.services.healthcare.v1.model.ImportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.Operation;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreImport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new GsonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreImport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/destination/dir"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store should be imported from.
    GoogleCloudHealthcareV1DicomGcsSource gcsSource =
        new GoogleCloudHealthcareV1DicomGcsSource().setUri(gcsUri);
    ImportDicomDataRequest importRequest = new ImportDicomDataRequest().setGcsSource(gcsSource);

    // Create request and configure any parameters.
    DicomStores.CloudHealthcareImport request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .healthcareImport(dicomStoreName, importRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store import complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredentials credential =
        GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          new HttpCredentialsAdapter(credential).initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const google = require('@googleapis/healthcare');
const healthcare = google.healthcare({
  version: 'v1',
  auth: new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  }),
});
const sleep = ms => {
  return new Promise(resolve => setTimeout(resolve, ms));
};

const importDicomInstance = async () => {
  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory/*.dcm'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      // The location of the DICOM instances in Cloud Storage
      gcsSource: {
        uri: `gs://${gcsUri}`,
      },
    },
  };

  const operation =
    await healthcare.projects.locations.datasets.dicomStores.import(request);
  const operationName = operation.data.name;

  const operationRequest = {name: operationName};

  // Wait fifteen seconds for the LRO to finish.
  await sleep(15000);

  // Check the LRO's status
  const operationStatus =
    await healthcare.projects.locations.datasets.operations.get(
      operationRequest
    );

  const {data} = operationStatus;

  if (data.error === undefined) {
    console.log('Successfully imported DICOM instances');
  } else {
    console.log('Encountered errors. Sample error:');
    console.log(
      'Resource on which error occurred:',
      data.error.details[0]['sampleErrors'][0]['resource']
    );
    console.log(
      'Error code:',
      data.error.details[0]['sampleErrors'][0]['error']['code']
    );
    console.log(
      'Error message:',
      data.error.details[0]['sampleErrors'][0]['error']['message']
    );
  }
};

importDicomInstance();

Python

def import_dicom_instance(
    project_id, location, dataset_id, dicom_store_id, content_uri
):
    """Imports data into the DICOM store by copying it from the specified
    source.

    See https://github.com/GoogleCloudPlatform/python-docs-samples/tree/main/healthcare/api-client/v1/dicom
    before running the sample."""
    # Imports the Google API Discovery Service.
    from googleapiclient import discovery

    api_version = "v1"
    service_name = "healthcare"
    # Returns an authorized API client by discovering the Healthcare API
    # and using GOOGLE_APPLICATION_CREDENTIALS environment variable.
    client = discovery.build(service_name, api_version)

    # TODO(developer): Uncomment these lines and replace with your values.
    # project_id = 'my-project'  # replace with your GCP project ID
    # location = 'us-central1'  # replace with the parent dataset's location
    # dataset_id = 'my-dataset'  # replace with the DICOM store's parent dataset ID
    # dicom_store_id = 'my-dicom-store'  # replace with the DICOM store's ID
    # content_uri = 'my-bucket/*.dcm'  # replace with a Cloud Storage bucket and DCM files
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, location, dataset_id
    )
    dicom_store_name = f"{dicom_store_parent}/dicomStores/{dicom_store_id}"

    body = {"gcsSource": {"uri": f"gs://{content_uri}"}}

    # Escape "import()" method keyword because "import"
    # is a reserved keyword in Python
    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .import_(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print(f"Imported DICOM instance: {content_uri}")

    return response

To retrieve a single instance or study from a DICOM store, retrieve the DICOM data using the Retrieve Transaction RESTful web service as implemented in the Cloud Healthcare API.

Specify a storage class to import DICOM objects (Preview)

By default, the projects.locations.datasets.dicomStores.import method imports DICOM objects to a DICOM store with a standard storage class. You can set the storage class when you import DICOM objects from Cloud Storage. For more information, see Change DICOM storage class.

The following samples show how to specify a storage class when you import DICOM objects from Cloud Storage.

REST

Use the projects.locations.datasets.dicomStores.import method.

  1. Import the DICOM objects.

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: your Google Cloud project ID
    • LOCATION: the dataset location
    • DATASET_ID: the DICOM store's parent dataset
    • DICOM_STORE_ID: the DICOM store ID
    • BUCKET/PATH/TO/FILE: the path to the DICOM object in Cloud Storage
    • STORAGE_CLASS: the storage class for the DICOM objects in the DICOM store: one of STANDARD, NEARLINE, COLDLINE, or ARCHIVE

    Request JSON body:

    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      },
      "blob_storage_settings": {
        "blob_storage_class": "STORAGE_CLASS"
      }
    }
    

    To send your request, choose one of these options:

    curl

    Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

    cat > request.json << 'EOF'
    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      },
      "blob_storage_settings": {
        "blob_storage_class": "STORAGE_CLASS"
      }
    }
    EOF

    Then run the following command to send your REST request:

    curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import"

    PowerShell

    Save the request body in a file named request.json. Run the following command in the terminal to create or overwrite this file in the current directory:

    @'
    {
      "gcsSource": {
        "uri": "gs://BUCKET/PATH/TO/FILE.dcm"
      },
      "blob_storage_settings": {
        "blob_storage_class": "STORAGE_CLASS"
      }
    }
    '@  | Out-File -FilePath request.json -Encoding utf8

    Then run the following command to send your REST request:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method POST `
    -Headers $headers `
    -ContentType: "application/json" `
    -InFile request.json `
    -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:import" | Select-Object -Expand Content

    The output is the following. The response contains an identifier for a long-running operation (LRO). Long-running operations are returned when method calls might take a substantial amount of time to complete. Note the value of OPERATION_ID. You need this value in the next step.

  2. Get the status of the long-running operation.

    Before using any of the request data, make the following replacements:

    • PROJECT_ID: your Google Cloud project ID
    • LOCATION: the dataset location
    • DATASET_ID: the DICOM store's parent dataset
    • OPERATION_ID: the ID returned from the long-running operation

    To send your request, choose one of these options:

    curl

    Run the following command:

    curl -X GET \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

    PowerShell

    Run the following command:

    $cred = gcloud auth print-access-token
    $headers = @{ "Authorization" = "Bearer $cred" }

    Invoke-WebRequest `
    -Method GET `
    -Headers $headers `
    -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

    If the long-running operation is still running, the server returns a response that contains the number of DICOM instances pending import. When the LRO finishes, the server returns the status of the operation in JSON format:
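The request body used in step 1 can be assembled programmatically. A minimal sketch, where the function name is illustrative and the accepted classes follow the STORAGE_CLASS list above:

```python
import json

# Storage classes accepted by the import request, per the list above.
VALID_STORAGE_CLASSES = {"STANDARD", "NEARLINE", "COLDLINE", "ARCHIVE"}

def build_import_body(gcs_uri, storage_class=None):
    """Build the JSON body for dicomStores.import, optionally
    setting a blob storage class for the imported objects."""
    body = {"gcsSource": {"uri": gcs_uri}}
    if storage_class is not None:
        if storage_class not in VALID_STORAGE_CLASSES:
            raise ValueError(f"unsupported storage class: {storage_class}")
        body["blob_storage_settings"] = {"blob_storage_class": storage_class}
    return body

print(json.dumps(build_import_body("gs://my-bucket/image.dcm", "ARCHIVE"), indent=2))
```

Serialized with json.dumps, the result matches the request.json body shown in the curl and PowerShell tabs above.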

Troubleshoot DICOM import requests

If errors occur during a DICOM import request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.

Export DICOM instances

The following samples show how to export DICOM instances to a Cloud Storage bucket. When you export DICOM instances from a DICOM store, all instances in the store are exported.

Console

To export DICOM instances to Cloud Storage, complete the following steps:

  1. In the Google Cloud console, go to the Datasets page.
    Go to Datasets
  2. Click the dataset that contains the DICOM store from which you are exporting DICOM instances.
  3. In the list of data stores, choose Export from the Actions list for the DICOM store.
  4. On the Export DICOM Store page that appears, select Google Cloud Storage Bucket.
  5. In the Project list, select a Cloud Storage project.
  6. In the Location list, select a Cloud Storage bucket.
  7. In DICOM Export Settings, select the file type used to export the DICOM instances. The following types are available:
    • DICOM file (.dcm)
    • octet-stream
    • Image (.jpg, .png)
  8. To define additional transfer syntax, choose a syntax from the Transfer Syntax list.
  9. Click Export to export DICOM instances to the defined location in Cloud Storage.
  10. To track the status of the operation, click the Operations tab. After the operation completes, the following indications appear:
    • The Long-running operation status section has a green check mark under the OK heading.
    • The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
    If you encounter any errors, click Actions, and then click View details in Cloud Logging.

gcloud

To export DICOM instances to a Cloud Storage bucket, use the gcloud healthcare dicom-stores export gcs command.

  • Provide the name of the parent dataset, the name of the DICOM store, and the destination Cloud Storage bucket.
  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each object.
  • If the command specifies a directory that does not exist, the directory is created.

The following sample shows the gcloud healthcare dicom-stores export gcs command.

gcloud healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://BUCKET/DIRECTORY

The command line displays the operation ID:

name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID

To view the status of the operation, run the gcloud healthcare operations describe command and provide the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

When the command finishes, the response includes done: true.

done: true
metadata:
'@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
apiMethodName: google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData
counter:
  success: SUCCESSFUL_INSTANCES
  failure: FAILED_INSTANCES
createTime: "CREATE_TIME"
endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
'@type': "..."

API

To export DICOM instances to a Cloud Storage bucket, use the projects.locations.datasets.dicomStores.export method.

  • Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one .dcm file for each DICOM object.
  • If the command specifies a directory that does not exist, the directory is created.

curl

To export DICOM instances, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"

If the request is successful, the server returns the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter":{
       "success": SUCCESSFUL_INSTANCES
       "failure": FAILED_INSTANCES
    }
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM instances, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request is successful, the server returns the response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. To track the status of the operation, use the Operation get method:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content

If the request is successful, the server returns the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter":{
       "success": SUCCESSFUL_INSTANCES
       "failure": FAILED_INSTANCES
    },
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	healthcare "google.golang.org/api/healthcare/v1"
)

// exportDICOMInstance exports DICOM objects to GCS.
func exportDICOMInstance(w io.Writer, projectID, location, datasetID, dicomStoreID, destination string) error {
	ctx := context.Background()

	healthcareService, err := healthcare.NewService(ctx)
	if err != nil {
		return fmt.Errorf("healthcare.NewService: %w", err)
	}

	storesService := healthcareService.Projects.Locations.Datasets.DicomStores

	req := &healthcare.ExportDicomDataRequest{
		GcsDestination: &healthcare.GoogleCloudHealthcareV1DicomGcsDestination{
			UriPrefix: destination, // "gs://my-bucket/path/to/prefix/"
		},
	}
	name := fmt.Sprintf("projects/%s/locations/%s/datasets/%s/dicomStores/%s", projectID, location, datasetID, dicomStoreID)

	lro, err := storesService.Export(name, req).Do()
	if err != nil {
		return fmt.Errorf("Export: %w", err)
	}

	fmt.Fprintf(w, "Export to DICOM store started. Operation: %q\n", lro.Name)
	return nil
}

Java

import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcare.Projects.Locations.Datasets.DicomStores;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.ExportDicomDataRequest;
import com.google.api.services.healthcare.v1.model.GoogleCloudHealthcareV1DicomGcsDestination;
import com.google.api.services.healthcare.v1.model.Operation;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Collections;

public class DicomStoreExport {
  private static final String DICOM_NAME = "projects/%s/locations/%s/datasets/%s/dicomStores/%s";
  private static final JsonFactory JSON_FACTORY = new GsonFactory();
  private static final NetHttpTransport HTTP_TRANSPORT = new NetHttpTransport();

  public static void dicomStoreExport(String dicomStoreName, String gcsUri) throws IOException {
    // String dicomStoreName =
    //    String.format(
    //        DICOM_NAME, "your-project-id", "your-region-id", "your-dataset-id", "your-dicom-id");
    // String gcsUri = "gs://your-bucket-id/path/to/destination/dir"

    // Initialize the client, which will be used to interact with the service.
    CloudHealthcare client = createClient();

    // Configure where the store will be exported to.
    GoogleCloudHealthcareV1DicomGcsDestination gcsDestination =
        new GoogleCloudHealthcareV1DicomGcsDestination().setUriPrefix(gcsUri);
    ExportDicomDataRequest exportRequest =
        new ExportDicomDataRequest().setGcsDestination(gcsDestination);

    // Create request and configure any parameters.
    DicomStores.Export request =
        client
            .projects()
            .locations()
            .datasets()
            .dicomStores()
            .export(dicomStoreName, exportRequest);

    // Execute the request, wait for the operation to complete, and process the results.
    try {
      Operation operation = request.execute();
      while (operation.getDone() == null || !operation.getDone()) {
        // Update the status of the operation with another request.
        Thread.sleep(500); // Pause for 500ms between requests.
        operation =
            client
                .projects()
                .locations()
                .datasets()
                .operations()
                .get(operation.getName())
                .execute();
      }
      System.out.println("DICOM store export complete." + operation.getResponse());
    } catch (Exception ex) {
      System.out.printf("Error during request execution: %s", ex.toString());
      ex.printStackTrace(System.out);
    }
  }

  private static CloudHealthcare createClient() throws IOException {
    // Use Application Default Credentials (ADC) to authenticate the requests
    // For more information see https://cloud.google.com/docs/authentication/production
    GoogleCredentials credential =
        GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singleton(CloudHealthcareScopes.CLOUD_PLATFORM));

    // Create a HttpRequestInitializer, which will provide a baseline configuration to all requests.
    HttpRequestInitializer requestInitializer =
        request -> {
          new HttpCredentialsAdapter(credential).initialize(request);
          request.setConnectTimeout(60000); // 1 minute connect timeout
          request.setReadTimeout(60000); // 1 minute read timeout
        };

    // Build the client for interacting with the service.
    return new CloudHealthcare.Builder(HTTP_TRANSPORT, JSON_FACTORY, requestInitializer)
        .setApplicationName("your-application-name")
        .build();
  }
}

Node.js

const google = require('@googleapis/healthcare');
const healthcare = google.healthcare({
  version: 'v1',
  auth: new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  }),
});

const exportDicomInstanceGcs = async () => {
  // TODO(developer): uncomment these lines before running the sample
  // const cloudRegion = 'us-central1';
  // const projectId = 'adjective-noun-123';
  // const datasetId = 'my-dataset';
  // const dicomStoreId = 'my-dicom-store';
  // const gcsUri = 'my-bucket/my-directory'
  const name = `projects/${projectId}/locations/${cloudRegion}/datasets/${datasetId}/dicomStores/${dicomStoreId}`;
  const request = {
    name,
    resource: {
      gcsDestination: {
        // The destination location of the DICOM instances in Cloud Storage
        uriPrefix: `gs://${gcsUri}`,
        // The format to use for the output files, per the MIME types supported in the DICOM spec
        mimeType: 'application/dicom',
      },
    },
  };

  await healthcare.projects.locations.datasets.dicomStores.export(request);
  console.log(`Exported DICOM instances to ${gcsUri}`);
};

exportDicomInstanceGcs();

Python

def export_dicom_instance(project_id, location, dataset_id, dicom_store_id, uri_prefix):
    """Export data to a Google Cloud Storage bucket by copying
    it from the DICOM store.

    See https://github.com/GoogleCloudPlatform/python-docs-samples/tree/main/healthcare/api-client/v1/dicom
    before running the sample."""
    # Imports the Google API Discovery Service.
    from googleapiclient import discovery

    api_version = "v1"
    service_name = "healthcare"
    # Returns an authorized API client by discovering the Healthcare API
    # and using GOOGLE_APPLICATION_CREDENTIALS environment variable.
    client = discovery.build(service_name, api_version)

    # TODO(developer): Uncomment these lines and replace with your values.
    # project_id = 'my-project'  # replace with your GCP project ID
    # location = 'us-central1'  # replace with the parent dataset's location
    # dataset_id = 'my-dataset'  # replace with the DICOM store's parent dataset ID
    # dicom_store_id = 'my-dicom-store'  # replace with the DICOM store's ID
    # uri_prefix = 'my-bucket'  # replace with a Cloud Storage bucket
    dicom_store_parent = "projects/{}/locations/{}/datasets/{}".format(
        project_id, location, dataset_id
    )
    dicom_store_name = f"{dicom_store_parent}/dicomStores/{dicom_store_id}"

    body = {"gcsDestination": {"uriPrefix": f"gs://{uri_prefix}"}}

    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .export(name=dicom_store_name, body=body)
    )

    response = request.execute()
    print(f"Exported DICOM instances to bucket: gs://{uri_prefix}")

    return response

Export DICOM instances using filters

By default, when you export DICOM files to Cloud Storage, all of the DICOM files in the DICOM store are exported. Similarly, when you export DICOM metadata to BigQuery, the metadata for all of the DICOM data in the DICOM store is exported.

You can export a subset of DICOM data or metadata using a filter file.

Configure a filter file

  • Each line in the filter file defines the study, series, or instance and uses the format /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID/instances/INSTANCE_UID.
  • You can truncate a line to specify the level at which the filter works. For example, you can select an entire study by specifying /studies/STUDY_INSTANCE_UID, or you can select an entire series by specifying /studies/STUDY_INSTANCE_UID/series/SERIES_INSTANCE_UID.

Consider the following filter file:

/studies/1.123.456.789
/studies/1.666.333.111/series/123.456
/studies/1.666.333.111/series/567.890
/studies/1.888.999.222/series/123.456/instances/111
/studies/1.888.999.222/series/123.456/instances/222
/studies/1.888.999.222/series/123.456/instances/333

This sample filter file applies to the following:

  • The entire study with the study instance UID 1.123.456.789
  • Two separate series with series instance UIDs 123.456 and 567.890 in study 1.666.333.111
  • Three individual instances with instance IDs 111, 222, and 333 in study 1.888.999.222 and series 123.456
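The format rules above can be captured in a small helper. The following is a minimal Python sketch; the function name `make_filter_lines` is illustrative and not part of any Cloud Healthcare API:

```python
def make_filter_lines(studies=(), series=(), instances=()):
    """Build filter-file lines at the study, series, and instance level.

    studies:   iterable of study instance UIDs
    series:    iterable of (study_uid, series_uid) pairs
    instances: iterable of (study_uid, series_uid, instance_uid) triples
    """
    lines = [f"/studies/{s}" for s in studies]
    lines += [f"/studies/{st}/series/{se}" for st, se in series]
    lines += [
        f"/studies/{st}/series/{se}/instances/{i}"
        for st, se, i in instances
    ]
    return lines


# Reproduces the sample filter file shown above.
lines = make_filter_lines(
    studies=["1.123.456.789"],
    series=[("1.666.333.111", "123.456"), ("1.666.333.111", "567.890")],
    instances=[
        ("1.888.999.222", "123.456", "111"),
        ("1.888.999.222", "123.456", "222"),
        ("1.888.999.222", "123.456", "333"),
    ],
)
print("\n".join(lines))
```

Writing the returned lines to a text file, one per line, produces a filter file in the required format.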

Create a filter file using BigQuery

To create a filter file using BigQuery, you must first export the metadata of your DICOM store to BigQuery. The exported metadata shows you the study, series, and instance UIDs of the DICOM data in your DICOM store.

After exporting the metadata, complete the following steps:

  1. Run a query to return the UIDs of the studies, series, and instances you want to add to the filter file.

    For example, the following query shows how to concatenate the study, series, and instance UIDs to match the filter file format requirements:

    SELECT CONCAT
        ('/studies/', StudyInstanceUID, '/series/', SeriesInstanceUID, '/instances/', SOPInstanceUID)
    FROM
        [PROJECT_ID:BIGQUERY_DATASET.BIGQUERY_TABLE]
  2. Optional: If the query returns a large result set that exceeds the maximum response size, save the query results to a new destination table in BigQuery.

  3. Save the query results to a file and export it to Cloud Storage. If you saved your query results to a new destination table in step 2, see Exporting table data for how to export the table's contents to Cloud Storage.

  4. Edit the exported file as necessary, and include it in your DICOM export request.
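The steps above can be sketched in Python. This is a hedged example: the table name `my-project.my_dataset.dicom_metadata` is hypothetical, it assumes the metadata was already exported to BigQuery, and it uses the google-cloud-bigquery client with standard SQL rather than the legacy-SQL bracket syntax shown in the query above:

```python
def build_filter_query(table: str) -> str:
    """Standard-SQL form of the query above; `table` is a fully
    qualified `project.dataset.table` name."""
    return (
        "SELECT CONCAT('/studies/', StudyInstanceUID, "
        "'/series/', SeriesInstanceUID, "
        "'/instances/', SOPInstanceUID) AS path "
        f"FROM `{table}`"
    )


def write_filter_file(table: str, out_path: str = "filter_file.txt") -> None:
    """Run the query and write one resource path per line.

    Requires the google-cloud-bigquery package and application
    default credentials; it cannot run without a GCP project.
    """
    from google.cloud import bigquery

    client = bigquery.Client()
    with open(out_path, "w") as f:
        for row in client.query(build_filter_query(table)).result():
            f.write(row.path + "\n")


print(build_filter_query("my-project.my_dataset.dicom_metadata"))
```

The resulting `filter_file.txt` can then be uploaded to Cloud Storage as described in the following section.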

Create a filter file manually

To create a filter file manually, do the following:

  1. Create a filter file containing the DICOM objects you want to filter on.
  2. Upload the filter file to Cloud Storage. For instructions, see Uploading objects from a file system.

Pass in the filter file

After you create a filter file, call the DICOM export operation and pass in the filter file using the REST API. The following samples show how to export DICOM data using a filter.

gcloud

To export DICOM data to Cloud Storage using a filter, use the gcloud beta healthcare dicom-stores export gcs command:

gcloud beta healthcare dicom-stores export gcs DICOM_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri-prefix=gs://DESTINATION_BUCKET/DIRECTORY \
  --filter-config-gcs-uri=gs://BUCKET/DIRECTORY/FILTER_FILE

Replace the following:

  • DICOM_STORE_ID: the ID of the DICOM store
  • DATASET_ID: the name of the DICOM store's parent dataset
  • LOCATION: the location of the DICOM store's parent dataset
  • DESTINATION_BUCKET/DIRECTORY: the destination Cloud Storage bucket
  • BUCKET/DIRECTORY/FILTER_FILE: the location of the filter file in a Cloud Storage bucket

The output is the following:

Request issued for: [DICOM_STORE_ID]
Waiting for operation [projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID

To view the status of the operation, run the gcloud healthcare operations describe command and provide the OPERATION_ID from the response:

gcloud healthcare operations describe OPERATION_ID \
  --location=LOCATION \
  --dataset=DATASET_ID

Replace the following:

  • OPERATION_ID: the ID number returned from the previous response
  • DATASET_ID: the name of the DICOM store's parent dataset
  • LOCATION: the location of the DICOM store's parent dataset

The output is the following:

done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData
  counter:
    success: SUCCESSFUL_INSTANCES
    failure: FAILED_INSTANCES
  createTime: 'CREATE_TIME'
  endTime: 'END_TIME'
  logsUrl: 'https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL'
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': '...'

API

To export DICOM data using a filter, use the projects.locations.datasets.dicomStores.export method.

curl

To export DICOM data using a filter file, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket
  • The location of the filter file in a Cloud Storage bucket

The following sample shows a POST request using curl.

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      },
      'filterConfig': {
        'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
      }
    }" "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export"

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. Use the Operation get method to track the status of the operation:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME"

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}

PowerShell

To export DICOM data using a filter file, make a POST request and provide the following information:

  • The name and location of the parent dataset
  • The name of the DICOM store
  • The destination Cloud Storage bucket
  • The location of the filter file in a Cloud Storage bucket

The following sample shows a POST request using Windows PowerShell.

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    },
    'filterConfig': {
      'resourcePathsGcsUri': 'gs://BUCKET/DIRECTORY/FILTER_FILE'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/dicomStores/DICOM_STORE_ID:export" | Select-Object -Expand Content

If the request is successful, the server returns the following response in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID"
}

The response contains an operation name. Use the Operation get method to track the status of the operation:

$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }

Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_NAME" | Select-Object -Expand Content

If the request is successful, the server returns the following response containing the status of the operation in JSON format:

{
  "name": "projects/PROJECT_ID/locations/REGION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1beta1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1beta1.dicom.DicomService.ExportDicomData",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME"
  },
  "done": true,
  "response": {
    "@type": "..."
  }
}
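The same filtered export can also be issued from Python, mirroring the discovery-client sample earlier on this page. This is a hedged sketch: the helper names are illustrative, and it assumes the v1beta1 API surface and the `filterConfig`/`resourcePathsGcsUri` body shown in the curl and PowerShell examples above:

```python
def build_filtered_export_body(uri_prefix: str, filter_uri: str) -> dict:
    """Request body combining gcsDestination with filterConfig,
    matching the curl and PowerShell examples above."""
    return {
        "gcsDestination": {"uriPrefix": uri_prefix},
        "filterConfig": {"resourcePathsGcsUri": filter_uri},
    }


def export_dicom_with_filter(project_id, location, dataset_id,
                             dicom_store_id, uri_prefix, filter_uri):
    """Call dicomStores.export on the v1beta1 API. Requires the
    google-api-python-client package and the
    GOOGLE_APPLICATION_CREDENTIALS environment variable."""
    from googleapiclient import discovery

    client = discovery.build("healthcare", "v1beta1")
    name = (
        f"projects/{project_id}/locations/{location}"
        f"/datasets/{dataset_id}/dicomStores/{dicom_store_id}"
    )
    request = (
        client.projects()
        .locations()
        .datasets()
        .dicomStores()
        .export(name=name,
                body=build_filtered_export_body(uri_prefix, filter_uri))
    )
    return request.execute()
```

As with the REST samples, the returned response contains an operation name that can be polled with the Operation get method.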

Troubleshooting DICOM export requests

If errors occur during a DICOM export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.

If the entire operation returns an error, see Troubleshooting long-running operations.