Method: dicomStores.export

Full name: projects.locations.datasets.dicomStores.export

Exports data to the specified destination by copying it from the DICOM store. The metadata field type is OperationMetadata.

HTTP request

POST https://healthcare.googleapis.com/v1alpha2/{name=projects/*/locations/*/datasets/*/dicomStores/*}:export

The URL uses gRPC Transcoding syntax.

Path parameters

name (string)

The DICOM store resource name from which the data should be exported (e.g., projects/{projectId}/locations/{locationId}/datasets/{datasetId}/dicomStores/{dicomStoreId}).

Authorization requires the following Google IAM permission on the specified resource name:

  • healthcare.dicomStores.export

Request body

The request body contains data with the following structure:

JSON representation
{

  // Union field destination can be only one of the following:
  "gcsDestination": {
    object(GcsDestination)
  },
  "bigqueryDestination": {
    object(BigQueryDestination)
  }
  // End of list of possible types for union field destination.
}
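
For concreteness, the following is a minimal sketch of issuing the export request above from Python with the requests library, using the gcsDestination variant of the destination union field described under Fields below. The project, location, dataset, DICOM store, and bucket names are placeholders, and the access token is assumed to be obtained separately (for example via Application Default Credentials or gcloud auth print-access-token).

import requests

# Placeholder resource names -- substitute your own project, location,
# dataset, DICOM store, and Cloud Storage bucket.
name = (
    "projects/my-project/locations/us-central1/"
    "datasets/my-dataset/dicomStores/my-dicom-store"
)
base_url = "https://healthcare.googleapis.com/v1alpha2"

# Access token obtained elsewhere, e.g. `gcloud auth print-access-token`.
access_token = "ya29.EXAMPLE_TOKEN"

# Request body using the gcsDestination member of the destination union field.
body = {
    "gcsDestination": {
        "uriPrefix": "gs://my-bucket/dicom-export/",
        "mimeType": "application/dicom",
    }
}

response = requests.post(
    f"{base_url}/{name}:export",
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    json=body,
)
response.raise_for_status()

# The response is a long-running Operation; its name can be polled later.
operation = response.json()
print(operation["name"])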
Fields

Union field destination. Specifies the destination of the output.

To enable the Cloud Healthcare API to write to resources in your project (for example, Cloud Storage buckets), you must grant the Cloud Healthcare API service account the appropriate permissions. The service account is service-{PROJECT_NUMBER}@gcp-sa-healthcare.iam.gserviceaccount.com, where PROJECT_NUMBER identifies the project that contains the destination Cloud Storage bucket. To find the project number, go to the GCP Console Dashboard. (A sketch of granting this access appears after this field list.) destination can be only one of the following:

gcsDestination (object(GcsDestination))

The Cloud Storage output destination.

The Cloud Storage location requires the roles/storage.objectAdmin Cloud IAM role.

bigqueryDestination (object(BigQueryDestination))

The BigQuery output destination.

You can only export to a BigQuery dataset that is in the same project as the DICOM store you're exporting from.

The BigQuery location requires two IAM roles: roles/bigquery.dataEditor and roles/bigquery.jobUser.
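
Before an export to Cloud Storage can succeed, the Cloud Healthcare API service account described above must be able to write to the destination bucket. The following is a minimal sketch, using the google-cloud-storage Python client, of granting roles/storage.objectAdmin on the destination bucket to that service account; the bucket name and project number are placeholders you would replace with your own.

from google.cloud import storage

# Placeholder bucket and project number -- substitute your own values.
bucket_name = "my-bucket"
project_number = "123456789012"
healthcare_sa = (
    f"serviceAccount:service-{project_number}"
    "@gcp-sa-healthcare.iam.gserviceaccount.com"
)

client = storage.Client()
bucket = client.bucket(bucket_name)

# Read the current IAM policy, add a binding for the Healthcare service
# account, and write the policy back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectAdmin", "members": {healthcare_sa}}
)
bucket.set_iam_policy(policy)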

Response body

If successful, the response body contains an instance of Operation.
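
The export runs as a long-running operation, so callers typically poll the returned Operation until done is true. The sketch below assumes the operation name returned by the export call and an access token obtained as in the earlier examples, and assumes the operation can be read with a GET on its resource name (the standard operations.get pattern).

import time
import requests

base_url = "https://healthcare.googleapis.com/v1alpha2"
access_token = "ya29.EXAMPLE_TOKEN"  # placeholder

# Operation name as returned by the export call (placeholder shown here).
operation_name = (
    "projects/my-project/locations/us-central1/"
    "datasets/my-dataset/operations/1234567890"
)

while True:
    op = requests.get(
        f"{base_url}/{operation_name}",
        headers={"Authorization": f"Bearer {access_token}"},
    ).json()
    if op.get("done"):
        break
    time.sleep(10)  # poll at a modest interval

if "error" in op:
    raise RuntimeError(op["error"])
print("Export finished")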

Authorization Scopes

Requires one of the following OAuth scopes:

  • https://www.googleapis.com/auth/cloud-healthcare
  • https://www.googleapis.com/auth/cloud-platform

For more information, see the Authentication Overview.
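
As a sketch of obtaining a token with one of these scopes in Python, the google-auth library can refresh Application Default Credentials; the resulting token is what the requests examples above assume in their Authorization header.

import google.auth
import google.auth.transport.requests

# Application Default Credentials scoped to cloud-platform.
credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(google.auth.transport.requests.Request())

access_token = credentials.token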

GcsDestination

The Cloud Storage location where the output should be written, and the export configuration.

JSON representation
{
  "uriPrefix": string,
  "mimeType": string
}
Fields

uriPrefix (string)

The Cloud Storage destination to export to.

URI for a Cloud Storage directory where result files should be written, in the format gs://{bucket-id}/{path/to/destination/dir}. If there is no trailing slash, the service appends one when composing the object path. The user is responsible for creating the Cloud Storage bucket referenced in uriPrefix.

mimeType (string)

The MIME type in which to export each instance; it must be a MIME type supported by the DICOM specification. Each file is written in the format .../{study_id}/{series_id}/{instance_id}[/{frame_number}].{extension}, where the frame_number component exists only for multi-frame instances.

Refer to the DICOM conformance statement for permissible MIME types: https://cloud.google.com/healthcare/docs/dicom#wado-rs

The following extensions are used for output files:

  • application/dicom -> .dcm
  • image/jpeg -> .jpg
  • image/png -> .png

If unspecified, the instances are exported in their original DICOM format.
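
To make uriPrefix and mimeType concrete, here is an illustrative GcsDestination written as a Python dict mirroring the JSON representation above; the bucket and path are placeholders. With image/jpeg, each exported instance would be written under the prefix with a .jpg extension, per the mapping above.

# A GcsDestination asking for JPEG renditions; bucket and path are placeholders.
gcs_destination = {
    "uriPrefix": "gs://my-bucket/dicom-export/",  # trailing slash optional
    "mimeType": "image/jpeg",                     # exported files get .jpg
}

# Used as the gcsDestination member of the export request body:
export_body = {"gcsDestination": gcs_destination}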

BigQueryDestination

The BigQuery table where the output should be written.

JSON representation
{
  "tableUri": string,
  "force": boolean
}
Fields

tableUri (string)

BigQuery URI to a table, up to 2000 characters long, in the format bq://projectId.bqDatasetId.tableId.

force (boolean)

If the destination table already exists and this flag is true, the table is overwritten by the contents of the DICOM store. If the flag is not set and the destination table already exists, the export call returns an error.
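
For comparison, an illustrative BigQueryDestination as a Python dict; the project, dataset, and table IDs are placeholders, and the BigQuery dataset must be in the same project as the DICOM store being exported.

# A BigQueryDestination; project, dataset, and table IDs are placeholders.
bigquery_destination = {
    "tableUri": "bq://my-project.my_bq_dataset.dicom_metadata",
    "force": True,  # overwrite the table if it already exists
}

export_body = {"bigqueryDestination": bigquery_destination}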
