Method: fhirStores.export

Full name: projects.locations.datasets.fhirStores.export

Export resources from the FHIR store to the specified destination.

This method returns an Operation that can be used to track the status of the export by calling operations.get.

Immediate fatal errors appear in the error field; errors are also logged to Cloud Logging (see Viewing logs). Otherwise, when the operation finishes, a detailed response of type ExportResourcesResponse is returned in the response field. The metadata field type for this operation is OperationMetadata.
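Client code typically polls operations.get until the Operation's done field is true, then inspects error or response. A minimal polling sketch, assuming a caller-supplied `fetch_operation` callable (hypothetical) that performs the operations.get call and returns the Operation as a dict:

```python
import time

def wait_for_operation(fetch_operation, name, poll_interval=1.0, max_polls=60):
    """Poll a long-running operation until it finishes.

    fetch_operation(name) is a placeholder for a call to operations.get;
    it must return the Operation as a dict.
    """
    for _ in range(max_polls):
        op = fetch_operation(name)
        if op.get("done"):
            if "error" in op:
                # Immediate fatal errors surface here.
                raise RuntimeError(f"Export failed: {op['error']}")
            # op["response"] holds an ExportResourcesResponse on success.
            return op
        time.sleep(poll_interval)
    raise TimeoutError(f"Operation {name} did not finish in time")
```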

HTTP request

POST https://healthcare.googleapis.com/v1/{name=projects/*/locations/*/datasets/*/fhirStores/*}:export
The URL uses gRPC Transcoding syntax.
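Under gRPC transcoding, the FHIR store's full resource name is spliced into the URL path and the custom verb :export is appended. A small helper illustrating the composition, assuming the v1 REST endpoint at healthcare.googleapis.com:

```python
def export_url(fhir_store_name: str) -> str:
    """Compose the transcoded REST URL for fhirStores.export.

    fhir_store_name must look like:
    projects/{projectId}/locations/{locationId}/datasets/{datasetId}/fhirStores/{fhirStoreId}
    """
    return f"https://healthcare.googleapis.com/v1/{fhir_store_name}:export"
```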

Path parameters

name
string

The name of the FHIR store to export resources from, in the format projects/{projectId}/locations/{locationId}/datasets/{datasetId}/fhirStores/{fhirStoreId}.

Authorization requires the following IAM permission on the specified resource name:

  • healthcare.fhirStores.export

Request body

The request body contains data with the following structure:

JSON representation

  {
    // Union field destination can be only one of the following:
    "gcsDestination": {
      object (GcsDestination)
    },
    "bigqueryDestination": {
      object (BigQueryDestination)
    }
    // End of list of possible types for union field destination.
  }

Union field destination. The output destination of the export.

To enable the Cloud Healthcare API to write to resources in your project, such as Cloud Storage buckets, you must give the Cloud Healthcare API service account the proper permissions. The service account is service-{PROJECT_NUMBER}, where PROJECT_NUMBER identifies the project that contains the source FHIR store. To find the project number, go to the Cloud Console Dashboard.

destination can be only one of the following:
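Because destination is a union field, a request body must set exactly one of gcsDestination or bigqueryDestination. A sketch of building such a body as a plain dict; the BigQuery field name datasetUri used here is an assumption, not confirmed by this page:

```python
def export_request_body(gcs_uri_prefix=None, bigquery_dataset_uri=None):
    """Build an export request body with exactly one destination set.

    The "datasetUri" field name below is an assumption for illustration.
    """
    if (gcs_uri_prefix is None) == (bigquery_dataset_uri is None):
        raise ValueError("Set exactly one of gcs_uri_prefix or bigquery_dataset_uri")
    if gcs_uri_prefix is not None:
        return {"gcsDestination": {"uriPrefix": gcs_uri_prefix}}
    return {"bigqueryDestination": {"datasetUri": bigquery_dataset_uri}}
```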



gcsDestination
object (GcsDestination)

The Cloud Storage output destination.

The Cloud Storage location requires the roles/storage.objectAdmin Cloud IAM role.

The exported outputs are organized by FHIR resource types. The server creates one object per resource type. Each object contains newline delimited JSON, and each line is a FHIR resource.
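Each exported object can therefore be read line by line, parsing one FHIR resource per line. A minimal reader for the contents of one such object:

```python
import json

def read_ndjson_resources(ndjson_text: str):
    """Parse newline-delimited JSON: each non-empty line is one FHIR resource."""
    return [json.loads(line) for line in ndjson_text.splitlines() if line.strip()]
```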



The BigQuery output destination.

The BigQuery location requires two IAM roles: roles/bigquery.dataEditor and roles/bigquery.jobUser.

The output is one BigQuery table per resource type.
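The per-resource-type layout (one BigQuery table per type, one Cloud Storage object per type) can be mirrored client-side when post-processing exported resources. A sketch of that grouping:

```python
from collections import defaultdict

def group_by_resource_type(resources):
    """Bucket FHIR resources by resourceType, mirroring the export layout."""
    groups = defaultdict(list)
    for resource in resources:
        groups[resource["resourceType"]].append(resource)
    return dict(groups)
```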

Response body

If successful, the response body contains an instance of Operation.

Authorization Scopes

Requires one of the following OAuth scopes:

  • https://www.googleapis.com/auth/cloud-healthcare
  • https://www.googleapis.com/auth/cloud-platform
For more information, see the Authentication Overview.


GcsDestination

The configuration for exporting to Cloud Storage.

JSON representation

  {
    "uriPrefix": string
  }


uriPrefix
string

URI for a Cloud Storage directory where result files should be written, in the format gs://{bucket-id}/{path/to/destination/dir}. If there is no trailing slash, the service appends one when composing the object path. The user is responsible for creating the Cloud Storage bucket referenced in uriPrefix.
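The trailing-slash behavior described above can be sketched as follows; the exact object naming beyond the prefix is chosen by the service, so only the normalization step is shown:

```python
def normalized_prefix(uri_prefix: str) -> str:
    """Append a trailing slash to uriPrefix if missing, as the service does
    when composing the object path."""
    return uri_prefix if uri_prefix.endswith("/") else uri_prefix + "/"
```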