This page explains how to export and import FHIR resources to and from Cloud Storage using the `projects.locations.datasets.fhirStores.import` and `projects.locations.datasets.fhirStores.export` methods.
Depending on the format of your FHIR data, you can load data into a FHIR store using either the `projects.locations.datasets.fhirStores.import` method or the `projects.locations.datasets.fhirStores.fhir.executeBundle` method. For guidance on choosing a method, see FHIR import.
Setting Cloud Storage permissions
Before exporting and importing FHIR resources to and from Cloud Storage, you must grant additional permissions to the Cloud Healthcare Service Agent service account. For more information, see FHIR store Cloud Storage permissions.
Generating simulated patient data
Synthea™ is a simulator for generating patient population data. If you aren't using Synthea™ to generate patient population data, skip to Importing FHIR resources or Exporting FHIR resources.
You can only import data in the version your FHIR store is configured to accept.
To download and install Synthea™, complete the following steps:
Clone the Synthea™ tool repository from GitHub:

```
git clone https://github.com/synthetichealth/synthea.git
```

Complete the installation steps.
Continue to one of the following sections to generate data for a specific FHIR version:
Generating simulated patient data for R4
By default, generated Synthea™ data uses the FHIR R4 JSON representation for resources. To generate Synthea™ FHIR R4 data and import it into a Cloud Healthcare API FHIR store, complete the following steps:
Follow the instructions to generate synthetic patient data. The generated data outputs to the `synthea/output/fhir_r4` directory for FHIR R4.

Copy the generated data to a Cloud Storage bucket so that you can import it into a Cloud Healthcare API FHIR store. For example, to copy the data to a directory called `synthea-data` in an existing Cloud Storage bucket, run the following `gcloud storage cp` command from the `synthea` directory:

```
gcloud storage cp output/fhir_r4/* gs://BUCKET/synthea-data
```

Follow the instructions for importing FHIR resources.
Generating simulated patient data for DSTU2 or STU3
To generate Synthea™ FHIR DSTU2 or STU3 data and import it into a Cloud Healthcare API FHIR store, complete the following steps:
From the `synthea` directory, use a text editor to open the `src/main/resources/synthea.properties` file and then make the following changes, depending on whether you are generating DSTU2 or STU3 data.

To generate FHIR STU3 data:

- Set all `*.fhir.export` and `*.fhir_dstu2.export` values to `false`
- Set all `*.fhir_stu3.export` values to `true`

To generate FHIR DSTU2 data:

- Set all `*.fhir.export` and `*.fhir_stu3.export` values to `false`
- Set all `*.fhir_dstu2.export` values to `true`

For example, to generate FHIR STU3 data:

```
exporter.fhir.export = false
exporter.fhir_stu3.export = true
exporter.fhir_dstu2.export = false
exporter.hospital.fhir.export = false
exporter.hospital.fhir_stu3.export = true
exporter.hospital.fhir_dstu2.export = false
exporter.practitioner.fhir.export = false
exporter.practitioner.fhir_stu3.export = true
exporter.practitioner.fhir_dstu2.export = false
```
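The flag edits above are mechanical, so they can also be scripted. The sketch below is a hypothetical helper, not part of Synthea™; it assumes the exporter key names shown in the example and rewrites them so only one FHIR version is exported:

```python
def set_fhir_version(properties_text, version):
    """Rewrite Synthea exporter flags so only one FHIR version is exported.

    version: 'r4' (the *.fhir.export keys), 'stu3', or 'dstu2'.
    """
    suffix = {"r4": "fhir", "stu3": "fhir_stu3", "dstu2": "fhir_dstu2"}[version]
    out = []
    for line in properties_text.splitlines():
        key = line.split("=")[0].strip()
        if key.endswith((".fhir.export", ".fhir_stu3.export", ".fhir_dstu2.export")):
            # Enable only the keys for the requested version; disable the rest.
            value = "true" if key.endswith(f".{suffix}.export") else "false"
            line = f"{key} = {value}"
        out.append(line)
    return "\n".join(out)
```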
Follow the instructions to generate synthetic patient data. The generated data outputs to the `synthea/output/fhir_stu3` directory for FHIR STU3 or the `synthea/output/fhir_dstu2` directory for FHIR DSTU2.

Copy the generated data to a Cloud Storage bucket so that you can import it into a Cloud Healthcare API FHIR store. For example, to copy the data to a directory called `synthea-data` in an existing Cloud Storage bucket, run the following `gcloud storage cp` command from the `synthea` directory:

```
gcloud storage cp output/fhir_stu3/* gs://BUCKET/synthea-data
```

Follow the instructions for importing FHIR resources.
Importing FHIR resources
When configuring the body of the import request, set `ContentStructure` to one of the following values:

- `CONTENT_STRUCTURE_UNSPECIFIED`
- `BUNDLE`: The source file contains one or more lines of newline-delimited JSON (ndjson). Each line is a bundle, which contains one or more resources. If you don't specify `ContentStructure`, it defaults to `BUNDLE`.
- `RESOURCE`: The source file contains one or more lines of newline-delimited JSON (ndjson). Each line is a single resource.
- `BUNDLE_PRETTY`: The entire source file is one JSON bundle. The JSON can span multiple lines.
- `RESOURCE_PRETTY`: The entire source file is one JSON resource. The JSON can span multiple lines.
For example, suppose that you are importing a file named `resources.ndjson` with the following content:

```
{"class":{"code":"IMP","display":"inpatient encounter","system":"http://hl7.org/fhir/v3/ActCode"},"id":"6090e773-3e91-40a7-8fce-1e22f6774c29","reason":[{"text":"The patient had an abnormal heart rate. She was concerned about this."}],"resourceType":"Encounter","status":"finished","subject":{"reference":"Patient/2938bb9e-1f16-429e-8d44-9508ab0e4151"}}
{"class":{"code":"IMP","display":"inpatient encounter","system":"http://hl7.org/fhir/v3/ActCode"},"id":"7101f884-4f02-51b8-9gdf-2f33g7885d30","reason":[{"text":"The patient was experiencing recurrent fevers."}],"resourceType":"Encounter","status":"finished","subject":{"reference":"Patient/3049cc0f-2g27-530f-9e55-0619bc1f5262"}}
{"birthDate":"1970-01-01","gender":"female","id":"2938bb9e-1f16-429e-8d44-9508ab0e4151","name":[{"family":"Smith","given":["Darcy"],"use":"official"}],"resourceType":"Patient"}
```

The file contains two Encounter resources and one Patient resource. Each resource is on a separate line, so you set `ContentStructure` to `RESOURCE`.
Your data might import incorrectly or not import at all if `ContentStructure` does not match the format of your data. For example, the sample file above will not import correctly unless `ContentStructure` is set to `RESOURCE` in the import request.
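If you're unsure which value matches a given source file, you can apply the definitions above mechanically: ndjson files parse line by line, while pretty-printed files only parse as a whole. The following is an illustrative heuristic sketch, not an API feature, and it assumes well-formed input:

```python
import json

def guess_content_structure(text):
    """Guess the ContentStructure value for a source file's text content."""
    lines = [line for line in text.splitlines() if line.strip()]
    try:
        # If every line parses on its own, the file is ndjson.
        docs = [json.loads(line) for line in lines]
        multi_line = False
    except json.JSONDecodeError:
        # Otherwise treat the whole file as one pretty-printed JSON document.
        docs = [json.loads(text)]
        multi_line = True
    is_bundle = docs[0].get("resourceType") == "Bundle"
    if multi_line:
        return "BUNDLE_PRETTY" if is_bundle else "RESOURCE_PRETTY"
    return "BUNDLE" if is_bundle else "RESOURCE"
```

For the `resources.ndjson` sample above, where each line is a single resource, this returns `RESOURCE`.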
The following samples show how to import FHIR resources from a Cloud Storage bucket.
Console
To import FHIR resources from a Cloud Storage bucket, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
- Click the dataset that contains the FHIR store to which you are importing FHIR resources.
- In the list of data stores, choose Import from the Actions list for the FHIR store. The Import to FHIR store page appears.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- In FHIR import settings, select the relevant content structure.
- Click Import to import FHIR resources.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To import FHIR resources into a FHIR store, use the `gcloud healthcare fhir-stores import gcs` command. Specify the following information:
- The name of the parent dataset
- The name of the FHIR store
- The location of the object in a Cloud Storage bucket. The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following sample. When specifying the location of the FHIR resources in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
  - Use `*` to match 0 or more non-separator characters. For example, `gs://BUCKET/DIRECTORY/Example*.ndjson` matches Example.ndjson and Example22.ndjson in DIRECTORY.
  - Use `**` to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .ndjson), which imports all files with the filename extension in the specified directory and its subdirectories. For example, `gs://BUCKET/DIRECTORY/**.ndjson` imports all files with the .ndjson filename extension in DIRECTORY and its subdirectories.
  - Use `?` to match 1 character. For example, `gs://BUCKET/DIRECTORY/Example?.ndjson` matches Example1.ndjson but does not match Example.ndjson or Example01.ndjson.
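The three wildcard rules above can be expressed as a regular expression, which is a handy way to predict which objects a given URI will match before running an import. This is only an illustrative sketch mirroring the documented semantics; the Cloud Healthcare API does its own matching server side:

```python
import re

def gcs_wildcard_to_regex(pattern):
    """Translate the documented import wildcards into a compiled regex."""
    parts = []
    i = 0
    while i < len(pattern):
        if pattern[i:i + 2] == "**":
            parts.append(".*")          # ** matches separators too
            i += 2
        elif pattern[i] == "*":
            parts.append("[^/]*")       # * stops at directory separators
            i += 1
        elif pattern[i] == "?":
            parts.append(".")           # ? matches exactly one character
            i += 1
        else:
            parts.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(parts) + r"\Z")
```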
The following sample shows the `gcloud healthcare fhir-stores import gcs` command.

```
gcloud healthcare fhir-stores import gcs FHIR_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --gcs-uri=gs://BUCKET/DIRECTORY/FHIR_RESOURCE_NAME.ndjson
```
To specify the structure of the FHIR source files, use the `--content-structure` flag.
The command line displays the operation ID and, after the operation completes, `done`:

```
Request issued for: [FHIR_STORE_ID]
Waiting for operation [OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID
```
To view more details of the operation, run the `gcloud healthcare operations describe` command, providing the OPERATION_ID from the response:

```
gcloud healthcare operations describe OPERATION_ID \
  --dataset=DATASET_ID
```
The response includes `done: true`.

```
done: true
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.fhir.FhirService.ImportResources
  createTime: 'CREATE_TIME'
  endTime: 'END_TIME'
  logsUrl: https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL
  counter:
    success: 'SUCCESS_COUNT'
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID
response:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ImportResourcesResponse
  fhirStore: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID
```
API
To import FHIR resources into a FHIR store, use the `projects.locations.datasets.fhirStores.import` method.
- The location of the files within the bucket is arbitrary and does not have to adhere exactly to the format specified in the following samples.
- When specifying the location of the FHIR resources in Cloud Storage, you can use wildcards to import multiple files from one or more directories. The following wildcards are supported:
  - Use `*` to match 0 or more non-separator characters. For example, `gs://BUCKET/DIRECTORY/Example*.ndjson` matches Example.ndjson and Example22.ndjson in DIRECTORY.
  - Use `**` to match 0 or more characters (including separators). Must be used at the end of a path and with no other wildcards in the path. Can also be used with a filename extension (such as .ndjson), which imports all files with the filename extension in the specified directory and its subdirectories. For example, `gs://BUCKET/DIRECTORY/**.ndjson` imports all files with the .ndjson filename extension in DIRECTORY and its subdirectories.
  - Use `?` to match 1 character. For example, `gs://BUCKET/DIRECTORY/Example?.ndjson` matches Example1.ndjson but does not match Example.ndjson or Example01.ndjson.
curl
To import FHIR resources into a FHIR store, make a `POST` request and specify the following information:
- The name of the parent dataset
- The name of the FHIR store
- The location of the object in a Cloud Storage bucket
- An access token
The following sample shows how to import a single file using a `POST` request with `curl`.
```
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'contentStructure': 'CONTENT_STRUCTURE',
      'gcsSource': {
        'uri': 'gs://BUCKET/DIRECTORY/FHIR_RESOURCE_FILE'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID:import"
```
If the request is successful, the server returns the response in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}
```
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
```
curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
```
If the request is successful, the server returns a response with the status of the operation in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.fhir.FhirService.ImportResources",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter": {
      "success": "SUCCESS_COUNT"
    }
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ImportResourcesResponse"
  }
}
```
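When polling an operation from a script, the two values you usually care about are whether the operation is done and how many resources succeeded. The helper below is a hypothetical sketch that assumes a response dict shaped like the sample above:

```python
def operation_finished(operation):
    """Summarize an operation-status response.

    Returns (done, success_count); field names mirror the sample JSON,
    where counter values arrive as strings.
    """
    done = operation.get("done", False)
    counter = operation.get("metadata", {}).get("counter", {})
    return done, int(counter.get("success", 0))
```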
PowerShell
To import FHIR resources into a FHIR store, make a `POST` request and specify the following information:
- The name of the parent dataset
- The name of the FHIR store
- The location of the object in a Cloud Storage bucket
- An access token
The following sample shows a `POST` request using Windows PowerShell.
```
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'contentStructure': 'CONTENT_STRUCTURE',
    'gcsSource': {
      'uri': 'gs://BUCKET/DIRECTORY/FHIR_RESOURCE_FILE'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID:import" | Select-Object -Expand Content
```
If the request is successful, the server returns the response in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}
```
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
```
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
```
If the request is successful, the server returns a response with the status of the operation in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.fhir.FhirService.ImportResources",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter": {
      "success": "SUCCESS_COUNT"
    }
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ImportResourcesResponse"
  }
}
```
Exporting FHIR resources
The following samples show how to export FHIR resources to a Cloud Storage bucket. When you export FHIR resources from a FHIR store, all resources in the FHIR store are exported.
If you export FHIR resources on a schedule, consider exporting your data incrementally. For instructions, see Incremental exports.
During export, the Cloud Healthcare API creates one file for each resource
type from the FHIR store. The filename consists of the operation ID and the
resource type separated by an underscore. Each file consists of
newline-delimited JSON where each line is a FHIR resource corresponding to the
resource type in the filename. For example, if you export multiple Patient records, the output file will be named something similar to `1264567891234567_Patient` and will contain a line for each Patient resource from the FHIR store.
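The naming and grouping convention above can be sketched in a few lines. This is illustrative only: the split is performed by the Cloud Healthcare API during export, and the helper names here are hypothetical:

```python
def export_filename(operation_id, resource_type):
    """Expected name of an exported file: operation ID, underscore, type."""
    return f"{operation_id}_{resource_type}"

def group_by_resource_type(resources):
    """Group resources by resourceType, mirroring the one-file-per-type split."""
    groups = {}
    for resource in resources:
        groups.setdefault(resource["resourceType"], []).append(resource)
    return groups
```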
Console
To export FHIR resources to Cloud Storage, complete the following steps:
- In the Google Cloud console, go to the Datasets page.
- Click the dataset that contains the FHIR store from which you are exporting FHIR resources.
- In the list of data stores, choose Export from the Actions list for the FHIR store. The Export FHIR resources page appears.
- Select Google Cloud Storage Bucket.
- In the Project list, select a Cloud Storage project.
- In the Location list, select a Cloud Storage bucket.
- Click Export to export FHIR resources to the defined location in Cloud Storage.
- To track the status of the operation, click the Operations tab. After the operation
completes, the following indications appear:
- The Long-running operation status section has a green check mark under the OK heading.
- The Overview section has a green check mark and an OK indicator in the same row as the operation ID.
gcloud
To export FHIR resources to a Cloud Storage bucket, use the `gcloud healthcare fhir-stores export gcs` command. Specify the following information:

- The name of the parent dataset
- The name of the FHIR store
- The name of the parent project
- The destination Cloud Storage bucket or directory. Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one object for each resource type. Each object consists of newline-delimited JSON where each line is a FHIR resource. If you specify a directory that doesn't exist, it's created.
- An optional flag, `--resource-type`, that only exports specific resource types, defined as a comma-separated list of one or more FHIR resource types
- An optional flag, `--since`, that only exports resources updated after a specific time, defined as `YYYY-MM-DDThh:mm:ss.sss+zz:zz`
The following sample shows the `gcloud healthcare fhir-stores export gcs` command.

```
gcloud healthcare fhir-stores export gcs FHIR_STORE_ID \
  --dataset=DATASET_ID \
  --location=LOCATION \
  --project=PROJECT_ID \
  --gcs-uri=gs://BUCKET/DIRECTORY
```
The command line displays the operation ID:

```
Waiting for operation [OPERATION_ID] to complete...done.
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID
```
To view the status of the operation, run the `gcloud healthcare operations describe` command, providing the OPERATION_ID from the response:

```
gcloud healthcare operations describe OPERATION_ID \
  --dataset=DATASET_ID
```
After the command completes, the response includes `done`.

```
metadata:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata
  apiMethodName: google.cloud.healthcare.v1.fhir.FhirService.ExportFhirData
  createTime: "CREATE_TIME"
  endTime: "END_TIME"
name: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID
response:
  '@type': type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ExportResourcesResponse
  fhirStore: projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID
  resourceCount: 'RESOURCE_COUNT'
```
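The `--since` flag expects the `YYYY-MM-DDThh:mm:ss.sss+zz:zz` layout shown above, which is not exactly what `datetime.isoformat()` produces (that emits microseconds, not milliseconds). A sketch of producing the expected format from a timezone-aware Python `datetime`, assuming millisecond truncation is acceptable:

```python
from datetime import datetime, timezone

def format_since(dt):
    """Format a timezone-aware datetime as YYYY-MM-DDThh:mm:ss.sss+zz:zz."""
    if dt.utcoffset() is None:
        raise ValueError("--since requires a timezone-aware timestamp")
    base = dt.strftime("%Y-%m-%dT%H:%M:%S")
    millis = dt.microsecond // 1000          # truncate microseconds to ms
    offset = dt.strftime("%z")               # e.g. +0000
    offset = offset[:3] + ":" + offset[3:]   # insert the colon: +00:00
    return f"{base}.{millis:03d}{offset}"
```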
API
To export FHIR resources, use the `projects.locations.datasets.fhirStores.export` method.
- Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one newline-delimited JSON file for each resource type. In each JSON file, each line is a FHIR resource.
- If the command specifies a directory that does not exist, the directory is created.
curl
To export FHIR resources, make a `POST` request and specify the following information:

- The name of the parent dataset
- The name of the FHIR store
- The destination Cloud Storage bucket
- An access token
- An optional field, `_type`, that only exports specific resource types, defined as a comma-separated list of one or more FHIR resource types
- An optional field, `_since`, that only exports resources updated after a specific time, defined as `YYYY-MM-DDThh:mm:ss.sss+zz:zz`
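If you build the request body programmatically, the optional fields slot in alongside `gcsDestination`. This sketch assumes `_type` and `_since` are passed as top-level body fields, as the field names above suggest; the helper name is hypothetical:

```python
import json

def build_export_body(uri_prefix, types=None, since=None):
    """Assemble an export request body with optional _type and _since fields."""
    body = {"gcsDestination": {"uriPrefix": uri_prefix}}
    if types:
        body["_type"] = ",".join(types)   # comma-separated resource types
    if since:
        body["_since"] = since            # YYYY-MM-DDThh:mm:ss.sss+zz:zz
    return json.dumps(body)
```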
The following sample shows a `POST` request using `curl`.

```
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
    --data "{
      'gcsDestination': {
        'uriPrefix': 'gs://BUCKET/DIRECTORY'
      }
    }" "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID:export"
```
If the request is successful, the server returns the response in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}
```
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
```
curl -X GET \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
```
If the request is successful, the server returns a response with the status of the operation in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.fhir.FhirService.ExportResources",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter": {
      "success": "SUCCESS_COUNT"
    }
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ExportResourcesResponse"
  }
}
```
PowerShell
To export FHIR resources, make a `POST` request and specify the following information:

- The name of the parent dataset
- The name of the FHIR store
- The destination Cloud Storage bucket or directory. Write to a Cloud Storage bucket or directory, rather than an object, because the Cloud Healthcare API creates one object for each resource type. Each object consists of newline-delimited JSON where each line is a FHIR resource.
- An access token
- An optional field, `_type`, that only exports specific resource types, defined as a comma-separated list of one or more FHIR resource types
- An optional field, `_since`, that only exports resources updated after a specific time, defined as `YYYY-MM-DDThh:mm:ss.sss+zz:zz`
The following sample shows a `POST` request using Windows PowerShell.

```
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Post `
  -Headers $headers `
  -ContentType: "application/json; charset=utf-8" `
  -Body "{
    'gcsDestination': {
      'uriPrefix': 'gs://BUCKET/DIRECTORY'
    }
  }" `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/fhirStores/FHIR_STORE_ID:export" | Select-Object -Expand Content
```
If the request is successful, the server returns the response in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID"
}
```
The response contains an operation name. To track the status of the operation,
you can use the
Operation get
method:
```
$cred = gcloud auth application-default print-access-token
$headers = @{ Authorization = "Bearer $cred" }
Invoke-WebRequest `
  -Method Get `
  -Headers $headers `
  -Uri "https://healthcare.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID" | Select-Object -Expand Content
```
If the request is successful, the server returns a response with the status of the operation in JSON format:
```
{
  "name": "projects/PROJECT_ID/locations/LOCATION/datasets/DATASET_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.OperationMetadata",
    "apiMethodName": "google.cloud.healthcare.v1.fhir.FhirService.ExportResources",
    "createTime": "CREATE_TIME",
    "endTime": "END_TIME",
    "logsUrl": "https://console.cloud.google.com/logs/query/CLOUD_LOGGING_URL",
    "counter": {
      "success": "SUCCESS_COUNT"
    }
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.healthcare.v1.fhir.rest.ExportResourcesResponse"
  }
}
```
Incremental exports
You can specify a timestamp to only export FHIR resources added to your FHIR store since a previous successful export. This improves performance and avoids the cost of re-exporting the entire FHIR store, while also ensuring your exported data is always up-to-date.
When calling `fhirStores.export`, specify the timestamp in the `_since` field.
Troubleshooting FHIR import and export requests
If errors occur during a FHIR import or export request, the errors are logged to Cloud Logging. For more information, see Viewing error logs in Cloud Logging.
If the entire operation returns an error, see Troubleshooting long-running operations.
What's next
- If you've successfully imported FHIR data and want to analyze the data in BigQuery, continue to Exporting FHIR data to BigQuery.