This template is deprecated and will be removed in Q3 2023. Please migrate to the Firestore to Cloud Storage Text template.
The Datastore to Cloud Storage Text template is a batch pipeline that reads Datastore entities and writes them to Cloud Storage as text files. You can provide a function to process each entity as a JSON string. If you don't provide such a function, every line in the output file will be a JSON-serialized entity.
Pipeline requirements
Datastore must be set up in the project before running the pipeline.
Template parameters
Parameter | Description |
---|---|
datastoreReadGqlQuery | A GQL query that specifies which entities to grab. For example, SELECT * FROM MyKind. |
datastoreReadProjectId | The Google Cloud project ID of the Datastore instance that you want to read data from. |
datastoreReadNamespace | The namespace of the requested entities. To use the default namespace, leave this parameter blank. |
javascriptTextTransformGcsPath | (Optional) The Cloud Storage URI of the .js file that defines the JavaScript user-defined function (UDF) you want to use. For example, gs://my-bucket/my-udfs/my_file.js. |
javascriptTextTransformFunctionName | (Optional) The name of the JavaScript user-defined function (UDF) that you want to use. For example, if your JavaScript function code is myTransform(inJson) { /*...do stuff...*/ }, then the function name is myTransform. For sample JavaScript UDFs, see UDF Examples. |
textWritePrefix | The Cloud Storage path prefix to specify where the data is written. For example, gs://mybucket/somefolder/. |
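The template passes each entity to the UDF as a JSON string and writes whatever the function returns to the output file. As a minimal sketch, reusing the myTransform name from the example above, a UDF that parses the entity and adds a field might look like the following (the exportedAt field is purely illustrative, not part of the template):

```javascript
/**
 * Minimal UDF sketch for this template: receives one entity as a JSON
 * string, adds an illustrative field, and returns the modified JSON string.
 * The function name (myTransform) matches the example above; set
 * javascriptTextTransformFunctionName to whatever name you actually use.
 */
function myTransform(inJson) {
  var entity = JSON.parse(inJson);
  entity.exportedAt = new Date().toISOString(); // illustrative extra field
  return JSON.stringify(entity);
}
```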
Run the template
Console
- Go to the Dataflow Create job from template page.
- In the Job name field, enter a unique job name.
- Optional: For Regional endpoint, select a value from the drop-down menu. The default regional endpoint is us-central1. For a list of regions where you can run a Dataflow job, see Dataflow locations.
- From the Dataflow template drop-down menu, select the Datastore to Text Files on Cloud Storage template.
- In the provided parameter fields, enter your parameter values.
- Click Run job.
gcloud
In your shell or terminal, run the template:
gcloud dataflow jobs run JOB_NAME \
    --gcs-location gs://dataflow-templates/VERSION/Datastore_to_GCS_Text \
    --region REGION_NAME \
    --parameters \
datastoreReadGqlQuery="SELECT * FROM DATASTORE_KIND",\
datastoreReadProjectId=DATASTORE_PROJECT_ID,\
datastoreReadNamespace=DATASTORE_NAMESPACE,\
javascriptTextTransformGcsPath=PATH_TO_JAVASCRIPT_UDF_FILE,\
javascriptTextTransformFunctionName=JAVASCRIPT_FUNCTION,\
textWritePrefix=gs://BUCKET_NAME/output/
Replace the following:
- JOB_NAME: a unique job name of your choice
- REGION_NAME: the regional endpoint where you want to deploy your Dataflow job, for example us-central1
- VERSION: the version of the template that you want to use. You can use the following values:
  - latest to use the latest version of the template, which is available in the non-dated parent folder in the bucket: gs://dataflow-templates/latest/
  - the version name, like 2021-09-20-00_RC00, to use a specific version of the template, which can be found nested in the respective dated parent folder in the bucket: gs://dataflow-templates/
- BUCKET_NAME: the name of your Cloud Storage bucket
- DATASTORE_PROJECT_ID: the Google Cloud project ID where the Datastore instance exists
- DATASTORE_KIND: the type of your Datastore entities
- DATASTORE_NAMESPACE: the namespace of your Datastore entities
- JAVASCRIPT_FUNCTION: the name of the JavaScript user-defined function (UDF) that you want to use. For example, if your JavaScript function code is myTransform(inJson) { /*...do stuff...*/ }, then the function name is myTransform. For sample JavaScript UDFs, see UDF Examples.
- PATH_TO_JAVASCRIPT_UDF_FILE: the Cloud Storage URI of the .js file that defines the JavaScript user-defined function (UDF) you want to use, for example gs://my-bucket/my-udfs/my_file.js
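As a sketch of what a filled-in command might look like, the following uses hypothetical values (my-project as the project ID, my-bucket as the bucket, and the MyKind and myTransform names from the examples above); datastoreReadNamespace is omitted to read the default namespace:

```
# Hypothetical example invocation; replace the values with your own.
gcloud dataflow jobs run datastore-to-gcs-text-example \
    --gcs-location gs://dataflow-templates/latest/Datastore_to_GCS_Text \
    --region us-central1 \
    --parameters \
datastoreReadGqlQuery="SELECT * FROM MyKind",\
datastoreReadProjectId=my-project,\
javascriptTextTransformGcsPath=gs://my-bucket/my-udfs/my_file.js,\
javascriptTextTransformFunctionName=myTransform,\
textWritePrefix=gs://my-bucket/output/
```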
API
To run the template using the REST API, send an HTTP POST request. For more information on the API and its authorization scopes, see projects.templates.launch.
POST https://dataflow.googleapis.com/v1b3/projects/PROJECT_ID/locations/LOCATION/templates:launch?gcsPath=gs://dataflow-templates/VERSION/Datastore_to_GCS_Text
{
   "jobName": "JOB_NAME",
   "parameters": {
       "datastoreReadGqlQuery": "SELECT * FROM DATASTORE_KIND",
       "datastoreReadProjectId": "DATASTORE_PROJECT_ID",
       "datastoreReadNamespace": "DATASTORE_NAMESPACE",
       "javascriptTextTransformGcsPath": "PATH_TO_JAVASCRIPT_UDF_FILE",
       "javascriptTextTransformFunctionName": "JAVASCRIPT_FUNCTION",
       "textWritePrefix": "gs://BUCKET_NAME/output/"
   },
   "environment": { "zone": "us-central1-f" }
}
Replace the following:
- PROJECT_ID: the Google Cloud project ID where you want to run the Dataflow job
- JOB_NAME: a unique job name of your choice
- LOCATION: the regional endpoint where you want to deploy your Dataflow job, for example us-central1
- VERSION: the version of the template that you want to use. You can use the following values:
  - latest to use the latest version of the template, which is available in the non-dated parent folder in the bucket: gs://dataflow-templates/latest/
  - the version name, like 2021-09-20-00_RC00, to use a specific version of the template, which can be found nested in the respective dated parent folder in the bucket: gs://dataflow-templates/
- BUCKET_NAME: the name of your Cloud Storage bucket
- DATASTORE_PROJECT_ID: the Google Cloud project ID where the Datastore instance exists
- DATASTORE_KIND: the type of your Datastore entities
- DATASTORE_NAMESPACE: the namespace of your Datastore entities
- JAVASCRIPT_FUNCTION: the name of the JavaScript user-defined function (UDF) that you want to use. For example, if your JavaScript function code is myTransform(inJson) { /*...do stuff...*/ }, then the function name is myTransform. For sample JavaScript UDFs, see UDF Examples.
- PATH_TO_JAVASCRIPT_UDF_FILE: the Cloud Storage URI of the .js file that defines the JavaScript user-defined function (UDF) you want to use, for example gs://my-bucket/my-udfs/my_file.js
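One way to send this request, sketched here under the assumption that the JSON body above has been saved to a local file named request.json (an arbitrary name), is with curl and an access token from the gcloud CLI:

```
# Sketch: POST the launch request with curl, using gcloud for authentication.
# request.json holds the JSON body shown above with your values filled in.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://dataflow.googleapis.com/v1b3/projects/PROJECT_ID/locations/LOCATION/templates:launch?gcsPath=gs://dataflow-templates/VERSION/Datastore_to_GCS_Text"
```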