DatastoreV1.Read (Google Cloud Dataflow SDK 1.9.1 API)

Google Cloud Dataflow SDK for Java, version 1.9.1

com.google.cloud.dataflow.sdk.io.datastore

Class DatastoreV1.Read

    • Field Detail

      • NUM_QUERY_SPLITS_MAX

        public static final int NUM_QUERY_SPLITS_MAX
        An upper bound on the number of splits for a query.
        See Also:
        Constant Field Values
    • Method Detail

      • withQuery

        public DatastoreV1.Read withQuery(com.google.datastore.v1.Query query)
        Returns a new DatastoreV1.Read that reads the results of the specified query.

        Note: Normally, DatastoreIO will read from Cloud Datastore in parallel across many workers. However, when the Query is configured with a limit using Query.Builder#setLimit, then all results will be read by a single worker in order to ensure correct results.
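        For illustration, a typical read might be wired up as follows. This is a sketch, not output of this page: the project ID and kind name are placeholders, and `DatastoreIO.v1().read()` is assumed as the usual way to obtain a `DatastoreV1.Read` in this SDK version.

        ```java
        import com.google.cloud.dataflow.sdk.Pipeline;
        import com.google.cloud.dataflow.sdk.io.datastore.DatastoreIO;
        import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
        import com.google.cloud.dataflow.sdk.values.PCollection;
        import com.google.datastore.v1.Entity;
        import com.google.datastore.v1.Query;

        // Build a query over a kind; "Widget" is a placeholder kind name.
        Query.Builder qb = Query.newBuilder();
        qb.addKindBuilder().setName("Widget");
        // Note: setting a limit on the query (qb.setLimit(...)) would force
        // all results to be read by a single worker, as described above.
        Query query = qb.build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        PCollection<Entity> entities = p.apply(
            DatastoreIO.v1().read()
                .withProjectId("my-project")  // placeholder project ID
                .withQuery(query));
        ```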

      • withNumQuerySplits

        public DatastoreV1.Read withNumQuerySplits(int numQuerySplits)
        Returns a new DatastoreV1.Read that reads by splitting the given query into numQuerySplits.

        The semantics of query splitting are as follows:

        • Any value less than or equal to 0 will be ignored, and the number of splits will be chosen dynamically at runtime based on the query data size.
        • Any value greater than NUM_QUERY_SPLITS_MAX will be capped at NUM_QUERY_SPLITS_MAX.
        • If the query has a user limit set, then numQuerySplits will be ignored and no split will be performed.
        • In some cases Cloud Datastore is unable to split a query into the requested number of splits. When that happens, the number of splits Datastore does return is used instead.
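        The rules above can be sketched in plain Java. This is an illustrative model of the documented semantics, not the SDK's internal implementation; the method name and the constant value shown are assumptions (see Constant Field Values for the real value of NUM_QUERY_SPLITS_MAX).

        ```java
        // A pure-Java sketch of the documented split-count semantics.
        public class QuerySplitSemantics {
            // Placeholder standing in for DatastoreV1.Read.NUM_QUERY_SPLITS_MAX;
            // consult Constant Field Values for the actual constant.
            static final int NUM_QUERY_SPLITS_MAX = 50000;

            /**
             * Models the documented rules: a query with a user limit is never
             * split (read by one worker); a non-positive request is ignored
             * (0 here means "chosen dynamically at runtime"); anything above
             * the maximum is capped.
             */
            static int effectiveNumQuerySplits(int requested, boolean queryHasLimit) {
                if (queryHasLimit) {
                    return 1;  // user limit set: no split, single worker
                }
                if (requested <= 0) {
                    return 0;  // ignored; chosen dynamically from query data size
                }
                return Math.min(requested, NUM_QUERY_SPLITS_MAX);  // capped
            }

            public static void main(String[] args) {
                System.out.println(effectiveNumQuerySplits(100, false));
                System.out.println(effectiveNumQuerySplits(-5, false));
                System.out.println(effectiveNumQuerySplits(60000, false));
                System.out.println(effectiveNumQuerySplits(100, true));
            }
        }
        ```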
      • getQuery

        @Nullable
        public com.google.datastore.v1.Query getQuery()
      • apply

        public PCollection<com.google.datastore.v1.Entity> apply(PBegin input)
        Applies this PTransform to the given InputT and returns its Output.

        Composite transforms, which are defined in terms of other transforms, should return the output of one of the composed transforms. Non-composite transforms, which do not apply any transforms internally, should return a new unbound output and register evaluators (via backend-specific registration methods).

        The default implementation throws an exception. A derived class must either implement apply, or else each runner must supply a custom implementation via PipelineRunner.apply(com.google.cloud.dataflow.sdk.transforms.PTransform<InputT, OutputT>, InputT).

        Overrides:
        apply in class PTransform<PBegin,PCollection<com.google.datastore.v1.Entity>>
      • validate

        public void validate(PBegin input)
        Description copied from class: PTransform
        Called before invoking apply (which may be intercepted by the runner) to verify this transform is fully specified and applicable to the specified input.

        By default, does nothing.

        Overrides:
        validate in class PTransform<PBegin,PCollection<com.google.datastore.v1.Entity>>
      • populateDisplayData

        public void populateDisplayData(DisplayData.Builder builder)
        Description copied from class: PTransform
        Register display data for the given transform or component.

        populateDisplayData(DisplayData.Builder) is invoked by Pipeline runners to collect display data via DisplayData.from(HasDisplayData). Implementations may call super.populateDisplayData(builder) in order to register display data in the current namespace, but should otherwise use subcomponent.populateDisplayData(builder) to use the namespace of the subcomponent.

        By default, does not register any display data. Implementors may override this method to provide their own display data.

        Specified by:
        populateDisplayData in interface HasDisplayData
        Overrides:
        populateDisplayData in class PTransform<PBegin,PCollection<com.google.datastore.v1.Entity>>
        Parameters:
        builder - The builder to populate with display data.
        See Also:
        HasDisplayData
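        As an illustration of the override pattern described above, a custom composite transform might register display data as follows. The transform, its field, and the item keys are hypothetical, not part of this class; only the `super.populateDisplayData(builder)` call and `DisplayData.item(...)` usage follow the documented pattern.

        ```java
        import com.google.cloud.dataflow.sdk.transforms.PTransform;
        import com.google.cloud.dataflow.sdk.transforms.display.DisplayData;
        import com.google.cloud.dataflow.sdk.values.PBegin;
        import com.google.cloud.dataflow.sdk.values.PCollection;

        // Hypothetical transform showing the documented pattern: call
        // super.populateDisplayData(builder) to register display data in the
        // current namespace, then add items of this transform's own.
        class MyReadTransform extends PTransform<PBegin, PCollection<String>> {
            private final String source = "gs://my-bucket/input";  // placeholder

            @Override
            public void populateDisplayData(DisplayData.Builder builder) {
                super.populateDisplayData(builder);
                builder.add(DisplayData.item("source", source)
                    .withLabel("Input Source"));
            }
        }
        ```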

