gcloud beta ml-engine predict

NAME
gcloud beta ml-engine predict - run AI Platform online prediction
SYNOPSIS
gcloud beta ml-engine predict --model=MODEL (--json-instances=JSON_INSTANCES | --json-request=JSON_REQUEST | --text-instances=TEXT_INSTANCES) [--region=REGION] [--signature-name=SIGNATURE_NAME] [--version=VERSION] [GCLOUD_WIDE_FLAG …]
DESCRIPTION
(BETA) gcloud beta ml-engine predict sends a prediction request to AI Platform for the given instances. This command reads up to 100 instances, though the service itself accepts requests up to the payload size limit (currently 1.5MB). To predict on more instances, use batch prediction via gcloud beta ml-engine jobs submit prediction.
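
For example, assuming a model named my_model and a local newline-delimited JSON file instances.json (both names are illustrative), an online prediction request could be sent with:

$ gcloud beta ml-engine predict --model=my_model \
    --json-instances=instances.json

The prediction results returned by the service are written to standard output, subject to --format and the other gcloud-wide output flags.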
REQUIRED FLAGS
--model=MODEL
Name of the model.
Exactly one of these must be specified:
--json-instances=JSON_INSTANCES
Path to a local file from which instances are read. Instances are in JSON format; newline delimited.

An example of the JSON instances file:

{"images": [0.0, …, 0.1], "key": 3}
{"images": [0.0, …, 0.1], "key": 2}

This flag accepts "-" for stdin.
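
For example, instances can be piped in by passing "-" (the model name my_model is illustrative):

$ cat instances.json | gcloud beta ml-engine predict \
    --model=my_model --json-instances=-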

--json-request=JSON_REQUEST
Path to a local file containing the body of the JSON request.

An example of a JSON request:

{
  "instances": [
    {"x": [1, 2], "y": [3, 4]},
    {"x": [-1, -2], "y": [-3, -4]}
  ]
}

This flag accepts "-" for stdin.
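
For example, a complete request body stored in a local file request.json (an illustrative name) could be sent with:

$ gcloud beta ml-engine predict --model=my_model \
    --json-request=request.json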

--text-instances=TEXT_INSTANCES
Path to a local file from which instances are read. Instances are in UTF-8 encoded text format; newline delimited.

An example of the text instances file:

107,4.9,2.5,4.5,1.7
100,5.7,2.8,4.1,1.3
…

This flag accepts "-" for stdin.
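
For example, assuming a local file instances.csv containing one UTF-8 text instance per line (the file and model names are illustrative):

$ gcloud beta ml-engine predict --model=my_model \
    --text-instances=instances.csv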

OPTIONAL FLAGS
--region=REGION
Google Cloud region of the regional endpoint to use for this command. For the global endpoint, the region needs to be specified as global.

Learn more about regional endpoints and see a list of available regions: https://cloud.google.com/ai-platform/prediction/docs/regional-endpoints

REGION must be one of: global, asia-east1, asia-northeast1, asia-southeast1, australia-southeast1, europe-west1, europe-west2, europe-west3, europe-west4, northamerica-northeast1, us-central1, us-east1, us-east4, us-west1.
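
For example, to send the request to the europe-west4 regional endpoint (model and file names are illustrative):

$ gcloud beta ml-engine predict --model=my_model \
    --json-instances=instances.json --region=europe-west4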

--signature-name=SIGNATURE_NAME
Name of the signature defined in the SavedModel to use for this job. Defaults to DEFAULT_SERVING_SIGNATURE_DEF_KEY in https://www.tensorflow.org/api_docs/python/tf/compat/v1/saved_model/signature_constants, which is "serving_default". Only applies to TensorFlow models.
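
For example, to use a non-default signature named my_signature exported in the SavedModel (signature, model, and file names are illustrative):

$ gcloud beta ml-engine predict --model=my_model \
    --signature-name=my_signature --json-instances=instances.json
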
--version=VERSION
Model version to be used.

If unspecified, the default version of the model will be used. To list model versions, run

$ gcloud beta ml-engine versions list
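
For example, to target a specific version v2 of the model instead of its default version (version, model, and file names are illustrative):

$ gcloud beta ml-engine predict --model=my_model --version=v2 \
    --json-instances=instances.json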
GCLOUD WIDE FLAGS
These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.

Run $ gcloud help for details.

NOTES
This command is currently in beta and might change without notice. These variants are also available:
gcloud ml-engine predict
gcloud alpha ml-engine predict