Output configuration for ModelExport Action.
The Google Cloud Storage location where the model is to be written to. This location may only be set for the following model formats: "tflite", "edgetpu_tflite", "tf_saved_model", "tf_js", "core_ml". Under the directory given as the destination, a new one with name "model-export-&lt;model-display-name&gt;-&lt;timestamp-of-export-call&gt;", where timestamp is in YYYY-MM-DDThh:mm:ss.sssZ ISO-8601 format, will be created. Inside, the model and any of its supporting files will be written.
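As an illustration of the naming convention, the timestamp suffix can be reproduced from a `datetime` as sketched below. The helper name is hypothetical; the actual directory is created server-side at export time, not by the client.

```python
from datetime import datetime, timezone

def export_dir_name(model_display_name: str, ts: datetime) -> str:
    """Build a "model-export-<model-display-name>-<timestamp>" directory
    name with the timestamp in YYYY-MM-DDThh:mm:ss.sssZ ISO-8601 format."""
    # strftime has no millisecond directive, so truncate microseconds to 3 digits.
    stamp = ts.strftime("%Y-%m-%dT%H:%M:%S.") + f"{ts.microsecond // 1000:03d}Z"
    return f"model-export-{model_display_name}-{stamp}"

# A fixed UTC instant yields a deterministic directory name.
ts = datetime(2024, 1, 2, 3, 4, 5, 678000, tzinfo=timezone.utc)
print(export_dir_name("my_model", ts))
# model-export-my_model-2024-01-02T03:04:05.678Z
```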
The format in which the model must be exported. The available, and default, formats depend on the problem and model type (if a given problem and type combination doesn't have a format listed, it means its models are not exportable):

- For Image Classification mobile-low-latency-1, mobile-versatile-1, mobile-high-accuracy-1: "tflite" (default), "edgetpu_tflite", "tf_saved_model", "tf_js", "docker".
- For Image Classification mobile-core-ml-low-latency-1, mobile-core-ml-versatile-1, mobile-core-ml-high-accuracy-1: "core_ml" (default).

Formats description:

- tflite - Used for Android mobile devices.
- edgetpu_tflite - Used for Edge TPU <https://cloud.google.com/edge-tpu/> devices.
- tf_saved_model - A TensorFlow model in SavedModel format.
- tf_js - A TensorFlow.js <https://www.tensorflow.org/js> model that can be used in the browser and in Node.js using JavaScript.
- docker - Used for Docker containers. Use the params field to customize the container. The container is verified to work correctly on the Ubuntu 16.04 operating system. See more at [containers quickstart](https://cloud.google.com/vision/automl/docs/containers-gcs-quickstart).
- core_ml - Used for iOS mobile devices.
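For a concrete sense of how these fields fit together, the sketch below builds the output config as a plain dict mirroring the v1beta1 JSON shape (`model_format` plus a `gcs_destination` with an `output_uri_prefix`). The helper function is hypothetical, and the bucket path is a placeholder; the real client library wraps this shape in typed protobuf messages.

```python
# Formats that may be written to a Google Cloud Storage destination,
# per the list above ("docker" targets a container registry instead).
GCS_EXPORT_FORMATS = {"tflite", "edgetpu_tflite", "tf_saved_model",
                      "tf_js", "core_ml"}

def make_export_output_config(output_uri_prefix: str,
                              model_format: str = "tflite") -> dict:
    """Return a dict shaped like ModelExportOutputConfig's JSON form."""
    if model_format not in GCS_EXPORT_FORMATS:
        raise ValueError(
            f"format {model_format!r} cannot use a GCS destination")
    return {
        "model_format": model_format,
        "gcs_destination": {"output_uri_prefix": output_uri_prefix},
    }

config = make_export_output_config("gs://my-bucket/exports/", "edgetpu_tflite")
print(config["model_format"])  # edgetpu_tflite
```

Passing "docker" here raises a ValueError, since Docker exports go to a container registry rather than a GCS prefix.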
Classes
ParamsEntry
API documentation for the automl_v1beta1.types.ModelExportOutputConfig.ParamsEntry class.