Exporting Edge models

After you have created (trained) a model, you can export your custom model. After exporting, you can deploy the model to a device.

You can export an image classification model in generic TensorFlow Lite format, Edge TPU-compiled TensorFlow Lite format, Core ML format, general TensorFlow (SavedModel) format, or TensorFlow.js for Web format to a Google Cloud Storage location using the ExportModel API.

Export to devices

TensorFlow Lite models

Web UI

  1. Open the Vision Dashboard and click the lightbulb icon in the left navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Click the row for the model you want to use to label your images.

  3. Click the Test & Use tab just below the title bar.

  4. In the Use your model section, select the TF Lite option, specify the export location on Cloud Storage in the side window, and then select Export to export your Edge TF Lite model.

    updated Export TF Lite model option

REST & CMD LINE

In the "modelFormat" field specify "tflite" (default).

Before using any of the request data below, make the following replacements:

  • project-id: your GCP project ID.
  • model-id: the ID of your model, from the response when you created the model. The ID is the last element of the name of your model. For example:
    • model name: projects/project-id/locations/location-id/models/IOD4412217016962778756
    • model id: IOD4412217016962778756
  • output-storage-bucket: a Google Cloud Storage bucket/directory to save output files to, expressed in the following form: gs://bucket/directory/. The requesting user must have write permission to the bucket.
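As a quick sanity check, the model ID is just the final path segment of the full model name, so it can be extracted with shell parameter expansion. A minimal sketch using the example name from the list above:

```shell
# The model ID is the last path segment of the full model resource name.
MODEL_NAME="projects/project-id/locations/location-id/models/IOD4412217016962778756"
MODEL_ID="${MODEL_NAME##*/}"   # drop everything up to and including the last "/"
echo "$MODEL_ID"
```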

HTTP method and URL:

POST https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

Request JSON body:

{
  "outputConfig": {
    "modelFormat": "tflite",
    "gcsDestination": {
      "outputUriPrefix": "output-storage-bucket/"
    }
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file called request.json, and execute the following command:
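For example, the file could be written with a heredoc; the bucket path below is a placeholder to replace with your own gs:// destination:

```shell
# Write the export request body to request.json.
# gs://your-bucket/your-directory/ is a placeholder output location.
cat > request.json <<'EOF'
{
  "outputConfig": {
    "modelFormat": "tflite",
    "gcsDestination": {
      "outputUriPrefix": "gs://your-bucket/your-directory/"
    }
  }
}
EOF
```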

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

PowerShell

Save the request body in a file called request.json, and execute the following command:

$cred = gcloud auth application-default print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{
  "name": "projects/project-id/locations/us-central1/operations/operation-id",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1.OperationMetadata",
    "createTime": "2019-07-22T21:23:21.643041Z",
    "updateTime": "2019-07-22T21:23:21.643041Z",
    "exportModelDetails": {
      "outputInfo": {
        "gcsOutputDirectory": "output-storage-bucket/model-export/icn/tflite-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ"
      }
    }
  }
}
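The export runs as a long-running operation, and the "name" field in the response identifies it. As a sketch, assuming the response was saved to a file named response.json, the operation name can be pulled out and polled via the generic v1 operations endpoint:

```shell
# Illustrative response.json containing just the operation name:
cat > response.json <<'EOF'
{ "name": "projects/project-id/locations/us-central1/operations/operation-id" }
EOF
# Extract the "name" field so the operation can be polled:
OP_NAME=$(sed -n 's/.*"name": *"\([^"]*\)".*/\1/p' response.json)
echo "$OP_NAME"
# Poll until the operation reports "done": true, for example:
#   curl -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
#     "https://automl.googleapis.com/v1/$OP_NAME"
```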

As a result you will see a folder structure in the directory you provided (cloud-storage-bucket/[directory]). The created folder structure will have the following general format (timestamp in ISO-8601 format):

  • cloud-storage-bucket/model-export/icn/model-type-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ

For example:

  • cloud-storage-bucket/model-export/icn/tf-saved-model-dataset-name-2019-07-22T21:25:35.135Z
  • cloud-storage-bucket/model-export/icn/tflite-dataset-name-2019-07-22T21:23:18.861Z

The folder contains a TensorFlow Lite model named model.tflite, a label file named dict.txt, and a tflite_metadata.json file.
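The labels in dict.txt appear one per line. A small illustrative sketch with made-up label values (a real dict.txt comes from the export folder, for example after downloading it with gsutil cp):

```shell
# Hypothetical dict.txt standing in for the exported label file:
printf 'daisy\ndandelion\nrose\n' > dict.txt
# Each line of dict.txt is one class label:
NUM_LABELS=$(wc -l < dict.txt)
echo "number of labels: $NUM_LABELS"
head -n 1 dict.txt
```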

Using the exported model

After exporting your model to a Google Cloud Storage bucket you can deploy your AutoML Vision Edge model on Android devices, iOS devices, or Raspberry Pi 3.

Core ML models

Web UI

  1. Open the Vision Dashboard and click the lightbulb icon in the left navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Click the row for the model you want to use to label your images.

  3. Click the Test & Use tab just below the title bar.

  4. In the Use your model section, select the Core ML option, specify the export location on Cloud Storage in the side window, and then click Export to export your Edge model.

    export Core ML model option

REST & CMD LINE

In the "modelFormat" field specify "core_ml".

Before using any of the request data below, make the following replacements:

  • project-id: your GCP project ID.
  • model-id: the ID of your model, from the response when you created the model. The ID is the last element of the name of your model. For example:
    • model name: projects/project-id/locations/location-id/models/IOD4412217016962778756
    • model id: IOD4412217016962778756
  • output-storage-bucket: a Google Cloud Storage bucket/directory to save output files to, expressed in the following form: gs://bucket/directory/. The requesting user must have write permission to the bucket.

HTTP method and URL:

POST https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

Request JSON body:

{
  "outputConfig": {
    "modelFormat": "core_ml",
    "gcsDestination": {
      "outputUriPrefix": "output-storage-bucket/"
    }
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file called request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

PowerShell

Save the request body in a file called request.json, and execute the following command:

$cred = gcloud auth application-default print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{
  "name": "projects/project-id/locations/us-central1/operations/operation-id",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1.OperationMetadata",
    "createTime": "2019-11-12T22:53:55.290584Z",
    "updateTime": "2019-11-12T22:53:55.290584Z",
    "exportModelDetails": {
      "outputInfo": {
        "gcsOutputDirectory": "output-storage-bucket/model-export/icn/core_ml-dataset-name_YYYY-MM-DDThh:mm:ss.sssZ/"
      }
    }
  }
}

As a result you will see a folder structure in the directory you provided (cloud-storage-bucket/[directory]). The created folder structure will have the following general format (timestamp in ISO-8601 format):

  • cloud-storage-bucket/model-export/icn/model-type-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ

For example:

  • cloud-storage-bucket/model-export/icn/tf-saved-model-dataset-name-2019-07-22T21:25:35.135Z
  • cloud-storage-bucket/model-export/icn/tflite-dataset-name-2019-07-22T21:23:18.861Z

The folder contains a Core ML model named model.mlmodel and a dict.txt label file.

Using the exported model

After exporting your model to a Google Cloud Storage bucket you can deploy your AutoML Vision Edge model on iOS devices.

Export to a container

Web UI

  1. Open the Vision Dashboard and click the lightbulb icon in the left navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Click the row for the model you want to use to label your images.

  3. Click the Test & Use tab just below the title bar.

  4. In the Use your model section, select the Container option, specify the export location on Cloud Storage in the side window, and then click Export to export your Edge model.

    export to container option

REST & CMD LINE

In the "modelFormat" field specify "tf-saved-model".

Before using any of the request data below, make the following replacements:

  • project-id: your GCP project ID.
  • model-id: the ID of your model, from the response when you created the model. The ID is the last element of the name of your model. For example:
    • model name: projects/project-id/locations/location-id/models/IOD4412217016962778756
    • model id: IOD4412217016962778756
  • output-storage-bucket: a Google Cloud Storage bucket/directory to save output files to, expressed in the following form: gs://bucket/directory/. The requesting user must have write permission to the bucket.

HTTP method and URL:

POST https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

Request JSON body:

{
  "outputConfig": {
    "modelFormat": "tf-saved-model",
    "gcsDestination": {
      "outputUriPrefix": "output-storage-bucket/"
    }
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file called request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

PowerShell

Save the request body in a file called request.json, and execute the following command:

$cred = gcloud auth application-default print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{
  "name": "projects/project-id/locations/us-central1/operations/operation-id",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1.OperationMetadata",
    "createTime": "2019-07-22T21:23:21.643041Z",
    "updateTime": "2019-07-22T21:23:21.643041Z",
    "exportModelDetails": {
      "outputInfo": {
        "gcsOutputDirectory": "output-storage-bucket/model-export/icn/tf-saved-model-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ"
      }
    }
  }
}

As a result you will see a folder structure in the directory you provided (cloud-storage-bucket/[directory]). The created folder structure will have the following general format (timestamp in ISO-8601 format):

  • cloud-storage-bucket/model-export/icn/model-type-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ

For example:

  • cloud-storage-bucket/model-export/icn/tf-saved-model-dataset-name-2019-07-22T21:25:35.135Z
  • cloud-storage-bucket/model-export/icn/tflite-dataset-name-2019-07-22T21:23:18.861Z

The folder contains a TensorFlow model named saved_model.pb.

Using the exported model

After exporting your model to a Google Cloud Storage bucket you can use your exported model to make predictions in a Docker image. See the Containers tutorial for instructions on deployment to a container.

Export to Edge TPU

Web UI

  1. Open the Vision Dashboard and click the lightbulb icon in the left navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Select the row for the model you want to use to label your images.

  3. Select the Test & Use tab just below the title bar.

  4. In the Use your model section, select the Coral option, specify the export location on Cloud Storage in the side window, and then click Export to export your Edge model.

    export coral (edgetpu tflite) option

REST & CMD LINE

In the "modelFormat" field specify "edgetpu_tflite".

Before using any of the request data below, make the following replacements:

  • project-id: your GCP project ID.
  • model-id: the ID of your model, from the response when you created the model. The ID is the last element of the name of your model. For example:
    • model name: projects/project-id/locations/location-id/models/IOD4412217016962778756
    • model id: IOD4412217016962778756
  • output-storage-bucket: a Google Cloud Storage bucket/directory to save output files to, expressed in the following form: gs://bucket/directory/. The requesting user must have write permission to the bucket.

HTTP method and URL:

POST https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

Request JSON body:

{
  "outputConfig": {
    "modelFormat": "edgetpu_tflite",
    "gcsDestination": {
      "outputUriPrefix": "output-storage-bucket/"
    }
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file called request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

PowerShell

Save the request body in a file called request.json, and execute the following command:

$cred = gcloud auth application-default print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{
  "name": "projects/project-id/locations/us-central1/operations/operation-id",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1.OperationMetadata",
    "createTime": "2019-11-12T22:55:03.554806Z",
    "updateTime": "2019-11-12T22:55:03.554806Z",
    "exportModelDetails": {
      "outputInfo": {
        "gcsOutputDirectory": "output-storage-bucket/model-export/icn/edgetpu_tflite-dataset-name_YYYY-MM-DDThh:mm:ss.sssZ/"
      }
    }
  }
}

As a result you will see a folder structure in the directory you provided (cloud-storage-bucket/[directory]). The created folder structure will have the following general format (timestamp in ISO-8601 format):

  • cloud-storage-bucket/model-export/icn/model-type-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ

For example:

  • cloud-storage-bucket/model-export/icn/tf-saved-model-dataset-name-2019-07-22T21:25:35.135Z
  • cloud-storage-bucket/model-export/icn/tflite-dataset-name-2019-07-22T21:23:18.861Z

The folder contains a TensorFlow Lite model named edgetpu_model.tflite, a label file named dict.txt, and a tflite_metadata.json file.

Using the exported model

For more information on how to deploy to Edge TPU, see Coral's official documentation about how to run an inference on the Edge TPU.

Export for Web

Web UI

  1. Open the Vision Dashboard and select the lightbulb icon in the side navigation bar to display the available models.

    To view the models for a different project, select the project from the drop-down list in the upper right of the title bar.

  2. Select the row for the model you want to use to label your images.

  3. Select the Test & Use tab just below the title bar.

  4. In the Use your model section, select the TensorFlow.js option, specify the export location on Cloud Storage in the side window, and then select Export to export your Web-ready TensorFlow.js model.

    export Tensorflow.js option

REST & CMD LINE

In the "modelFormat" field specify "tf_js".

Before using any of the request data below, make the following replacements:

  • project-id: your GCP project ID.
  • model-id: the ID of your model, from the response when you created the model. The ID is the last element of the name of your model. For example:
    • model name: projects/project-id/locations/location-id/models/IOD4412217016962778756
    • model id: IOD4412217016962778756
  • output-storage-bucket: a Google Cloud Storage bucket/directory to save output files to, expressed in the following form: gs://bucket/directory/. The requesting user must have write permission to the bucket.

HTTP method and URL:

POST https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

Request JSON body:

{
  "outputConfig": {
    "modelFormat": "tf_js",
    "gcsDestination": {
      "outputUriPrefix": "output-storage-bucket/"
    }
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file called request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export

PowerShell

Save the request body in a file called request.json, and execute the following command:

$cred = gcloud auth application-default print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://automl.googleapis.com/v1/projects/project-id/locations/us-central1/models/model-id:export" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{
  "name": "projects/project-id/locations/us-central1/operations/operation-id",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.automl.v1.OperationMetadata",
    "createTime": "2019-07-22T21:23:21.643041Z",
    "updateTime": "2019-07-22T21:23:21.643041Z",
    "exportModelDetails": {
      "outputInfo": {
        "gcsOutputDirectory": "output-storage-bucket/model-export/icn/tf_js-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ"
      }
    }
  }
}

As a result, you will see a folder in the directory you provided (${USER_GCS_PATH}). The created folder is named with a timestamp, in the format /model-export/icn/tf_js-dataset-name-YYYY-MM-DDThh:mm:ss.sssZ (for example, tf_js-edge_model-2019-10-03T17:24:46.999Z).

The folder contains binary files (.bin), a label file named dict.txt, and a model.json file.

TensorFlow.js folder example