Edge device model quickstart

This quickstart walks you through the process of:

  • Copying a set of images into Google Cloud Storage.
  • Creating a CSV listing the images and their labels.
  • Using AutoML Vision to create your dataset, train a custom AutoML Vision Edge model, and make a prediction.
  • Exporting and deploying your AutoML Vision Edge model.

Before you begin

Set up your project

  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. In the Cloud Console, on the project selector page, select or create a Cloud project.

    Go to the project selector page

  3. Make sure that billing is enabled for your Google Cloud project. Learn how to confirm billing is enabled for your project.

  4. Enable the AutoML and Cloud Storage APIs.

    Enable the APIs

  5. Install the gcloud command line tool.
  6. Follow the instructions to create a service account and download a key file for that account.
  7. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path to the service account key file that you downloaded when you created the service account:
    export GOOGLE_APPLICATION_CREDENTIALS=key-file-path
  8. Set the PROJECT_ID environment variable to your Project ID.
    export PROJECT_ID=your-project-id
    The AutoML API calls and resource names include your Project ID in them. The PROJECT_ID environment variable provides a convenient way to specify the ID.
  9. If you are an owner of your project, add your service account to the AutoML Editor IAM role, replacing service-account-name with the name of your new service account (for example, service-account1@myproject.iam.gserviceaccount.com):
    gcloud auth login
    gcloud projects add-iam-policy-binding $PROJECT_ID \
       --member="serviceAccount:service-account-name" \
       --role="roles/automl.editor"
  10. If you are not a project owner, ask a project owner to add both your user ID and your service account to the AutoML Editor IAM role.
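As a sketch of how the binding fits together (the project ID and service account name below are placeholders, not values from your project):

```shell
# Assumption: placeholder project ID and service account name for illustration.
PROJECT_ID="my-project-123"
SERVICE_ACCOUNT="service-account1@${PROJECT_ID}.iam.gserviceaccount.com"

# gcloud expects service accounts in "serviceAccount:<email>" member form:
MEMBER="serviceAccount:${SERVICE_ACCOUNT}"
echo "${MEMBER}"

# The actual binding (requires gcloud and project-owner permissions):
# gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
#     --member="${MEMBER}" \
#     --role="roles/automl.editor"
```

The member prefix matters: user accounts use `user:`, service accounts use `serviceAccount:`.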

Create a Cloud Storage bucket

Use Cloud Shell, a browser-based Linux command line connected to your Cloud Console project, to create your Cloud Storage bucket:

  1. Open Cloud Shell.

  2. Create a Google Cloud Storage bucket. The bucket name must be in the format project-id-vcm. The following command creates a storage bucket named project-id-vcm in the us-central1 region (replace project-id with your project ID). For a complete list of available regions, see the Bucket Locations page.

    gsutil mb -p project-id -c regional -l us-central1 gs://project-id-vcm/


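Later commands in this quickstart refer to the bucket through a BUCKET environment variable. A minimal sketch of setting it, assuming the project-id-vcm naming convention above (my-project-123 is a placeholder):

```shell
# Assumption: placeholder project ID; substitute your own.
PROJECT_ID="my-project-123"
# The bucket name must follow the project-id-vcm format:
export BUCKET="${PROJECT_ID}-vcm"
echo "${BUCKET}"
# → my-project-123-vcm

# Bucket creation itself (requires gsutil):
# gsutil mb -p "${PROJECT_ID}" -c regional -l us-central1 "gs://${BUCKET}/"
```
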
Copy the sample images into your bucket

Next, copy the flower dataset used in this TensorFlow blog post. The images are stored in a public Cloud Storage bucket, so you can copy them directly from there to your own bucket.

  1. In your Cloud Shell session, enter:

    gsutil -m cp -R gs://cloud-ml-data/img/flower_photos/ gs://${BUCKET}/img/

    The file copying takes about 20 minutes to complete.

Create the CSV file

The sample dataset contains a CSV file with all of the image locations and the labels for each image. You'll use that to create your own CSV file:

  1. Update the CSV file to point to the files in your own bucket:

    gsutil cat gs://${BUCKET}/img/flower_photos/all_data.csv | sed "s:cloud-ml-data:${BUCKET}:" > all_data.csv
  2. Copy the CSV file into your bucket:

    gsutil cp all_data.csv gs://${BUCKET}/csv/
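
To see what the sed rewrite in step 1 does, here is the same substitution applied to a single illustrative row (the file name is made up; the real CSV uses the same image-URI,label shape):

```shell
# Assumption: BUCKET and the CSV row below are illustrative values.
BUCKET="my-project-123-vcm"
ROW="gs://cloud-ml-data/img/flower_photos/daisy/example.jpg,daisy"

# Same substitution as step 1: swap the public bucket for your own.
echo "${ROW}" | sed "s:cloud-ml-data:${BUCKET}:"
# → gs://my-project-123-vcm/img/flower_photos/daisy/example.jpg,daisy
```
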

Create your dataset

Visit the AutoML Vision UI to begin the process of creating your dataset and training your model.

When prompted, make sure to select the project that you used for your Cloud Storage bucket.

  1. From the AutoML Vision page, click New Dataset:

    New dataset button in console

  2. Specify a name for this dataset. Click the + sign to continue.

    New dataset name field

  3. Specify the Cloud Storage URI of your CSV file. For this quickstart, the CSV file is at gs://your-project-123-vcm/csv/all_data.csv. Make sure to replace your-project-123 with your specific project ID.

  4. Click Create Dataset. The import process takes a few minutes. When it completes, you are taken to a page with details on all of the images identified for your dataset, both labeled and unlabeled. You can filter images by label by selecting a label under Filter labels. If you are using the flower dataset, you will see a warning alert notifying you of repeated images, or of images with multiple labels when multi-label is not enabled.

    Filtering by label example

    • You can add additional images and update labels for new and existing images after you have imported a CSV file.

Train your model

  1. Once your dataset has been created and processed, select the Train tab to initiate model training.

    select train tab

  2. Select TRAIN NEW MODEL to continue.

    Train new model option

    This will open a pop-up window with training options.

  3. In the training pop-up window, select Edge as the model type. Then choose the model optimized for Best trade-off and specify your node hour budget.

    Train Edge model

  4. Select "Start training" to begin model training.

    Training is initiated for your model and should take about an hour. The training might stop earlier than the node hour budget you selected. The service will email you once training has completed, or if any errors occur.

Once training is complete, you can review evaluation metrics, and test and use the model.

Select the Evaluate tab to get more details on F1, Precision, and Recall scores.

Select an individual label under Filter labels to get details on true positives, false negatives and false positives.

Make a Prediction

Select the Predict tab for instructions on sending an image to your model for a prediction. You can also refer to Annotating images for examples.
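
If you call the model through the AutoML API rather than the UI, the request body wraps a base64-encoded image. A rough sketch of building that payload, assuming the v1 payload/image/imageBytes request shape (the string "test" stands in for real image bytes here):

```shell
# Assumption: "test" stands in for real image bytes; for a real request you
# would encode an image file instead, e.g. base64 < flower.jpg
# (flower.jpg is a hypothetical file name).
IMAGE_BYTES="$(printf '%s' 'test' | base64)"
PAYLOAD="$(printf '{"payload":{"image":{"imageBytes":"%s"}}}' "${IMAGE_BYTES}")"
echo "${PAYLOAD}"
```
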

Export and Deploy the Edge model

The final step in using an AutoML Vision Edge model is to export (optimize and download) and deploy (use) your model.

There are multiple ways to export and deploy your models for prediction on Edge devices.

In this quickstart you will use TensorFlow Lite (TF Lite) as an example. TF Lite models are easy to use and have a wide range of use cases.

  1. Under Use your Edge model, select the TFLite tab.

    Export TF Lite model

  2. Select Export to export a TF Lite package to your Cloud Storage bucket. The export process typically takes several minutes.

  3. After the export completes, select the Cloud Storage link to open the destination folder in the Google Cloud Platform Console.

In the Cloud Storage destination location you will find a folder named with a timestamp and the model format, containing the following files:

  • a TF Lite model file (model.tflite)
  • a dictionary file (dict.txt)
  • a metadata file (tflite_metadata.json)
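
Once exported, the three files can be copied to a local machine with gsutil. A sketch, with the bucket and timestamped folder names as placeholders since yours will differ:

```shell
# Assumptions: placeholder bucket and export folder names.
BUCKET="my-project-123-vcm"
EXPORT_DIR="model-export/tflite-20200101000000"   # hypothetical folder name

# The three files produced by the export:
for f in model.tflite dict.txt tflite_metadata.json; do
  echo "gs://${BUCKET}/${EXPORT_DIR}/${f}"
done

# Download them all (requires gsutil):
# gsutil cp "gs://${BUCKET}/${EXPORT_DIR}/*" ./local_model/
```
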

What's Next

With these files, you can follow tutorials to deploy on Android devices, iOS devices, Raspberry Pi 3, or the Web.

Other model use options

  • You can also export the model as a TensorFlow SavedModel and use it with a Docker container from the Container tab. See the container tutorial for how to export to a container.
  • You can export the model for running on an Edge TPU from the Edge devices tab. Then follow Coral's official documentation on how to run an inference on the Edge TPU.
  • You can check Format model for Core ML (iOS / macOS) before training to train a Core ML-supported model. After training, export the model from the Core ML tab and follow the Core ML tutorial.