Edge device model quickstart

This quickstart walks you through the process of:

  • Copying a set of images into Google Cloud Storage.
  • Creating a CSV listing the images and their labels.
  • Using AutoML Vision to create your dataset, train a custom AutoML Vision Edge model, and make a prediction.
  • Exporting and deploying your AutoML Vision Edge model.

Before you begin

Set up your project

  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to the project selector page

  3. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  4. Enable the AutoML and Cloud Storage APIs.

    Enable the APIs

  5. Install the gcloud command line tool.
  6. Follow the instructions to create a service account and download a key file for that account.
  7. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path to the service account key file that you downloaded when you created the service account.
    export GOOGLE_APPLICATION_CREDENTIALS=key-file
  8. Set the PROJECT_ID environment variable to your Project ID.
    export PROJECT_ID=your-project-id
    The AutoML API calls and resource names include your Project ID in them. The PROJECT_ID environment variable provides a convenient way to specify the ID.
  9. If you are an owner of your project, add your service account to the AutoML Editor IAM role, replacing service-account-name with the name of your new service account (for example, service-account1@myproject.iam.gserviceaccount.com).
    gcloud auth login
    gcloud projects add-iam-policy-binding $PROJECT_ID \
       --member="serviceAccount:service-account-name" \
       --role="roles/automl.editor"
    
  10. Otherwise (if you are not a project owner), ask a project owner to add both your user ID and your service account to the AutoML Editor IAM role.
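The project ID you set above is wired into every later call: AutoML resource names embed it along with the region. A minimal sketch of the naming scheme (DATASET_ID is a hypothetical placeholder; the real ID is assigned when a dataset is created):

```python
# AutoML resource names embed the project ID and region.
# DATASET_ID is a hypothetical placeholder; the real ID is assigned
# when the dataset is created.
project_id = "my-project-id"
location = "us-central1"  # AutoML Vision models train in us-central1
dataset_id = "DATASET_ID"

dataset_name = f"projects/{project_id}/locations/{location}/datasets/{dataset_id}"
print(dataset_name)
```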

Create a Cloud Storage bucket

Use Cloud Shell, a browser-based Linux command line connected to your Cloud Console project, to create your Cloud Storage bucket:

  1. Open Cloud Shell.

  2. Create a Google Cloud Storage bucket. The bucket name must be in the format: project-id-vcm. The following command creates a storage bucket in the us-central1 region named project-id-vcm. For a complete list of available regions, see the Bucket Locations page.

    gsutil mb -p ${PROJECT_ID} -c regional -l us-central1 gs://${PROJECT_ID}-vcm/

  3. Set the BUCKET variable.

    export BUCKET=${PROJECT_ID}-vcm

Copy the sample images into your bucket

Next, copy the flower dataset used in this TensorFlow blog post. The images are stored in a public Cloud Storage bucket, so you can copy them directly from there to your own bucket.

  1. In your Cloud Shell session, enter:

    gsutil -m cp -R gs://cloud-ml-data/img/flower_photos/ gs://${BUCKET}/img/
    

    The file copying takes about 20 minutes to complete.

Create the CSV file

The sample dataset contains a CSV file with all of the image locations and the labels for each image. You'll use that to create your own CSV file:

  1. Update the CSV file to point to the files in your own bucket:

    gsutil cat gs://${BUCKET}/img/flower_photos/all_data.csv | sed "s:cloud-ml-data:${BUCKET}:" > all_data.csv
    
  2. Copy the CSV file into your bucket:

    gsutil cp all_data.csv gs://${BUCKET}/csv/
    
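Each row of all_data.csv pairs an image's Cloud Storage URI with its label, and the sed command above swaps the public bucket name for your own. A minimal Python sketch of the same rewrite (the row and bucket name shown are illustrative):

```python
# Each row of all_data.csv has the form: gs://<bucket>/<path>,<label>
# The rewrite replaces the public bucket name with your own bucket,
# as the sed command above does. The bucket name below is an assumed
# example value of ${BUCKET}.
bucket = "my-project-id-vcm"

row = "gs://cloud-ml-data/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg,daisy"
rewritten = row.replace("cloud-ml-data", bucket, 1)

print(rewritten)
# gs://my-project-id-vcm/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg,daisy
```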

Create your dataset

Visit the AutoML Vision UI to begin the process of creating your dataset and training your model.

When prompted, make sure to select the project that you used for your Cloud Storage bucket.

  1. From the AutoML Vision page, click New Dataset:

    New dataset button in console

  2. Specify a name for this dataset (optional). Select Create dataset to continue the dataset creation process.

    New dataset name field

  3. In the Select files to import screen, choose the Select a CSV file on Cloud Storage radio option and specify the Cloud Storage URI of your CSV file. For this quickstart, the CSV file is at:

    • gs://${PROJECT_ID}-vcm/csv/all_data.csv

    Replace PROJECT_ID with your specific project ID.

    Select file import window

  4. Click Continue to begin image import. The import process takes a few minutes. When it completes, you are taken to the next page, which lists all of the images identified for your dataset, both labeled and unlabeled.

    Images listed after import finishes

Train your model

  1. Once your dataset has been created and processed, select the Train tab to initiate model training.

    select train tab

  2. Select Start training to continue. This will open a Train new model window with training options.

  3. In the Define your model section of the new model training window, change the model name (optional) and select the Edge model radio option. Select Continue to move to the following section.

    define your model section for training

  4. In the Optimize model for section accept the Best trade-off option and select Continue.

  5. In the Set a node hour budget section accept the suggested node budget (4 node hours).

    Train Edge model

  6. Select "Start training" to begin model training.

    Training is initiated for your model and should take about an hour, though it might finish earlier than the node hour budget you selected. The service will email you once training has completed, or if any errors occur.

Deploy the model

Before you can export your model, you must deploy it for use.

  1. To deploy your model, select the Test & use tab. In the tab click the Deploy model option near the model name.

  2. In the window that follows, specify 1 node to deploy on, and select Deploy to begin the model deployment process.

    choose node hours to deploy on

You will receive a notification when model deployment completes.

Export the model

The final step in using an AutoML Vision Edge model is to export (optimize and download) and deploy (use) your model.

There are multiple ways you can export and deploy your models to use for prediction on Edge devices.

In this quickstart you will use TensorFlow Lite (TF Lite) as an example. TF Lite models are easy to use and cover a wide range of use cases.

  1. Under the Use your model section of the Test & use tab, select the TF Lite option.

    Export TF Lite model

  2. In the Export TF Lite package window that opens, specify a Cloud Storage location to export your TF Lite package to, and select Export. The export process typically takes several minutes.

    Export TF Lite model side window

In the Google Cloud Storage destination location you will find a folder named with a timestamp and the model format, containing the following files:

  • a TF Lite model file (model.tflite)
  • a dictionary file (dict.txt)
  • a metadata file (tflite_metadata.json)
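The dictionary file holds one label per line, in the same order as the indices of the model's output tensor. A minimal sketch of mapping output scores to labels, assuming the five flower labels from this dataset (the scores are made up for illustration):

```python
# dict.txt contains one label per line; the line order matches the
# index order of the model's output tensor. The contents below mirror
# the five labels in the flower dataset used by this quickstart.
dict_txt = "daisy\ndandelion\nroses\nsunflowers\ntulips\n"
labels = dict_txt.splitlines()

# Hypothetical scores from model.tflite, one per label.
scores = [0.02, 0.05, 0.01, 0.88, 0.04]

# Pick the highest-scoring (label, score) pair.
best = max(zip(labels, scores), key=lambda pair: pair[1])
print(best)  # ('sunflowers', 0.88)
```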

What's Next

With these files, you can follow tutorials to deploy on Android devices, iOS devices, Raspberry Pi 3, or the Web.

Other model use options

  • You can export the model as a CoreML (iOS/macOS) supported model. After training, you can export the model by selecting the CoreML option in the Test & use tab, and follow the CoreML tutorial.
  • You can export the model for running on Edge TPU. After training, you can export the model by selecting the Coral option in the Test & use tab. After exporting your model, follow Coral's official documentation on how to run inference on the Edge TPU.
  • You can export the model as TensorFlow SavedModel and use it with a Docker container. After training, you can export the model by selecting the Container option in the Test & use tab, and follow the Edge containers tutorial on how to export to a container.
  • You can export the model for use in a browser or in Node.js as a TensorFlow.js model. After training, you can export the model by selecting the TensorFlow.js option in the Test & use tab, and follow the Edge TensorFlow.js tutorial.

Cleanup

If you no longer need your custom model or dataset, you can delete them.

Undeploy your model

Your model incurs charges while it is deployed.

  1. Select the Test & use tab just below the title bar.
  2. Select Remove deployment from the banner beneath your model name to open the undeploy option window.

    undeploy popup menu

  3. Select Remove deployment to undeploy the model.

    model deploying

  4. You will receive an email when model undeployment has completed.

Delete your project (optional)

To avoid unnecessary Google Cloud Platform charges, use the Cloud Console to delete your project if you do not need it.