Making predictions with imported TensorFlow models

Overview

This page shows you how to import TensorFlow models into a BigQuery ML dataset and use them to make predictions from a SQL query. You can import TensorFlow models using these interfaces:

  * The BigQuery web UI in the GCP Console
  * The bq command-line tool
  * The BigQuery REST API

For more information about importing TensorFlow models into BigQuery ML, including format and storage requirements, see The CREATE MODEL statement for importing TensorFlow models.

Importing TensorFlow models

To import a TensorFlow model into a dataset, follow these steps:

Console

  1. Go to the BigQuery web UI in the GCP Console.
  2. In the query editor, enter a CREATE MODEL statement like the following.

      CREATE OR REPLACE MODEL example_dataset.imported_tf_model
       OPTIONS (MODEL_TYPE='TENSORFLOW',
        MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')
    

    The above query imports a model located at gs://cloud-training-demos/txtclass/export/exporter/1549825580/* as a BigQuery ML model named imported_tf_model. The Cloud Storage URI ends in a wildcard character (*) so that BigQuery ML also imports any assets associated with the model. The imported model is a TensorFlow text classifier model that predicts which website published a given article title.

  3. Your new model should now appear in the Resources panel. As you expand each of the datasets in a project, models are listed along with the other BigQuery resources in the datasets. Models are indicated by the model icon.

  4. If you select the new model in the Resources panel, information about the model will appear below the Query editor.

    TensorFlow model info
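The Console steps above can also be scripted. The following is a minimal sketch that builds the same CREATE MODEL statement programmatically; the helper function name and templating are our own, and the resulting string would still need to be submitted through one of the interfaces described on this page (for example, the bq tool or the API).

```python
def create_model_sql(dataset: str, model: str, gcs_path: str) -> str:
    """Build a BigQuery ML statement that imports a TensorFlow SavedModel.

    The trailing wildcard in gcs_path matters: it tells BigQuery ML to
    also pick up the assets stored alongside the SavedModel.
    """
    return (
        f"CREATE OR REPLACE MODEL `{dataset}.{model}` "
        "OPTIONS (MODEL_TYPE='TENSORFLOW', "
        f"MODEL_PATH='{gcs_path}')"
    )

sql = create_model_sql(
    "example_dataset",
    "imported_tf_model",
    "gs://cloud-training-demos/txtclass/export/exporter/1549825580/*",
)
print(sql)
```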

CLI

Enter a bq query command like the following to import a TensorFlow model from Cloud Storage:

bq query \
--use_legacy_sql=false \
"CREATE MODEL
  mydataset.mymodel
OPTIONS
  (MODEL_TYPE='TENSORFLOW',
   MODEL_PATH='gs://bucket/path/to/saved_model/*')"

For example:

bq query --use_legacy_sql=false \
"CREATE OR REPLACE MODEL
  example_dataset.imported_tf_model
OPTIONS
  (MODEL_TYPE='TENSORFLOW',
    MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')"

After you import the model, it appears in the output of bq ls [dataset_name]:

$ bq ls example_dataset

       tableId        Type    Labels   Time Partitioning
 ------------------- ------- -------- -------------------
  imported_tf_model   MODEL

API

Insert a new job and populate the jobs#configuration.query property as in the following request body:

{
  "query": "CREATE MODEL `project_id.mydataset.mymodel` OPTIONS(MODEL_TYPE='TENSORFLOW', MODEL_PATH='gs://bucket/path/to/saved_model/*')",
  "useLegacySql": false
}
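A small sketch of assembling such a request body in Python. Because CREATE MODEL is a standard SQL statement, the sketch also sets useLegacySql to false (the jobs.query API defaults to legacy SQL); the dataset, model, and bucket names are the placeholders from the example above.

```python
import json

# Request body for the query job; CREATE MODEL requires standard SQL,
# so legacy SQL is explicitly disabled.
body = {
    "query": (
        "CREATE MODEL `project_id.mydataset.mymodel` "
        "OPTIONS(MODEL_TYPE='TENSORFLOW', "
        "MODEL_PATH='gs://bucket/path/to/saved_model/*')"
    ),
    "useLegacySql": False,
}
print(json.dumps(body, indent=2))
```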

Making predictions with imported TensorFlow models

To make predictions with imported TensorFlow models, follow these steps. The examples assume that you imported the TensorFlow model as shown in the preceding section.

Console

  1. Go to the BigQuery web UI in the GCP Console.
  2. In the query editor, enter a query using ML.PREDICT like the following.

     SELECT *
       FROM ML.PREDICT(MODEL example_dataset.imported_tf_model,
         (
          SELECT title AS input
          FROM `bigquery-public-data.hacker_news.stories`
         )
     )
    

    The above query uses the model named imported_tf_model in the dataset example_dataset in the current project to make predictions from input data in the public table stories from the dataset hacker_news in the project bigquery-public-data. In this case, the TensorFlow model's serving_input_fn function specifies that the model expects a single input string named "input," so the subquery assigns the alias input to the column in its SELECT statement.

    This query outputs results like the following. In this example, the model outputs the column dense_1, which contains an array of probability values, as well as an input column, which contains the corresponding string values from the input table. Each array element value represents the probability that the corresponding input string is an article title from a particular publication.

    Query results
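Each dense_1 array can be post-processed to pick the most likely publication. A minimal sketch, using the probabilities from one output row; the label list and its ordering are hypothetical, since the real mapping depends on how the classifier was trained.

```python
# Hypothetical label order -- the real mapping is defined by the model's training.
labels = ["class_0", "class_1", "class_2"]

# Probabilities from one ML.PREDICT output row (the dense_1 column).
dense_1 = [0.8611106276512146, 0.06648492068052292, 0.07240450382232666]

# The index of the highest probability selects the predicted class.
best = max(range(len(dense_1)), key=lambda i: dense_1[i])
print(labels[best], dense_1[best])
```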

CLI

Enter a command like the following to make predictions from input data in the table input_data using the imported TensorFlow model my_model:

bq query \
--use_legacy_sql=false \
'SELECT *
 FROM ML.PREDICT(
   MODEL `my_project`.my_dataset.my_model,
   (SELECT * FROM input_data))'

For example:

bq query \
--use_legacy_sql=false \
'SELECT *
FROM ML.PREDICT(
  MODEL tensorflow_sample.imported_tf_model,
  (SELECT title AS input FROM `bigquery-public-data.hacker_news.stories`))'

This example returns results like the following:

+----------------------------------------------------------------------+------------------------------------------------------------------------------------+
|                               dense_1                                |                                       input                                        |
+----------------------------------------------------------------------+------------------------------------------------------------------------------------+
|   ["0.8611106276512146","0.06648492068052292","0.07240450382232666"] | Appshare                                                                           |
|    ["0.6251608729362488","0.2989124357700348","0.07592673599720001"] | A Handfull of Gold.                                                                |
|   ["0.014276246540248394","0.972910463809967","0.01281337533146143"] | Fastest Growing Skin Care Supplement for Increased Hair, Skin and Nail Nourishment |
| ["0.9821603298187256","1.8601855117594823E-5","0.01782100833952427"] | R4 3ds sdhc                                                                        |
|   ["0.8611106276512146","0.06648492068052292","0.07240450382232666"] | Empréstimo Com Nome Sujo                                                           |
+----------------------------------------------------------------------+------------------------------------------------------------------------------------+
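Note that bq prints each dense_1 array with its elements serialized as strings. A small sketch of converting one such cell back to floats for downstream processing:

```python
import json

# A dense_1 cell exactly as it appears in the bq output above.
cell = '["0.8611106276512146","0.06648492068052292","0.07240450382232666"]'

# json.loads yields a list of strings; convert each element to a float.
probs = [float(p) for p in json.loads(cell)]
print(probs)
```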

API

Insert a new job and populate the jobs#configuration.query property as in the following request body:

{
  "query": "SELECT * FROM ML.PREDICT(MODEL `my_project`.my_dataset.my_model, (SELECT * FROM input_data))",
  "useLegacySql": false
}
