Make predictions with imported TensorFlow models
This page shows you how to import TensorFlow models into a BigQuery ML dataset and use them to make predictions from a SQL query. You can import TensorFlow models using these interfaces:
- The Google Cloud console
- The `bq query` command in the bq command-line tool
- The BigQuery API

For more information about importing TensorFlow models into BigQuery ML, including format and storage requirements, see The `CREATE MODEL` statement for importing TensorFlow models.
Import TensorFlow models
To import TensorFlow models into a dataset, follow these steps:
Console
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter a `CREATE MODEL` statement like the following:

CREATE OR REPLACE MODEL `example_dataset.imported_tf_model`
  OPTIONS (MODEL_TYPE='TENSORFLOW',
    MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')
The preceding query imports a model located at `gs://cloud-training-demos/txtclass/export/exporter/1549825580/*` as a BigQuery ML model named `imported_tf_model`. The Cloud Storage URI ends in a wildcard character (`*`) so that BigQuery ML also imports any assets associated with the model. The imported model is a TensorFlow text classifier model that predicts which website published a given article title.

Your new model should now appear in the Resources panel. As you expand each of the datasets in a project, models are listed along with the other BigQuery resources in the datasets. Models are indicated by the model icon.

If you select the new model in the Resources panel, information about the model appears below the Query editor.
bq
To import a TensorFlow model from Cloud Storage, run a batch query by entering a command like the following:
bq query \
--use_legacy_sql=false \
"CREATE MODEL
`mydataset.mymodel`
OPTIONS
(MODEL_TYPE='TENSORFLOW',
MODEL_PATH='gs://bucket/path/to/saved_model/*')"
For example:
bq query --use_legacy_sql=false \
"CREATE OR REPLACE MODEL
`example_dataset.imported_tf_model`
OPTIONS
(MODEL_TYPE='TENSORFLOW',
MODEL_PATH='gs://cloud-training-demos/txtclass/export/exporter/1549825580/*')"
After you import the model, it appears in the output of `bq ls [dataset_name]`:
$ bq ls example_dataset
tableId Type Labels Time Partitioning
------------------- ------- -------- -------------------
imported_tf_model MODEL
API
Insert a new job and populate the jobs#configuration.query property as in the following request body:
{
"query": "CREATE MODEL `project_id:mydataset.mymodel` OPTIONS(MODEL_TYPE='TENSORFLOW' MODEL_PATH='gs://bucket/path/to/saved_model/*')"
}
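If you prefer to submit the job from code rather than calling the REST endpoint directly, the following is a minimal sketch (not from the original page) using the google-cloud-bigquery Python client, which submits the same statement as a query job; `project_id`, `mydataset`, `mymodel`, and the Cloud Storage path are placeholders:

from google.cloud import bigquery

# The client submits the CREATE MODEL statement as a query job,
# equivalent to jobs.insert with the configuration.query property.
client = bigquery.Client(project="project_id")

sql = """
CREATE MODEL `project_id.mydataset.mymodel`
  OPTIONS (MODEL_TYPE='TENSORFLOW',
    MODEL_PATH='gs://bucket/path/to/saved_model/*')
"""
client.query(sql).result()  # Wait for the job to complete.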
BigQuery DataFrames
Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up ADC for a local development environment.
Import the model by using the `TensorFlowModel` object, as shown in the following sketch.
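A minimal sketch, assuming the bigframes package is installed and a billing project is configured through its options; the model path is the same public sample model used in the earlier examples:

from bigframes.ml.imported import TensorFlowModel

# Importing with a wildcard path also pulls in the model's
# associated assets.
imported_tensorflow_model = TensorFlowModel(
    model_path="gs://cloud-training-demos/txtclass/export/exporter/1549825580/*"
)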
Make predictions with imported TensorFlow models
To make predictions with imported TensorFlow models, follow these steps. The following examples assume you've imported the TensorFlow model as you did in the preceding example.
Console
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter a query using `ML.PREDICT` like the following:

SELECT *
  FROM ML.PREDICT(MODEL `example_dataset.imported_tf_model`,
    (
      SELECT title AS input
      FROM `bigquery-public-data.hacker_news.full`
    )
  )

The preceding query uses the model named `imported_tf_model` in the dataset `example_dataset` in the current project to make predictions from input data in the public table `full` from the dataset `hacker_news` in the project `bigquery-public-data`. In this case, the TensorFlow model's `serving_input_fn` function specifies that the model expects a single input string named `input`, so the subquery assigns the alias `input` to the column in the subquery's `SELECT` statement. A sketch of what such an export function can look like follows these steps.

In the query results, the model outputs the column `dense_1`, which contains an array of probability values, as well as an `input` column, which contains the corresponding string values from the input table. Each array element value represents the probability that the corresponding input string is an article title from a particular publication.
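For reference, a `serving_input_fn` that declares a single string input named `input` could look like the following TensorFlow 1.x Estimator sketch. This is an illustration of the export convention, not the code used to build the sample model:

import tensorflow as tf

def serving_input_fn():
    # One string placeholder named "input"; BigQuery ML matches the
    # aliased SQL column to this tensor name at prediction time.
    inputs = {"input": tf.placeholder(dtype=tf.string, shape=[None])}
    return tf.estimator.export.ServingInputReceiver(inputs, inputs)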
bq
To make predictions from input data in the table `input_data`, enter a command like the following, using the imported TensorFlow model `my_model`:
bq query \
--use_legacy_sql=false \
'SELECT *
FROM ML.PREDICT(
MODEL `my_project.my_dataset.my_model`,
(SELECT * FROM input_data))'
For example:
bq query \
--use_legacy_sql=false \
'SELECT *
FROM ML.PREDICT(
MODEL `tensorflow_sample.imported_tf_model`,
(SELECT title AS input FROM `bigquery-public-data.hacker_news.full`))'
This example returns results like the following:
+------------------------------------------------------------------------+----------------------------------------------------------------------------------+
|                                 dense_1                                  |                                      input                                       |
+------------------------------------------------------------------------+----------------------------------------------------------------------------------+
| ["0.6251608729362488","0.2989124357700348","0.07592673599720001"]        | How Red Hat Decides Which Open Source Companies t...                             |
| ["0.014276246540248394","0.972910463809967","0.01281337533146143"]       | Ask HN: Toronto/GTA mastermind around side income for big corp. dev?             |
| ["0.9821603298187256","1.8601855117594823E-5","0.01782100833952427"]     | Ask HN: What are good resources on strategy and decision making for your career? |
| ["0.8611106276512146","0.06648492068052292","0.07240450382232666"]       | Forget about promises, use harvests                                              |
+------------------------------------------------------------------------+----------------------------------------------------------------------------------+
API
Insert a new job and populate the jobs#configuration.query property as in the following request body:
{
"query": "SELECT * FROM ML.PREDICT(MODEL `my_project.my_dataset.my_model`, (SELECT * FROM input_data))"
}
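As with the import job, you can submit this prediction query from code; the following is a minimal sketch using the google-cloud-bigquery Python client, where `my_project`, `my_dataset`, `my_model`, and `input_data` are placeholders:

from google.cloud import bigquery

client = bigquery.Client(project="my_project")

sql = """
SELECT *
  FROM ML.PREDICT(
    MODEL `my_project.my_dataset.my_model`,
    (SELECT * FROM input_data))
"""
# Print each prediction row as a dictionary.
for row in client.query(sql).result():
    print(dict(row))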
BigQuery DataFrames
Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up ADC for a local development environment.
Use the `predict` function to run the remote model:
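A minimal sketch, assuming the model was imported as `imported_tensorflow_model` in the preceding DataFrames example; the column rename matches the input name the model expects:

import bigframes.pandas as bpd

# Read the public Hacker News table and rename the title column to
# the model's expected input name.
df = bpd.read_gbq("bigquery-public-data.hacker_news.full")[["title"]]
df = df.rename(columns={"title": "input"})

predictions = imported_tensorflow_model.predict(df)
predictions.head(5)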
The result contains a `dense_1` output column of probability arrays and an `input` column, similar to the bq output shown earlier.
What's next
- For more information about importing TensorFlow models, see The `CREATE MODEL` statement for importing TensorFlow models.
- For an overview of BigQuery ML, see Introduction to BigQuery ML.
- To get started using BigQuery ML, see Create machine learning models in BigQuery ML.
- For more information about working with models, see these resources: