Tutorial: Run inference on an object table by using a classification model
This tutorial shows you how to create an object table based on the images from a public dataset, and then run inference on that object table using the ResNet 50 model.
The ResNet 50 model
The ResNet 50 model analyzes image files and outputs a batch of vectors representing the likelihood that an image belongs to the corresponding class (logits). For more information, see the Usage section on the model's TensorFlow Hub page.
The ResNet 50 model input takes a tensor of DType = float32 in the shape [-1, 224, 224, 3]. The output is an array of tensors of tf.float32 in the shape [-1, 1024].
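If you want to confirm these shapes yourself, one option outside this tutorial's steps is TensorFlow's saved_model_cli tool, run against the SavedModel directory that you download later in this tutorial. The local path below is a placeholder:
# Print the serving signature, including input and output tensor shapes.
saved_model_cli show \
  --dir ./resnet50 \
  --tag_set serve \
  --signature_def serving_default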
Required permissions
- To create the dataset, you need the bigquery.datasets.create permission.
- To create the connection resource, you need the following permissions:
  - bigquery.connections.create
  - bigquery.connections.get
- To grant permissions to the connection's service account, you need the following permission:
  - resourcemanager.projects.setIamPolicy
- To create the object table, you need the following permissions:
  - bigquery.tables.create
  - bigquery.tables.update
  - bigquery.connections.delegate
- To create the bucket, you need the storage.buckets.create permission.
- To upload the model to Cloud Storage, you need the storage.objects.create and storage.objects.get permissions.
- To load the model into BigQuery ML, you need the following permissions:
  - bigquery.jobs.create
  - bigquery.models.create
  - bigquery.models.getData
  - bigquery.models.updateData
- To run inference, you need the following permissions:
  - bigquery.tables.getData on the object table
  - bigquery.models.getData on the model
  - bigquery.jobs.create
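If you prefer to work with predefined roles instead of granting each permission individually, one possible shortcut (broader than strictly required) is to grant the BigQuery Admin and Storage Admin roles to your account; note that resourcemanager.projects.setIamPolicy additionally requires a role such as Project IAM Admin. PROJECT_ID and USER_EMAIL are placeholders:
# Grant broad BigQuery and Cloud Storage roles to a user account.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member=user:USER_EMAIL \
  --role=roles/bigquery.admin
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member=user:USER_EMAIL \
  --role=roles/storage.admin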
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery: You incur storage costs for the object table you create in BigQuery.
- BigQuery ML: You incur costs for the model you create and the inference you perform in BigQuery ML.
- Cloud Storage: You incur costs for the objects you store in Cloud Storage.
To generate a cost estimate based on your projected usage,
use the pricing calculator.
For more information on BigQuery storage pricing, see Storage pricing in the BigQuery documentation.
For more information on BigQuery ML pricing, see BigQuery ML pricing in the BigQuery documentation.
For more information on Cloud Storage pricing, see the Cloud Storage pricing page.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
  Roles required to select or create a project
  - Select a project: Selecting a project doesn't require a specific IAM role; you can select any project that you've been granted a role on.
  - Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
- Verify that billing is enabled for your Google Cloud project.
- Enable the BigQuery and BigQuery Connection APIs.
  Roles required to enable APIs
  To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
Create a reservation
To use an imported model with an object table, you must create a reservation that uses the BigQuery Enterprise or Enterprise Plus edition, and then create a reservation assignment that uses the QUERY job type.
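A command-line sketch of these two steps follows; the flag names should be checked against the current bq CLI reference, and PROJECT_ID and RESERVATION_NAME are placeholders:
# Create an Enterprise edition reservation with 100 slots (sketch).
bq mk --reservation \
  --project_id=PROJECT_ID \
  --location=us \
  --edition=ENTERPRISE \
  --slots=100 \
  RESERVATION_NAME
# Assign the reservation to the project for QUERY jobs.
bq mk --reservation_assignment \
  --project_id=PROJECT_ID \
  --location=us \
  --reservation_id=PROJECT_ID:us.RESERVATION_NAME \
  --assignee_type=PROJECT \
  --assignee_id=PROJECT_ID \
  --job_type=QUERY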
Create a dataset
Create a dataset named resnet_inference_test:
SQL
Go to the BigQuery page.
In the Editor pane, run the following SQL statement:
CREATE SCHEMA `PROJECT_ID.resnet_inference_test`;
Replace PROJECT_ID with your project ID.
bq
In the Google Cloud console, activate Cloud Shell.
Run the bq mk command to create the dataset:
bq mk --dataset --location=us PROJECT_ID:resnet_inference_test
Replace PROJECT_ID with your project ID.
Create a connection
Create a connection named lake-connection:
Console
Go to the BigQuery page.
In the left pane, click Explorer.
If you don't see the left pane, click Expand left pane to open the pane.
In the Explorer pane, click Add data.
The Add data dialog opens.
In the Filter By pane, in the Data Source Type section, select Databases.
Alternatively, in the Search for data sources field, you can enter Vertex AI.
In the Featured data sources section, click Vertex AI.
Click the Vertex AI Models: BigQuery Federation solution card.
In the Connection type list, select Vertex AI remote models, remote functions, BigLake and Spanner (Cloud Resource).
In the Connection ID field, type
lake-connection.Click Create connection.
In the Connection info pane, copy the value from the Service account id field and save it somewhere. You need this information to grant permissions to the connection's service account.
bq
In Cloud Shell, run the bq mk command to create the connection:
bq mk --connection --location=us --connection_type=CLOUD_RESOURCE \
    lake-connection
Run the bq show command to retrieve information about the connection:
bq show --connection us.lake-connection
From the properties column, copy the value of the serviceAccountId property and save it somewhere. You need this information to grant permissions to the connection's service account.
Create a Cloud Storage bucket
Create a Cloud Storage bucket to contain the model files.
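If you prefer the command line over the console, a minimal sketch with the gcloud CLI follows; BUCKET_NAME is a placeholder for a globally unique bucket name:
# Create a bucket in the us multi-region to hold the model files.
gcloud storage buckets create gs://BUCKET_NAME --location=us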
Grant permissions to the connection's service account
Console
Go to the IAM & Admin page.
Click Grant Access.
The Add principals dialog opens.
In the New principals field, enter the service account ID that you copied earlier.
In the Select a role field, select Cloud Storage, and then select Storage Object Viewer.
Click Save.
gcloud
In Cloud Shell, run the gcloud storage buckets add-iam-policy-binding command:
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:MEMBER \
    --role=roles/storage.objectViewer
Replace MEMBER with the service account ID that you
copied earlier. Replace BUCKET_NAME with the name
of the bucket you previously created.
For more information, see Add a principal to a bucket-level policy.
Create an object table
Create an object table named vision_images based on the
image files in the public gs://cloud-samples-data/vision bucket:
SQL
Go to the BigQuery page.
In the Editor pane, run the following SQL statement:
CREATE EXTERNAL TABLE resnet_inference_test.vision_images
WITH CONNECTION `us.lake-connection`
OPTIONS(
  object_metadata = 'SIMPLE',
  uris = ['gs://cloud-samples-data/vision/*.jpg']
);
bq
In Cloud Shell, run the bq mk command to create the object table:
bq mk --table \
--external_table_definition='gs://cloud-samples-data/vision/*.jpg@us.lake-connection' \
--object_metadata=SIMPLE \
resnet_inference_test.vision_images
Upload the model to Cloud Storage
Get the model files and make them available in Cloud Storage:
- Download the ResNet 50 model to your local machine. This gives you a saved_model.pb file and a variables folder for the model.
- Upload the saved_model.pb file and the variables folder to the bucket you previously created (a command-line sketch follows this list).
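As a sketch, assuming the downloaded model files sit in a local resnet50 directory (a placeholder path) and BUCKET_NAME is the bucket you created, the upload can be done with the gcloud CLI:
# Copy the SavedModel file and the variables folder to the bucket.
gcloud storage cp resnet50/saved_model.pb gs://BUCKET_NAME/
gcloud storage cp --recursive resnet50/variables gs://BUCKET_NAME/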
Load the model into BigQuery ML
Go to the BigQuery page.
In the Editor pane, run the following SQL statement:
CREATE MODEL `resnet_inference_test.resnet`
OPTIONS(
  model_type = 'TENSORFLOW',
  model_path = 'gs://BUCKET_NAME/*');
Replace BUCKET_NAME with the name of the bucket you previously created.
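If you want to run the same statement from Cloud Shell instead of the console editor, a sketch with the bq CLI follows (BUCKET_NAME is still a placeholder):
# Run the CREATE MODEL statement through the bq CLI.
bq query --use_legacy_sql=false \
  "CREATE MODEL \`resnet_inference_test.resnet\`
   OPTIONS(
     model_type = 'TENSORFLOW',
     model_path = 'gs://BUCKET_NAME/*')"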
Inspect the model
Inspect the uploaded model to see what its input and output fields are:
Go to the BigQuery page.
In the left pane, click Explorer.
In the Explorer pane, expand your project, click Datasets, and then click the resnet_inference_test dataset.
Go to the Models tab.
Click the resnet model.
In the model pane that opens, click the Schema tab.
Look at the Labels section. This identifies the fields that are output by the model. In this case, the field name value is activation_49.
Look at the Features section. This identifies the fields that must be input into the model. You reference them in the SELECT statement for the ML.DECODE_IMAGE function. In this case, the field name value is input_1.
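You can also inspect the model from the command line; a minimal sketch with the bq CLI (the output layout differs from the console Schema tab):
# Show the model's metadata, including its feature and label columns.
bq show --model resnet_inference_test.resnet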
Run inference
Run inference on the vision_images object table using the resnet model:
Go to the BigQuery page.
In the Editor pane, run the following SQL statement:
SELECT *
FROM ML.PREDICT(
  MODEL `resnet_inference_test.resnet`,
  (SELECT uri, ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS input_1
   FROM resnet_inference_test.vision_images)
);
The results should look similar to the following:
-------------------------------------------------------------------------------------------------------------------------------------
| activation_49          | uri                                                                                           | input_1 |
-------------------------------------------------------------------------------------------------------------------------------------
| 1.0254175464297077e-07 | gs://cloud-samples-data/vision/automl_classification/flowers/daisy/21652746_cc379e0eea_m.jpg | 0.0     |
| 2.1671139620593749e-06 |                                                                                               | 0.0     |
| 8.346052027263795e-08  |                                                                                               | 0.0     |
| 1.159310958342985e-08  |                                                                                               | 0.0     |
-------------------------------------------------------------------------------------------------------------------------------------
Clean up
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
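Alternatively, a sketch of deleting the project with the gcloud CLI; PROJECT_ID is a placeholder, and this removes all resources in the project:
# Delete the project and all resources in it.
gcloud projects delete PROJECT_ID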