You can explore BigQuery query results by using Colab Enterprise notebooks in BigQuery.
In this tutorial, you query data from a BigQuery public dataset and explore the query results in a notebook.
Objectives
- Create and run a query in BigQuery.
- Explore query results in a notebook.
Costs
This tutorial uses a dataset available through the Google Cloud Public Datasets Program. Google pays for the storage of these datasets and provides public access to the data. You incur charges for the queries that you perform on the data. For more information, see BigQuery pricing.
Before you begin
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the BigQuery API. For new projects, BigQuery is automatically enabled.
Enable BigQuery Studio
Follow the instructions at Enable BigQuery Studio for asset management to save, share, and manage versions of code assets such as notebooks.
Required permissions
To create and run notebooks, you need the following Identity and Access Management (IAM) roles:
- BigQuery User (roles/bigquery.user)
- Notebook Runtime User (roles/aiplatform.notebookRuntimeUser)
- Code Creator (roles/dataform.codeCreator)
Open query results in a notebook
You can run a SQL query and then use a notebook to explore the data. This approach is useful if you want to modify the data in BigQuery before working with it, or if you need only a subset of the fields in the table.
- In the Google Cloud console, go to the BigQuery page.
- In the Type to search field, enter bigquery-public-data. If the project is not shown, enter bigquery in the search field, and then click Search to all projects to match the search string with the existing projects.
- Select bigquery-public-data > ml_datasets > penguins.
- For the penguins table, click View actions, and then click Query.
- Add an asterisk (*) for field selection to the generated query, so that it reads like the following example:
  SELECT * FROM `bigquery-public-data.ml_datasets.penguins` LIMIT 1000;
- Click Run.
- In the Query results section, click Explore data, and then click Explore with Python notebook.
Prepare the notebook for use
Prepare the notebook for use by connecting to a runtime and setting application default values.
- In the notebook header, click Connect to connect to the default runtime.
- In the Setup code block, click Run cell.
Explore the data
- To load the penguins data into a BigQuery DataFrame and show the results, click Run cell in the code block in the Result set loaded from BigQuery job as a DataFrame section.
- To get descriptive metrics for the data, click Run cell in the code block in the Show descriptive statistics using describe() section.
- Optional: Use other Python functions or packages to explore and analyze the data.
The following code sample uses bigframes.pandas to analyze data and bigframes.ml to create a linear regression model from the penguins data in a BigQuery DataFrame:
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.
The easiest way to eliminate billing is to delete the Google Cloud project that you created for this tutorial.
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
What's next
- Learn more about creating notebooks in BigQuery.
- Learn more about exploring data with BigQuery DataFrames.