
What’s happening in BigQuery: New features bring flexibility and scale to your data warehouse

October 29, 2019
Evan Jones

Technical Curriculum Developer, Google Cloud

We’ve been busy releasing new features for BigQuery, Google Cloud’s petabyte-scale data warehouse. BigQuery lets you ingest and analyze data quickly and with high availability, so you can find new insights, trends, and predictions to efficiently run your business. Our Google Cloud engineering team is continually making improvements to BigQuery to accelerate your time to value. 

Recently added BigQuery features include scripting and stored procedures, support for multiple statements in a single query, a simpler path for migrating from Teradata to BigQuery, and larger BI Engine reservations. These updates bring even more flexibility, simplicity, and choice to your data warehouse.

Read on to learn more about these new capabilities and get quick demos and tutorial links so you can try these features yourself.

BigQuery scripting and stored procedures are now in beta

Now available in beta, BigQuery scripting and stored procedures let you execute multiple statements in one request, so data engineers and analysts can efficiently complete complex tasks within BigQuery. You can build sophisticated workflows using control logic such as IF...THEN and WHILE, and save these tasks as stored procedures within BigQuery so they can be shared with other users, who can then invoke them with a CALL command.

You can also use variables to parameterize script execution, and use temporary tables to stage data at the session level. BigQuery users who are familiar with procedural languages will find it easy to use these new capabilities to accelerate data migrations or execute complex ETL logic, as well as save time through logic reuse.

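Here's a minimal sketch of what a script combining variables, control logic, and a stored procedure can look like. The dataset and procedure names (mydataset, count_corpora) are illustrative placeholders, and the query runs against the public Shakespeare sample table:

-- All variables are declared up front; DECLARE statements must come first in a script.
DECLARE target_words ARRAY<STRING> DEFAULT ['methinks', 'forsooth'];
DECLARE i INT64 DEFAULT 0;
DECLARE corpus_count INT64;

-- A stored procedure that counts how many corpora contain a given word.
-- Anyone with access to the dataset can reuse it with CALL.
CREATE OR REPLACE PROCEDURE mydataset.count_corpora(word_to_find STRING, OUT result_count INT64)
BEGIN
  SET result_count = (
    SELECT COUNT(DISTINCT corpus)
    FROM `bigquery-public-data.samples.shakespeare`
    WHERE word = word_to_find
  );
END;

-- Control logic: loop over the words and branch on each result.
WHILE i < ARRAY_LENGTH(target_words) DO
  CALL mydataset.count_corpora(target_words[OFFSET(i)], corpus_count);
  IF corpus_count > 10 THEN
    SELECT FORMAT('"%s" appears in %d corpora', target_words[OFFSET(i)], corpus_count);
  END IF;
  SET i = i + 1;
END WHILE;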

To learn more, check out our documentation.

Run multiple statements in one query

As part of the scripting and stored procedures release, you may have noticed that the previous code example contains multiple statements, each terminated with a semicolon, within the same script. BigQuery now supports running multiple statements in one query, processed in order from top to bottom.

Let’s look at a few quick examples with basic SQL.

Demo: Creating and reading from a table

A popular use case is creating multiple tables as part of a BI pipeline. When you're developing a SQL pipeline, you can test it all in a single script, as in the sketch below.
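Here's an illustrative sketch of that pattern; the destination table mydataset.top_names is a placeholder, and the source is a BigQuery public dataset:

-- Statement 1: build a summary table.
CREATE OR REPLACE TABLE mydataset.top_names AS
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_current`
GROUP BY name;

-- Statement 2: read from the table created by the previous statement.
SELECT name, total
FROM mydataset.top_names
ORDER BY total DESC
LIMIT 10;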

[Animated demo of a multi-statement query: https://storage.googleapis.com/gweb-cloudblog-publish/original_images/bq-sept-multi-statement.gif]

Demo: Setting up a simple SQL health check 

We can extend this example by simulating an outage in an upstream data table (as if all records were accidentally removed). One way to guard against reporting on a faulty upstream data source is to add health checks using the BigQuery ERROR() function, as in the sketch below.
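Here's a hedged sketch of what such a check could look like; the table names (mydataset.upstream_orders, mydataset.daily_orders_report) are placeholders:

-- Abort the script if the upstream table is unexpectedly empty.
-- ERROR() raises an error, so the statements below it never run.
IF NOT EXISTS (SELECT 1 FROM mydataset.upstream_orders) THEN
  SELECT ERROR('Health check failed: mydataset.upstream_orders has no rows');
END IF;

-- This only runs if the health check above passed.
CREATE OR REPLACE TABLE mydataset.daily_orders_report AS
SELECT order_date, COUNT(*) AS order_count
FROM mydataset.upstream_orders
GROUP BY order_date;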

[Animated demo of the SQL health check: https://storage.googleapis.com/gweb-cloudblog-publish/original_images/bq-sept-2.gif]

As the demo above shows, you can now set up multiple statements in one query, and they execute in series. The health check example inspects upstream data against known business rules and prevents faulty data from corrupting the data warehouse.

Make the Teradata to BigQuery transition easier

We recently announced the general availability of the Teradata to BigQuery migration service. The BigQuery Data Transfer Service, combined with a special migration agent, lets you copy both schema and data from an on-premises Teradata data warehouse to BigQuery. The data transfer itself is free of charge; check out the pricing page for more details.

With this launch, on-demand, one-time transfers are now generally available. Incremental, recurring transfers are in beta. To learn more, check out the Teradata migration documentation. And take a look at our recent post announcing the upcoming Teradata/Google Cloud partnership.

Transfer modes and options

Because every migration has unique requirements, you can customize the migration agent in a few different ways. There are three major choices to make when setting up a data transfer from Teradata to BigQuery.

Example migration architecture diagram: https://storage.googleapis.com/gweb-cloudblog-publish/images/Example_migration_architecture_diagram.max-2000x2000.png

For step-by-step setup instructions, see the Teradata to BigQuery documentation.

BigQuery’s BI Engine now supports higher reservation sizes

The maximum size of a BI Engine reservation has increased from 10 GB to 50 GB per project per location, giving you a lot more in-memory capacity in BI Engine for faster reporting and dashboards. This limit does not restrict the size of the tables you can query: BI Engine loads into memory only the columns used in your queries, not the entire table, and the data it stores is compressed.

The data model size remains limited to 10 GB per table. If you have a 50 GB reservation per project per location, BI Engine will limit the reservation per table to 10 GB. The rest of the available reservation will be used for other tables in the project.

In case you missed it

For more on all things BigQuery, check out our recent posts, videos, and how-tos.

To keep up with what's new in BigQuery, subscribe to our release notes and stay tuned to the blog for news and announcements. And let us know how else we can help.
