
New this month: Data lakes, speed at scale, and SAP data

May 7, 2021
Sudhir Hasbe

Sr. Director, Product Management

As you’ve probably noticed by now, our team is all about our customers. Earlier in the year, the New York Times shared how their data analytics team went from staying up until three in the morning trying to keep their legacy system running to relaxing while eating ice cream after their migration over to Google Cloud. We also explored the details of why Verizon Media picked BigQuery for scale, performance and cost. And who could forget the awesome story of how the Golden State Warriors transform on-court data into competitive advantage, pulling raw data from AWS into Google Cloud for fast analytics.

Leaders in industry show us the way!

In April, we highlighted the best practices two incredible organizations are using to turn data into value at an incredible pace. Carrefour, a leading global retailer with over 12,000 stores in over 30 countries, published an outstanding set of best practices on their corporate blog describing how Google Cloud fueled the company’s digital transformation. 

Yann Barraud, the company’s Head of Data Platforms, revealed how they managed to migrate a 700TB data lake to Google Cloud in just a few months without any service interruption—and it’s already scaling again with more than 2TB of new data each day. In addition, the total cost of ownership is lower than before despite serving more than 80 applications and executing 100 million API calls per month.

You might also enjoy hearing how the team at Broadcom modernized their data lake with Dataproc, Cloud SQL and Bigtable, migrating around 80 applications with a data pipeline that receives telemetry data from millions of devices around the world. The move increased the company’s enterprise agility and translated to a reduction of 25% in monthly support calls.

Watch a quick interview below with the team that made it happen:


Broadcom rethinks their cybersecurity data lake with Google Cloud

If you like hearing data analytics success stories, you should check out how online food delivery network Delivery Hero turned to Google BigQuery to improve data accessibility and sharing across 174 datasets and 2.7 petabytes of data.

And you’ll love reading this Forbes piece about how Zulily established a set of data-driven basics to guide them to business success. These principles help remind data science and engineering teams that ultimately technology is meant to serve customer needs. If it’s failing to do that—it’s time to question why you’ve got it. 

One of our final favorite stories from this past month kicked off with the opening day of Major League Baseball’s 2021 season. MLB’s data cloud does more than provide insights that increase viewership and sell jerseys—it’s about bringing fans a richer appreciation for the game with applications like their baseball metrics platform Statcast, which is built on Google Cloud.  

Statcast uses cameras to collect data on everything from pitch speed to ball trajectories to player poses. This data then gets fed into the Statcast data pipeline in real time and turned into on-screen analytics that announcers use as part of their in-game commentary. 

Want to get a taste for what that looks like? Check out the video below:


Funny baseball moments of 2020 (Statcast style!)

And those are just a few of the many incredible journeys we witness every month. 

Join us on May 26th, 2021 for the Data Cloud Summit to hear more about how leading companies like Equifax, PayPal, Rackspace, Keybank, Deutsche Bank, and many more are using Google Cloud to transform their organizations. You’ll also hear the latest updates (and a few surprises) from our data management, data analytics, and business intelligence product teams about where we’re headed in the future. Be sure to save your seat for free now! 

The need for speed, intelligence, and engagement

In case you missed it, we also had a great webinar with FaceIT last month. As the world’s biggest independent competitive gaming platform, FaceIT has more than 18 million users who compete in over 20 million game sessions each month. During the webinar, Director of Data & Analytics Maria Laura Scuri talked with us about how her team leveraged BigQuery BI Engine to create better gaming experiences.

Here are the main takeaways from our conversation, along with some of the latest innovations from Google Cloud and Looker that customers are using to build better data experiences:

  • Speed is key for succeeding with data. High throughput is critical when it comes to streaming data in real time. We introduced a new streaming API for ingesting data into BigQuery. The BigQuery Storage Write API not only includes stream-level transactions and automatic schema update detection, but also comes with a very cost-effective pricing model of $0.025 per GB, with the first 2 TB per month free.
  • Engagement drives rich customer experiences. According to the Mobile Gaming Analysis in 2019, most mobile games only see a 25% retention rate for users after the first day. Machine learning is a game changer for understanding the likelihood of specific users returning to applications or websites. This developer tutorial takes you through how to run propensity models for churn prediction using BigQuery ML, Firebase, and Google Analytics (see the sketch just after this list for what that looks like in code).
  • Intelligent data services deliver new avenues for enriching data experiences. Enabling business users to easily transform data based on their needs not only reduces load on IT teams but also puts powerful insights right where they need to be to deliver the most value. Our newest solution uses Google Cloud Dataprep to help teams enrich survey data, find new insights, and visualize results with Looker, Data Studio, or another BI tool. BigQuery Pushdown for Trifacta data prep flows allows teams to execute transforms natively inside BigQuery, yielding up to 20X faster job executions and significant cost savings. 
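To make the propensity-modeling bullet concrete, here is a minimal sketch of training and scoring a churn classifier with BigQuery ML from Python. The project, dataset, table, and column names are placeholders rather than names from the actual tutorial, and the features are illustrative only.

```python
# Minimal sketch: train and score a churn propensity model with BigQuery ML.
# Assumes a table `your-project.analytics.user_features` with a user_id column,
# behavioral feature columns, and a 0/1 `churned` label; adjust to your schema.
from google.cloud import bigquery

client = bigquery.Client(project="your-project")

train_sql = """
CREATE OR REPLACE MODEL `your-project.analytics.churn_model`
OPTIONS (
  model_type = 'LOGISTIC_REG',
  input_label_cols = ['churned']
) AS
SELECT
  days_active_first_week,
  sessions_last_7d,
  avg_session_minutes,
  churned
FROM `your-project.analytics.user_features`
"""
client.query(train_sql).result()  # blocks until training completes

# Score users: the probability attached to label 1 is the churn likelihood.
predict_sql = """
SELECT
  user_id,
  prob.prob AS churn_probability
FROM ML.PREDICT(
  MODEL `your-project.analytics.churn_model`,
  TABLE `your-project.analytics.user_features`
), UNNEST(predicted_churned_probs) AS prob
WHERE prob.label = 1
ORDER BY churn_probability DESC
LIMIT 10
"""
for row in client.query(predict_sql).result():
    print(row.user_id, round(row.churn_probability, 3))
```

The developer tutorial linked above goes further, covering feature preparation from Firebase and Google Analytics events as well as model types beyond logistic regression.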

Another exciting announcement from April was our new support for choropleth maps of BigQuery GEOGRAPHY polygons. Now, you can use Data Studio to visualize BigQuery GIS data in a Google Maps-based interface. 

You can play with it today for free using our BigQuery free trial and any of our public datasets. This quick tutorial will show you how to visualize the affordability of rental properties in Washington state on a map. Give it a spin and let us know what you think!
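If you'd rather start from raw SQL, here is a minimal sketch of the kind of query that returns GEOGRAPHY polygons Data Studio can render; it is not the tutorial's exact query. It reads the public `bigquery-public-data.geo_us_boundaries.counties` table, and the column names should be checked against that dataset's schema.

```python
# Minimal sketch: pull GEOGRAPHY polygons from a BigQuery public dataset so a
# Data Studio Google Maps chart can render them as a choropleth layer.
# Column names are taken from the public geo_us_boundaries tables; verify the
# schema before relying on them.
from google.cloud import bigquery

client = bigquery.Client()  # defaults to your current project (or sandbox)

sql = """
SELECT
  county_name,
  county_geom                    -- GEOGRAPHY polygon for the map layer
FROM `bigquery-public-data.geo_us_boundaries.counties`
WHERE state_fips_code = '53'     -- Washington state
"""
for row in client.query(sql).result():
    # GEOGRAPHY values arrive client-side as WKT strings
    print(row.county_name, str(row.county_geom)[:60], "...")
```

Saving a query like this as a view and pointing a Data Studio Google Maps chart at it, with the geography column as the geospatial field and a rent metric as the color dimension, is roughly the flow the tutorial walks through.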

More for your SAP data

We know that many of you want to do more with SAP data. That’s why we created the SAP Table Batch Source for Cloud Data Fusion, our fully managed, cloud-native data integration service. This new capability allows you to seamlessly integrate data from SAP Business Suite, SAP ERP and S/4HANA with the Google data platform, including BigQuery, Cloud SQL, and Spanner. With the SAP Table Batch Source, you can leverage best-in-class machine learning capabilities and combine SAP data with other datasets. 

Examples include running machine learning on IoT data joined with ERP transactional data for predictive maintenance, application-to-application integration between SAP and Cloud SQL-based applications, fraud detection, spend analytics, demand forecasting, and more.
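To illustrate that predictive-maintenance pattern, here is a minimal sketch that joins SAP equipment data (assumed to have already been landed in BigQuery by an SAP Table Batch Source pipeline) with IoT telemetry. All project, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: join SAP ERP equipment records (landed in BigQuery via a
# Cloud Data Fusion SAP Table Batch Source pipeline) with streaming IoT
# telemetry to build simple predictive-maintenance features.
# Every table and column name below is a placeholder.
from google.cloud import bigquery

client = bigquery.Client(project="your-project")

sql = """
SELECT
  equ.equipment_id,
  equ.plant,
  equ.last_service_date,
  AVG(iot.vibration_rms) AS avg_vibration_7d,
  MAX(iot.temperature_c) AS max_temp_7d
FROM `your-project.sap.equipment_master` AS equ        -- SAP Table Batch Source output
JOIN `your-project.telemetry.device_readings` AS iot   -- IoT telemetry stream
  ON iot.equipment_id = equ.equipment_id
WHERE iot.reading_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY equ.equipment_id, equ.plant, equ.last_service_date
"""
for row in client.query(sql).result():
    print(row.equipment_id, row.avg_vibration_7d, row.max_temp_7d)
```

A feature table like this could feed a BigQuery ML model or an external training job, which is where the machine learning capabilities mentioned above come into play.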

For more details about the benefits of the SAP Table Batch Source in Cloud Data Fusion, I highly recommend reading the introductory blog post.

At Google Cloud, we’re always striving to enable you to do more with data, regardless of where the data is stored and how you’d like to visualize it. And expect more to come in the future—our work is far from done. 

If you want to hear more about what’s coming next, don’t forget to join us on May 26th, 2021 for the Data Cloud Summit to hear from leading companies about how Google Cloud is helping transform their organizations. I hope to see you there!
