Transform with Google Cloud

Three pillars of generative AI adoption in 2024, from Google Cloud's CTO

December 19, 2023
Will Grannis

VP and CTO, Google Cloud

Organizations should focus on sustainable costs, broad access, and trust and security to get gen AI right in the new year.


Got AI?

For organizations everywhere, the AI excitement of 2023 is quickly giving way to something more interesting and useful: AI-enabled business results that matter. The reasons include recent advances in AI capabilities across the computing stack that allow organizations in any industry, geography, or phase of growth to access potential normally reserved for the few. With so many people working with these new capabilities, expect a wave of innovation and results.

Families of models like Google's Gemini are the strongest expressions yet of generative AI's initial breakthrough, enabling people and devices to interact in natural human language. Computers guided by human prompting synthesize unimaginable amounts of data to digest information, make predictions, assist with tasks, or create novel content, from text-to-image generation to new computer code. Gemini takes things further than ever as the world's first natively multimodal model.

Before, you needed separate models to make sense of text, audio, code, images, mathematics, or video. Gemini can handle all of these at once, much like the way humans simultaneously read, speak, and observe the world around them as they collaborate.

After the "wow" moments in 2023, many enterprises face the question, "What does it mean for our business, and what does it cost?" As Google Cloud's chief technology officer, my work puts me in a fortunate position to understand where the technology is going (the convergence of AI assistants, platforms, and infrastructure), and how some of the world's preeminent organizations are already leveraging it. Broadly, I see three key pillars that will impact how companies understand, deploy, and use gen AI in 2024.

Economics and energy

The viability of gen AI in an enterprise often centers on key costs, in both financial and, increasingly, environmental terms. Disciplined execution satisfies both the financial realities of the business and the growing importance of adhering to regulation and good corporate citizenship.

Gen AI uses immense computation, raising cost and social challenges around energy use. Customers will require knowledge of how energy is managed in data centers, plus the flexibility to optimize production using the cleanest possible regions and zones. Energy awareness will likely change the practice of writing software, and carbon budgeting may become part of developer practice. Our customers want us to continue our significant sustainability efforts, and it's a safe bet that sustainable gen AI will rise in demand and importance in 2024.

The large language models, or LLMs, that power gen AI require efficient training, fine-tuning, inference, and life-cycle management. Cost curves demand focused and principled execution, particularly as projects scale up. That's one reason why we've built an optimized AI infrastructure to power Vertex AI, our flagship AI platform.

Google incorporated AI into Search in 2015. Experiencing this AI scale-out challenge first-hand, and knowing that historically 50% or more of software costs go to maintenance, including refinement, made efficiency an early priority for us. So we developed Tensor Processing Units (TPUs), specialized chips that handle AI workloads, including gen AI, at sharply lower cost and with better energy efficiency. Being a great steward of scarce customer investment dollars and a finite global energy supply is a non-negotiable priority for every modern organization.

Ubiquity and access

For many, the first experience with gen AI will be in products like a tool for transforming old databases into new and more powerful products, an assistant to help manage your working life, or a bot offering high-quality answers to medical questions. These all rest on a new computing paradigm that uses more data, from more sources, in more flexible ways. The information in hospital billing, for example, might be aggregated to spot national health trends or repurposed to track how long it takes to deliver services in different locations, spotting nursing shortages.

This kind of thing will be possible using the right foundation models and tools, even in organizations with limited staff and resources. As it becomes ambient and ubiquitous, gen AI won’t mean a model, it will mean a helpful, possibly magical, experience.

There is also the issue of making sure gen AI is accessible and useful to everyone in the marketplace, not just a few giants. Tools and platforms need to let anyone get started with AI, efficiently and responsibly, and those tools should be easy to find and use. Some lines will blur, including moving more seamlessly from web-based experimentation environments to platform-enabled environments with robust security and assurances.


Additionally, gen AI will have the effect of turning much software from a generic product into one personalized to each corporate need and culture, even adapting to individual workers and customers. Grounding and tuning LLMs with proprietary corporate data allows the context and knowledge resident in a company to sharpen the performance of a model. The introduction of "parameter-efficient fine-tuning" techniques will make this tailoring much more realistic for a wider range of organizations.
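The article doesn't go into mechanics, but to make "parameter-efficient fine-tuning" concrete: one popular family of techniques, low-rank adaptation (LoRA), freezes the pretrained weights and trains only a small low-rank update on top of them. A minimal sketch, with illustrative dimensions chosen here rather than taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 512, 512, 8

# Frozen pretrained weight matrix (never updated during tuning).
W = rng.standard_normal((d_out, d_in))

# Low-rank adapter: only A and B would be trained on company data.
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))  # zero init, so the adapter starts as a no-op
scale = 1.0

def forward(x):
    # Base projection plus the low-rank correction B @ (A @ x).
    return W @ x + scale * (B @ (A @ x))

full_params = W.size
adapter_params = A.size + B.size
print(f"trainable fraction: {adapter_params / full_params:.3%}")
```

At this size the adapter holds only about 3% of the parameters of the full matrix, which is what makes tailoring a large model to proprietary data affordable for a wider range of organizations.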

We'll see rapid advances in distillation, ensembles, and federation (all emerging ways to better sharpen model outputs), as well as new creator tools that will open development to a wider set of workers. Organizations in highly regulated industries, like finance and healthcare, are likely to take a more restrained approach than businesses like gaming and media.
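Of those three, distillation is the simplest to illustrate: a small "student" model is trained to match the softened output distribution of a large "teacher," so the student inherits much of the teacher's behavior at a fraction of the serving cost. A toy sketch of the standard temperature-scaled distillation loss (the logits and temperature below are made-up illustrative values, not from the article):

```python
import numpy as np

def softmax(z, T=1.0):
    # Numerically stable softmax with temperature T.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distill_loss(teacher_logits, student_logits, T=2.0):
    # KL divergence between temperature-softened teacher and
    # student distributions, scaled by T^2 as is conventional.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

teacher = [4.0, 1.0, 0.2]   # large model's raw output scores
student = [3.5, 1.2, 0.3]   # small model's raw output scores
print(distill_loss(teacher, student))
```

Minimizing this loss over training data pushes the student's distribution toward the teacher's; the loss is zero exactly when the two agree.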

Trust and security

Underpinning all of the gen AI disruption will be the fundamental human and organizational need for trust in responsible providers. The healthcare example above is an exciting idea, but it reinforces the need for pervasive data encryption, AI-enhanced security for accessing data across several locations at once, including different clouds and on-premises systems, and effective cost monitoring.

Our latest Gemini-based advances in productivity, threat detection, and response take gen AI to the forefront of enterprise security. If anything, this underlines the reality that threats are not going away and will probably acquire their own AI-powered capabilities. Security needs its own gen AI tools as well, capable of spotting and explaining threats in a whole new way. Our domain-specific language model, Sec-PaLM 2, is trained on a broad range of security use cases and is capable of instantly recognizing potentially malicious scripts and alerting teams to active threats.

Just as an individual decides whether to trust what they see, hear, and read based on a comparison to what they've experienced before, organizations will start to index what they know. They can then make knowledge and data more accessible and useful in creating experiences, efficiencies, and differentiation that act as a trusted extension of their hard-won credibility.

Getting started in 2024

We are now at an extraordinary new level of human-computer interaction, one that is getting more capable even as it gets easier to use, both for individual developers and for corporations. Far from ending jobs, gen AI will, we believe, place new demands on human creativity, collaboration, and invention commensurate with the kind of challenges the world faces today.

Over the coming year and beyond, we'll see gen AI become more useful, with greater transparency around how things work, what they cost, and how best to deploy them to create breakthrough experiences. In this way, hype will give way to genuine value and delight.

Plenty of businesses have already started, and others are looking to engage with AI. There are many ways to learn, from video overviews and industry basics to training tutorials, classes, and certifications. Engagement can be as simple as trying an out-of-the-box solution for collaboration or improving the performance of a call center. Once you and your organization see how easy it is to get started, I'm confident your creativity will unlock even more use cases and experiences that advance us all.

Opening image created with Midjourney, running on Google Cloud, using the prompt "three pillars of gen AI adoption in a business-y watercolor style."
