Transform with Google Cloud

The Prompt: Recapping Google Cloud Next and exploring AI's open future

September 11, 2023
Philip Moyer

Global VP, AI & Business Solutions at Google Cloud


Business leaders are buzzing about generative AI. To help you keep up with this fast-moving, transformative topic, “The Prompt” brings you our ongoing observations from our work with customers and partners, as well as the newest AI happenings at Google. In this edition, Philip Moyer, global vice president for AI & Business Solutions at Google Cloud, shares his experiences at Google Cloud Next, notably why the future of AI ecosystems should be open.

This summer, for the first time in four years, we hosted Google Cloud Next in person, and one of the most gratifying things was the sheer breadth of people in attendance, from customers and technologists to analysts, reporters, and partners. There were too many conversations to recount, and scores of booths hosting different system integrators, software providers, consultants, and industry experts. Across it all, we talked to thousands of people about how they’re exploring generative AI in ways big and small.

To be at its best, generative AI needs to be open and accessible.

Philip Moyer, Global VP, AI & Business Solutions, Google Cloud

The experience drove home a discussion we’ve been having all year, one that’s crucial for enterprises charting their generative AI journeys: To be at its best, this technology needs to be open and accessible.

Many models for many use cases

The most powerful proprietary foundation models continue to define the state of the art. But though such models are useful for many things, and the only way to achieve certain goals, their size and cost aren’t appropriate for every use case. Moreover, the details of their underlying methodologies, training data, and architecture strategies typically aren’t disclosed.

Consequently, powerful open-source foundation models such as Llama 2 from Meta and Technology Innovation Institute's Falcon are becoming very popular. They can offer organizations additional flexibility to balance cost against requirements, transparency that can be important in regulated industries, and clearer pathways to innovation, such as creating useful model variants efficient enough to run on local devices rather than in the cloud. 
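As an illustration of that flexibility, here is a minimal sketch of loading and prompting an open-weight Llama 2 variant with the open-source Hugging Face Transformers library. The checkpoint name, hardware mapping, and generation settings are illustrative assumptions, not part of any announcement above.

```python
# A minimal sketch, assuming the Hugging Face "transformers" library (plus
# "accelerate" for device_map) and access to the Llama 2 weights. This is an
# illustration of running an open-weight model yourself, not an official
# Google Cloud example.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "meta-llama/Llama-2-7b-chat-hf"  # assumption: license access already granted

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, device_map="auto")

prompt = "Summarize the key benefits of open foundation models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```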

This trend is one reason we believe the future is defined by choice, with organizations potentially using many models instead of doubling down on a single option. 

You probably saw this philosophy in action if you attended Next, where we not only updated capabilities and tuning options for our first-party models like PaLM 2, but also expanded the range of open-source and third-party models available in Vertex AI, our machine learning development platform. For example, Vertex AI is currently the only cloud platform to offer adapter tuning and Reinforcement Learning from Human Feedback (RLHF) for Llama 2. We believe this combination of first-party, third-party, and open-source models is crucial to helping organizations embrace generative AI on their own terms.
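For context on what calling one of these managed first-party models looks like, here is a minimal sketch using the Vertex AI Python SDK as it stood around Next ’23 to prompt a PaLM 2 text model. The project ID, region, model version, and prompt are placeholder assumptions.

```python
# A minimal sketch, assuming the Vertex AI Python SDK ("google-cloud-aiplatform")
# and a PaLM 2 text model as available around Next '23. Project, location, and
# model version are placeholders; adjust them for your environment.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Draft a two-sentence product update for our customer newsletter.",
    temperature=0.2,
    max_output_tokens=128,
)
print(response.text)
```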

Openness applies not only to models, but also to data

Openness and choice around models are only part of the story, however. The same concepts apply to data.

Wherever your data is, to activate it with the power of generative AI, you'll want to be able to leverage it with the models of your choice. 

This has a few dimensions. For example, a healthcare company I’ve recently worked with has billions of medical images in one cloud but wants to use models from a different provider. This is exactly the kind of multi-cloud use case we’ve targeted with solutions like Cross-Cloud Interconnect, which facilitate interconnectivity between Google Cloud and other providers, as well as our new database and analytics announcements, which make it easier than ever to connect generative foundation models with proprietary data. 
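To make the data-to-model connection concrete, here is a minimal, hedged sketch that pulls records with the BigQuery client library and folds them into a prompt. The dataset, table, and column names are hypothetical, and the resulting prompt could be sent to whichever first-party, third-party, or open-source model you choose; it is not a depiction of any specific announced feature.

```python
# A minimal sketch of combining proprietary data with a generative model:
# query records with the BigQuery client library, then fold them into a prompt.
# The project, dataset, table, and column names below are hypothetical.
from google.cloud import bigquery

bq = bigquery.Client()
rows = bq.query(
    "SELECT finding FROM `your-project.radiology.report_findings` LIMIT 20"
).result()

context = "\n".join(row["finding"] for row in rows)
prompt = f"Given these imaging findings, list recurring quality issues:\n{context}"

# Send `prompt` to the model endpoint of your choice; here we just inspect it.
print(prompt)
```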

We’ve heard repeatedly from analysts and customers that this ability to combine data and tools across vendors provides crucial flexibility that most enterprises require. 


Another angle is how data and foundation models interact. For example, even if your organization were to fine-tune a model with your own data, the model wouldn’t have access to fresh information. Tuning can help a model better understand industry-specific language or produce content that matches a specific style or brand identity, but for day-to-day accuracy, a generative app needs to access the latest data, beyond what the model has been trained on. That’s why our Next announcements included capabilities like extensions, which can retrieve information and execute functions on behalf of users, and connectors, which can ingest data from enterprise software like Salesforce, Confluence, and JIRA.
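In the same spirit as those extensions and connectors, here is a minimal sketch of grounding a prompt in fresh data the model was never trained on. The fetch_open_tickets() helper and the records it returns are hypothetical stand-ins for a real connector to a system like Salesforce, Confluence, or JIRA.

```python
# A minimal sketch of grounding a model on fresh data, in the spirit of the
# extensions and connectors described above. fetch_open_tickets() is a
# hypothetical placeholder for a connector to a live enterprise system.
from datetime import date

def fetch_open_tickets() -> list[str]:
    """Placeholder for a connector that pulls live records from an enterprise system."""
    return [
        "TICKET-101: Checkout page times out for EU users",
        "TICKET-102: Invoice PDF missing VAT line for UK customers",
    ]

def build_grounded_prompt(question: str) -> str:
    """Combine the user's question with up-to-date records the model has never seen."""
    context = "\n".join(fetch_open_tickets())
    return (
        f"Today is {date.today()}. Using only the records below, {question}\n\n"
        f"Records:\n{context}"
    )

# The grounded prompt can then be sent to whichever model you have chosen.
print(build_grounded_prompt("summarize the highest-impact open issues."))
```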

The specifics of different technologies aside, the point is, enterprises need the flexibility to activate data, no matter where it is, with generative models. That requires an open approach to not only choosing models, but also data across multi-cloud ecosystems—and if there is one thing we heard over and over at Next, it’s that organizations need this flexibility to envision and execute their generative AI plans. 

Open ecosystems lead to open opportunities 

Ultimately, we believe, and have heard loud and clear from customers, partners, and analysts, that organizations need open AI ecosystems in which data interconnectivity and a choice of models and tools are core pillars. This year’s Next reaffirmed our dedication to these ideals, and we’re looking forward to expanding these capabilities in the coming weeks and months. In the meantime, if you missed this year’s conference, be sure to check out the keynote below, and don’t miss our interview with five of the startups that have become “AI unicorns” while leveraging Google Cloud.
[Watch the Google Cloud Next ’23 keynote]