
Beyond the chatbot: Building internal AI systems that power customer wins

September 18, 2025
Amy Liu

Head of AI Solutions, Value Creation

Organizations find lasting competitive advantage by investing in two distinct areas of AI: highly visible External AI for customer engagement and foundational Internal AI to build employee fluency and drive everyday productivity.

Ask any CEO about their AI strategy, and you’ll hear about chatbots, recommendation engines, and automated customer service. What you won’t hear about are the invisible AI systems that make those flashy projects actually work.

At Google, we have a unique vantage point on how organizations are navigating AI adoption. Across countless conversations with customers, a clear pattern has emerged. The most successful businesses are making two distinct types of AI investments: External AI and Internal AI.

While the flashy, headline-grabbing initiatives often get the spotlight and funding, we're seeing that long-term success hinges on mastering the foundational layer of internal operations first.

The visible layer: External-facing AI

External AI focuses on enhancing customer experience and engagement, directly impacting the top line. Examples include:

  • Customer service optimization: Verizon partnered with Google Cloud to integrate Gemini-powered AI into its customer service operations, achieving a 40% increase in sales and a 95% answer rate for customer inquiries. Since deploying in July 2024 and scaling the following January, the solution has improved both customer interactions and resolution times.

  • Embedding AI into products: Adobe integrated Gemini 2.5 Flash Image into Firefly and Adobe Express, enabling social creators to generate graphics with consistent styling, helping marketers adapt visuals across formats, and allowing designers to rapidly prototype concepts.

  • New business models: For a media company, this could mean using generative AI to create novel short-form video content, opening up entirely new monetization opportunities that were previously too costly or time-consuming to explore.

These projects are typically championed and funded by core product teams and brought to life by official development and engineering resources. They are the visible, exciting applications that change how a company goes to market.

The foundation layer: Organizational AI fluency

If External AI is the visible structure, Internal AI is about building organizational AI fluency and empowering every employee across all functions. This category focuses on how people use AI in their daily work to drive productivity, creativity, and work satisfaction – making AI a valuable, accessible tool, rather than something to fear. A year ago, this was the primary focus for many enterprises. Today, we've seen executive attention and funding shift towards External AI initiatives that drive customer engagement.

This shift means that the responsibility for employee productivity tools often falls to central IT and individual business units. The technology is typically licensed on a per-user basis, and companies are navigating a crowded marketplace to find the right mix of tools for their teams. Users in this space aren't looking for massive, ground-up development projects; they want to use off-the-shelf solutions, saving their investment dollars for customization only where it’s truly essential.

Building this foundation is more complex than it appears. It's not just about buying licenses; it's about creating an ecosystem where AI can thrive. Success requires three foundational pillars.

The three pillars of building AI fluency

1. A culture of creativity and innovation: The biggest barrier to efficiency is often fear or stigma. When faced with change, people naturally assume the worst unless they are given clear information. In the case of AI, leaders must actively champion exploration. Employees shouldn't feel the need to hide their use of Gemini to draft an email or summarize a report. This behavior should be encouraged, discussed, and rewarded. The nature of knowledge work is changing, and every employee should feel empowered to discover how AI can evolve their role for the better. 

2. The right tools for experimentation: There's a growing disconnect between the AI tools employees use in their personal lives and the limited toolset available at work. This is especially acute in regulated industries like financial services and healthcare, where the default answer is often "no."

This needs to change. The role of central IT should be to thoroughly vet tools for security, privacy, and governance, and then provide wide access. The technology is evolving so rapidly that no central committee can possibly predict all the valuable use cases. The best way to find them is to let your employees — the people who know their own jobs best — experiment. Let users build for users, and you'll quickly see what sticks.

3. A system for opportunity identification and improvement: Once you've empowered your teams with the culture and tools to explore, you'll start to see innovation emerge from the ground up. You might see dozens or hundreds of "mini-applications" — a clever prompt chain a sales team uses to prep for calls, a simple workflow a legal team uses to review contracts, or a data analysis query an HR team builds to understand sentiment.

Some of these will have more traction than others. The job of IT and business leaders is to identify these grassroots successes and give them a chance to scale up. These innovations are invaluable. They provide direct, user-validated feedback that should inform the design of any larger, more formal platform changes. This ensures that enterprise-level development stays connected to real-world needs, meeting the high expectations of today's employees right where they are.
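
To make the "prompt chain" example concrete: the kind of chain a sales team might use to prep for calls can be as small as the sketch below, written in Python against the google-genai SDK. The model choice, the prompts, and the prep_for_call helper are illustrative assumptions for this post, not a tool from any particular team.

```python
# A minimal sketch of a two-step "prompt chain" for sales-call prep.
# Assumes the google-genai Python SDK (`pip install google-genai`) with an
# API key available in the environment; the prompts, model choice, and
# prep_for_call() helper are illustrative, not an actual internal tool.
from google import genai

client = genai.Client()  # picks up the API key from the environment
MODEL = "gemini-2.5-pro"

def prep_for_call(company: str) -> str:
    # Step 1: gather background on the account.
    research = client.models.generate_content(
        model=MODEL,
        contents=(
            f"Summarize {company}'s business model, recent news, and likely "
            "priorities for the next two quarters."
        ),
    )
    # Step 2: turn that research into a short, skimmable call-prep brief.
    brief = client.models.generate_content(
        model=MODEL,
        contents=(
            "Using the research below, draft five discovery questions and "
            "three talking points for an introductory sales call.\n\n"
            + research.text
        ),
    )
    return brief.text

if __name__ == "__main__":
    print(prep_for_call("Example Corp"))
```

The point isn't the code itself; it's that a two-step chain like this is small enough for a single team to build, test, and share, which is exactly how grassroots innovations surface.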

How we’re building Internal AI fluency at Google

We practice what we preach. Inside Google, our AI adoption follows the same patterns we see with our customers. While we have massive teams dedicated to External AI — building the next generation of AI into Search, Cloud, and our other products — we are just as focused on Internal AI for our own teams.

We foster a culture where using AI tools for daily tasks is the norm. Googlers are encouraged to use our own tools, like Gemini, to do everything from debugging code and writing documentation to summarizing long email threads and drafting project plans.

Here are some examples of foundational tools we’ve built using Google Workspace, Gemini 2.5 Pro & Canvas, Google Sites, and Apps Script – created by non-developers who primarily work in Slides and Sheets.

  • AI use case generator: Creates customer-ready slides and detailed accompanying documentation. It not only identifies top AI use cases but also categorizes them as immediate or strategic based on complexity. Built by just two people over nights and weekends, it went from ideation to launch in four weeks and has seen over 5,000 generations in its first two months.

  • Prompt gallery: While generative AI is powerful, writing quality prompts can be time-consuming. Our team created a library of our most frequently used prompts, covering Company Research, Ideation, Executive Positioning, and Account Planning, along with experiments that test the limits on financial modeling tasks. Our peers often want to understand what’s under the hood, so the gallery lets them use and edit the prompts themselves.

  • AI cost calculator: As AI tooling proliferates, estimating the development and deployment cost of AI applications gets more complex. The team used to manage all of this in spreadsheets. We’ve since turned some of our most complicated calculators into a web app using Gemini 2.5 Pro in Canvas. The workflow is much more intuitive, and the interface is significantly better than looking at a spreadsheet. (A simplified sketch of the underlying arithmetic follows this list.)
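
To show the kind of arithmetic a calculator like this wraps in a friendlier interface, here is a minimal sketch. The model name and per-token rates are placeholders, not actual pricing; a real tool should pull current rates from the official price list.

```python
# A minimal sketch of per-request and monthly cost estimation for a
# generative AI application. The model name and prices below are
# placeholders, not real rates; substitute current published pricing.
PRICE_PER_1M_TOKENS = {
    # model: (input price, output price), in USD per million tokens
    "example-model": (1.25, 5.00),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of a single generation request."""
    in_price, out_price = PRICE_PER_1M_TOKENS[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

def monthly_cost(model: str, requests_per_day: int,
                 avg_input_tokens: int, avg_output_tokens: int) -> float:
    """Scale the single-request estimate to a 30-day month."""
    return 30 * requests_per_day * request_cost(
        model, avg_input_tokens, avg_output_tokens)

if __name__ == "__main__":
    # Example: 1,000 requests/day, ~2,000 input and ~500 output tokens each.
    print(f"${monthly_cost('example-model', 1_000, 2_000, 500):,.2f} per month")
```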

This broad, internal access acts as a massive, real-world laboratory. We see how specific teams are crafting prompts to solve unique problems. For example, a marketing team might develop a specific prompt for Gemini that generates product descriptions that are perfectly on-brand and optimized for a specific region. When we see that usage pattern take off internally, it becomes a candidate for being productized — either as an internal tool or as a feature in a future Google Cloud product. This constant feedback loop between our internal users and our product teams ensures we aren't just building in a vacuum; we're solving real problems because we're solving our own problems first.
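
As an illustration of what such a reusable prompt might look like, here is a hypothetical template; the brand, parameters, and wording are invented for this example, not a prompt from any Google team.

```python
# A hypothetical, reusable product-description prompt of the kind a
# marketing team might iterate on and share. The template text, parameters,
# and brand rules are illustrative only.
PRODUCT_DESCRIPTION_PROMPT = """\
You are writing product copy for {brand}.
Voice: {voice}. Audience: shoppers in {region}.
Write a {word_count}-word product description for "{product}".
Highlight: {key_features}.
Follow local spelling and units for {region}, and avoid superlatives
that the brand style guide prohibits.
"""

def build_prompt(product: str, region: str, key_features: str,
                 brand: str = "Example Brand",
                 voice: str = "warm, concise, practical",
                 word_count: int = 80) -> str:
    """Fill the shared template so any teammate can reuse it consistently."""
    return PRODUCT_DESCRIPTION_PROMPT.format(
        brand=brand, voice=voice, region=region, product=product,
        key_features=key_features, word_count=word_count,
    )

print(build_prompt("Trailrunner 2 hiking boot", "Germany",
                   "waterproof membrane, recycled sole, 1-year warranty"))
```

Writing the prompt down as a parameterized template is what makes it shareable: IT can vet it, other regions can reuse it, and a product team can decide whether it deserves to become a feature.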

Success in AI isn’t about one breakthrough project. It’s about organizations that build strong foundations first, then use that infrastructure to power customer-facing innovation. The companies that master both layers – Internal and External AI – will be the ones that turn AI investments into lasting competitive advantage.
