
Betting on efficient AI: The 4Ms

September 22, 2025
Denise Pearl

Global GTM Practice Lead, Sustainability, Google Cloud

Building on the 4Ms framework, a Sustainable by Design AI infrastructure strategy resolves the apparent tension between innovation, profitability, and sustainability by making AI training and operation dramatically more cost- and energy-efficient.

In boardrooms everywhere, adopting AI is no longer a question of if, but how — and how fast. 

AI promises to help organizations secure a more competitive and profitable future, but the necessary foundational work — such as complex systems integration and meeting high standards for data security and governance — is already consuming organizational resources. As AI gains traction, CFOs are anticipating spikes in computational costs, CIOs are worried about already-stretched IT teams, and CSOs want to maintain focus on sustainability commitments. 

Innovation, profitability, sustainability, and efficiency appear to be competing priorities. However, the right AI infrastructure approach can deliver responsible resource usage, improved margins, and faster returns on investment, all while upholding sustainability commitments.

Understanding AI’s resource requirements

How do AI’s real computational demands compare to common assumptions? That uncertainty is one reason why measuring AI’s impact is a top investment priority for nearly a third of business leaders globally.

You may have seen studies claiming that a single AI model costs millions of dollars and produces emissions equivalent to five car lifetimes — and that’s just for training the model. Imagine the expense and environmental impact of using the same model for everyday business purposes. Any boardroom might hesitate.

According to Google Research calculations, organizations can reduce energy consumption by 100 to 1,000 times by following best practices such as moving from general-purpose processors in typical data centers to purpose-built AI infrastructure. Running AI on general-purpose hardware is like using a sledgehammer to crack a nut. Google Research showed that the Evolved Transformer model — the one supposedly costing millions — can be trained for just $40. And those five car lifetimes of emissions? How about 0.00004 car lifetimes instead? That's roughly 120,000 times less.

How is this possible? Through “Sustainable by Design” AI architecture that follows Google’s 4Ms framework of best practices:

  • Machine: using purpose-built AI processors.

  • Model: selecting efficient models fit for purpose.

  • Mechanization: automating data and analytics at scale with zero-ops services.

  • Map: hosting workloads in cleaner energy locations where possible (sketched below).
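
To make the last practice concrete, here is a minimal sketch of how a team might fold cleaner-energy hosting into region selection. It assumes a hypothetical lookup table of carbon-free energy shares (the numbers are placeholders, not Google’s published figures) and a constraint set coming from latency or data-residency requirements:

```python
# Illustrative sketch of the "Map" practice: choose a hosting region by its
# carbon-free energy (CFE) share. Region names follow Google Cloud naming, but
# the CFE values below are hypothetical placeholders, not published figures.

CANDIDATE_REGIONS = {
    "us-central1": 0.85,      # hypothetical CFE share
    "europe-west1": 0.70,     # hypothetical CFE share
    "asia-southeast1": 0.40,  # hypothetical CFE share
}

def pick_cleanest_region(regions: dict[str, float], allowed: set[str]) -> str:
    """Return the allowed region with the highest carbon-free energy share."""
    eligible = {name: cfe for name, cfe in regions.items() if name in allowed}
    return max(eligible, key=eligible.get)

if __name__ == "__main__":
    # Suppose latency and data-residency requirements allow only these regions.
    allowed = {"us-central1", "europe-west1"}
    print(pick_cleanest_region(CANDIDATE_REGIONS, allowed))  # -> us-central1
```

Google Cloud publishes carbon-free energy percentages for its regions, so a real implementation would read those figures rather than the hard-coded values shown here.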

The compound effect of these improvements doesn't just trim costs at the margins — it makes running AI applications at scale economically feasible. Workloads that could cost millions on traditional infrastructure may run for thousands, or even hundreds, of dollars on Sustainable by Design architecture.
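
One way to see why these gains compound rather than merely add is a quick back-of-the-envelope calculation. The per-practice factors in this sketch are hypothetical placeholders chosen for illustration; the point is only that independent multipliers combine multiplicatively into the 100-to-1,000x range cited above:

```python
# Back-of-the-envelope sketch: independent efficiency gains multiply.
# Each factor below is a hypothetical placeholder, not a measured value.

efficiency_gains = {
    "efficient model architecture": 10,
    "purpose-built AI processors": 5,
    "zero-ops mechanization at scale": 2,
    "cleaner-energy hosting region": 3,
}

combined = 1
for practice, factor in efficiency_gains.items():
    combined *= factor
    print(f"{practice:35s} x{factor:<3d} -> cumulative x{combined}")

# With these illustrative factors the four practices compound to a 300x
# reduction, squarely inside the 100-to-1,000x range described above.
```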

I’ll take “all of the above”

AI’s computational requirements become opportunities for operational improvement when approached strategically. In fact, adopting a cloud foundation that's Sustainable by Design embeds resource optimization directly into core operations for a triple victory:

Innovation accelerates. Companies with optimized AI infrastructure can run 100 experiments for the cost of one traditional training run, enabling rapid prototyping and testing of new applications.

Profitability improves. When training costs drop from millions to thousands, AI becomes viable for a broader range of business applications, expanding from operational tool to revenue driver.

Sustainability drives competitive advantage. Efficient AI operations provide resilience against energy price fluctuations, evolving regulations, and resource constraints, creating long-term strategic value.

The apparent tension between innovation, profitability, and resource efficiency dissolves with the right infrastructure approach. 

Strategic opportunities for resource optimization have always existed in technology adoption. Just as post-WWII constraints drove Toyota to develop lean manufacturing principles, today’s energy considerations are driving more efficient AI infrastructure design. Efficiency is on track to become standard practice in AI infrastructure, integrated into design rather than treated as a secondary consideration.

The question isn't whether to pursue AI innovation or sustainability. With Sustainable by Design infrastructure and the 4Ms framework, they can be the same thing.
