Edge TPU

Google’s purpose-built ASIC designed to run inference at the edge.

AI at the edge

AI is pervasive today, from consumer to enterprise applications. With the explosive growth of connected devices, combined with demands for privacy and confidentiality, low latency, and bandwidth constraints, AI models trained in the cloud increasingly need to be run at the edge. Edge TPU is Google's purpose-built ASIC designed to run AI at the edge. It delivers high performance in a small physical and power footprint, enabling the deployment of high-accuracy AI at the edge.

Learn more about the Edge TPU products from Coral

End-to-end AI infrastructure

Edge TPU complements Cloud TPU and Google Cloud services to provide an end-to-end, cloud-to-edge, hardware + software infrastructure for facilitating the deployment of customers' AI-based solutions.

High performance in a small physical and power footprint

Thanks to its performance, small footprint, and low power, Edge TPU enables the broad deployment of high-quality AI at the edge.

Co-design of AI hardware, software and algorithms

Edge TPU isn't just a hardware solution: it combines custom hardware, open software, and state-of-the-art AI algorithms to provide high-quality, easy-to-deploy AI solutions for the edge.

A broad range of applications

Edge TPU can be used for a growing number of industrial use cases, such as predictive maintenance, anomaly detection, machine vision, robotics, and voice recognition. It applies across sectors including manufacturing, on-premises environments, healthcare, retail, smart spaces, and transportation.

An open, end-to-end infrastructure for deploying AI solutions

Edge TPU allows you to deploy high-quality ML inferencing at the edge, using various prototyping and production products from Coral.

The Coral platform for ML at the edge augments Google's Cloud TPU and Cloud IoT to provide an end-to-end (cloud-to-edge, hardware + software) infrastructure to facilitate the deployment of customers' AI-based solutions. In addition to its open-source TensorFlow Lite programming environment, the Coral platform provides a complete developer toolkit so you can compile your own models or retrain several Google AI models for the Edge TPU, combining Google's expertise in both AI and hardware.
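Models compiled for the Edge TPU must be fully integer-quantized TensorFlow Lite models. As a rough illustration of what that quantization means (this is the standard affine mapping TensorFlow Lite defines, sketched here in plain Python rather than via Coral's actual toolchain; the `scale` and `zero_point` values below are made-up examples):

```python
def quantize(x, scale, zero_point):
    """Map a real value to an int8 code: q = round(x / scale) + zero_point."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to the int8 range

def dequantize(q, scale, zero_point):
    """Recover an approximate real value: x ~ scale * (q - zero_point)."""
    return scale * (q - zero_point)

# Illustrative parameters, not taken from any real model.
scale, zero_point = 0.05, 0
q = quantize(1.0, scale, zero_point)
x = dequantize(q, scale, zero_point)
```

In practice the Coral developer toolkit and the Edge TPU compiler handle this conversion for you; the point is simply that the accelerator operates on 8-bit integer arithmetic, which is what enables its small power footprint.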

Edge TPU complements CPUs, GPUs, FPGAs, and other ASIC solutions for running AI at the edge.

Infrastructure that works with you

By connecting edge tools with Google Cloud, you can deploy solutions that bridge the functionality of the cloud with the availability of edge computing.

Edge TPU features

This ASIC is the first step in a roadmap that leverages Google's AI expertise to track the rapid evolution of AI in hardware.

See model benchmarks

Take the next step

Build with Edge TPU using the development board, which includes an Edge TPU SoM and a carrier board.

Need help getting started?
Work with a trusted partner
Continue browsing

Get tips & best practices