Enterprise-ready generative AI with Apigee

Use Apigee’s API management platform to govern and secure your AI applications and provide developers with a consistent way to build LLM APIs and access LLM models across multiple clouds. 

Secure and scale your AI applications with Apigee

Learn more about Apigee's AI-specific capabilities, designed for developing AI solutions, integrating them seamlessly into applications, and scaling them effectively.

Power agentic workflows grounded in enterprise context

AI agents use the capabilities of LLMs to accomplish tasks for end users. These agents can be built using a variety of tools, from no-code and low-code platforms like Agentspace to full-code frameworks like LangChain or LlamaIndex. Apigee acts as an intermediary between your AI application and its agents.
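
For illustration, a full-code agent typically exposes enterprise capabilities to the LLM as tools, and each tool can reach its backend through an Apigee-managed endpoint rather than calling the backend directly. The Python sketch below shows that pattern under stated assumptions: the proxy URL, the x-apikey header name, and the order-status API are hypothetical placeholders, and the Apigee proxy is assumed to verify the key and apply its policies before the request reaches the backend.

    import os
    import requests

    # Hypothetical Apigee-managed endpoint; the proxy is assumed to verify the API key
    # and apply rate-limiting, logging, and other policies before reaching the backend.
    APIGEE_PROXY_URL = "https://api.example.com/v1/orders"
    API_KEY = os.environ["APIGEE_API_KEY"]  # credential issued to this agent's app

    def get_order_status(order_id: str) -> dict:
        """Tool an agent can call to look up an order through the API gateway."""
        response = requests.get(
            f"{APIGEE_PROXY_URL}/{order_id}",
            headers={"x-apikey": API_KEY},  # header name depends on proxy configuration
            timeout=10,
        )
        response.raise_for_status()
        return response.json()

    # A framework such as LangChain or LlamaIndex would register get_order_status as a
    # tool so the LLM can invoke it while reasoning about a user's request.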

With Gemini Code Assist in Apigee, now generally available, you can rapidly create API specifications using natural language in both the Cloud Code and Gemini Chat interfaces. Gemini Code Assist draws on your organization's API ecosystem in Apigee API hub as Enterprise Context, producing consistent, secure APIs with nested object support and proactive detection of duplicate APIs.

Capabilities include:

  • Apigee API hub, to catalog first-party and third-party APIs and workflows
  • Token limit enforcement for cost controls
  • Multi-agent orchestration
  • Authentication and authorization
  • Semantic caching, for optimized performance (see the sketch after this list)
  • API specification generation with Enterprise Context (using Gemini Code Assist)
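
To illustrate the semantic caching idea referenced in the list above: rather than keying a cache on the exact prompt string, the gateway compares an embedding of the incoming prompt with embeddings of previously answered prompts and serves the stored response when similarity clears a threshold. The Python sketch below is a simplified, client-side illustration of that pattern, not Apigee's implementation; the embed and call_llm callables and the 0.95 threshold are assumptions.

    import math

    def cosine(a, b):
        """Cosine similarity between two embedding vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    SIMILARITY_THRESHOLD = 0.95  # tune for your workload
    _cache = []                  # list of (prompt_embedding, cached_response) pairs

    def answer(prompt, call_llm, embed):
        """Return a cached response for a semantically similar prompt, else call the LLM."""
        query = embed(prompt)
        for cached_embedding, cached_response in _cache:
            if cosine(query, cached_embedding) >= SIMILARITY_THRESHOLD:
                return cached_response    # cache hit: no model call, no tokens spent
        response = call_llm(prompt)       # cache miss: pay for a model call
        _cache.append((query, response))  # populate the cache for future prompts
        return response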



Keep your AI applications performant and highly available

Apigee provides deep observability for AI applications, helping you manage and optimize generative AI costs and govern your organization's AI usage efficiently.

Capabilities include:

  • Visibility into model usage and application token consumption (see the token-accounting sketch after this list)
  • Internal cost reporting and optimization
  • Custom dashboards to monitor usage based on actual token counts (through integration with Looker Studio)
  • Logging for AI applications
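
As a simple illustration of the token accounting behind such reporting (referenced in the list above): most LLM APIs return prompt and completion token counts with every response, and aggregating those counts per consuming application and multiplying by a price list yields an internal cost figure. The Python sketch below assumes hypothetical per-1,000-token prices and a usage record shaped like the usage metadata most model APIs return.

    from collections import defaultdict

    # Hypothetical per-1,000-token prices; substitute your provider's actual rates.
    PRICE_PER_1K = {"prompt": 0.000125, "completion": 0.000375}

    usage_by_app = defaultdict(lambda: {"prompt": 0, "completion": 0})

    def record_usage(app_id, usage):
        """Accumulate the token counts reported by the model API for one request."""
        usage_by_app[app_id]["prompt"] += usage["prompt_tokens"]
        usage_by_app[app_id]["completion"] += usage["completion_tokens"]

    def cost_report():
        """Convert accumulated token counts into an internal cost figure per application."""
        return {
            app_id: sum(counts[kind] / 1000 * PRICE_PER_1K[kind] for kind in counts)
            for app_id, counts in usage_by_app.items()
        }

    # Example: one request made by a hypothetical "support-chat" application.
    record_usage("support-chat", {"prompt_tokens": 420, "completion_tokens": 180})
    print(cost_report())  # roughly {'support-chat': 0.00012}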

Secure LLM APIs while protecting your end users and your brand

Apigee acts as a secure gateway for LLM APIs, allowing you to control access with API keys, OAuth 2.0, and JWT validation, and to prevent abuse and overload by enforcing rate limits and quotas. Apigee Advanced API Security adds a further layer of protection. A minimal client-side sketch follows the capability list below.

Capabilities include:

  • Model Armor integration and policy enforcement for prompt and response sanitization
  • Authentication and authorization 
  • Abuse and anomaly detection
  • Mitigation against OWASP Top 10 API and LLM security risks
  • LLM API security misconfiguration detection
  • Google SecOps and third-party SIEM integration
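
From the consumer's side, calling an LLM API behind such a gateway looks like any other OAuth-protected API call, with quota violations surfacing as errors the client should back off from. The Python sketch below is a hypothetical client: the token and proxy URLs are placeholders, and it assumes the proxy is configured to return HTTP 429 when a quota or rate limit is exceeded.

    import time
    import requests

    TOKEN_URL = "https://api.example.com/oauth/token"      # hypothetical Apigee OAuth endpoint
    LLM_PROXY_URL = "https://api.example.com/v1/llm/chat"  # hypothetical LLM API proxy

    def get_access_token(client_id, client_secret):
        """Exchange app credentials for a short-lived OAuth 2.0 access token."""
        resp = requests.post(
            TOKEN_URL,
            data={"grant_type": "client_credentials"},
            auth=(client_id, client_secret),
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

    def chat(prompt, token, retries=3):
        """Call the LLM proxy; back off if the gateway reports a quota violation."""
        for attempt in range(retries):
            resp = requests.post(
                LLM_PROXY_URL,
                headers={"Authorization": f"Bearer {token}"},
                json={"prompt": prompt},
                timeout=30,
            )
            if resp.status_code == 429:   # assumed quota / rate-limit response
                time.sleep(2 ** attempt)  # exponential backoff, then retry
                continue
            resp.raise_for_status()
            return resp.json()
        raise RuntimeError("quota exceeded; retries exhausted")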




Is your API ecosystem gen AI ready?

Streamline development and deploy models faster with our robust API management platform.

Apigee AI solutions for your use case

Govern AI applications with Apigee

Manage developer access to LLM models across clouds, and gain visibility into model usage and token consumption for cost reporting and optimization.

Secure AI applications with Apigee

Use Apigee as a policy enforcement point for Model Armor prompt and response sanitization, and mitigate the OWASP Top 10 API and LLM security risks.
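
The enforcement-point pattern itself is simple: screen the prompt before the model sees it and screen the response before the end user does, blocking anything that violates policy. The Python sketch below only illustrates that flow; the keyword blocklist is a toy stand-in for the checks Model Armor actually performs, and call_llm is an assumed callable.

    BLOCKED_PATTERNS = ("ignore previous instructions",)  # toy stand-in for real policy checks

    def screen(text):
        """Toy stand-in for a Model Armor sanitization check; True means the text is allowed."""
        lowered = text.lower()
        return not any(pattern in lowered for pattern in BLOCKED_PATTERNS)

    def guarded_chat(prompt, call_llm):
        """Screen the prompt before the model sees it and the response before the user does."""
        if not screen(prompt):
            return "Request blocked by content policy."
        response = call_llm(prompt)
        if not screen(response):
            return "Response withheld by content policy."
        return response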

Monitor AI applications with Apigee

Create custom dashboards to monitor LLM APIs in near-real time, making sure they’re available and performing as expected to maintain uninterrupted service.

Monetize AI applications with Apigee

Add a rate plan to charge developers for using your AI application’s APIs, and configure the rate plan to share your API revenue with the developers.
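
As a back-of-the-envelope illustration of how a rate plan could translate into a monthly invoice and a revenue share, the Python sketch below assumes a hypothetical per-call fee, a per-1,000-token fee, and a fixed revenue-share percentage; real Apigee monetization rate plans are configured in the platform and can take other forms.

    # Hypothetical rate plan: a per-call fee, a per-1,000-token fee, and a revenue share.
    FEE_PER_CALL = 0.002      # USD charged to the consuming developer per API call
    FEE_PER_1K_TOKENS = 0.01  # USD per 1,000 tokens consumed
    REVENUE_SHARE = 0.30      # 30% of revenue shared with the API's developer

    def monthly_bill(calls, tokens):
        """Amount invoiced to a consuming developer for one month of usage."""
        return calls * FEE_PER_CALL + tokens / 1000 * FEE_PER_1K_TOKENS

    def revenue_split(bill):
        """Split the invoiced amount between the API provider and the developer share."""
        developer_share = bill * REVENUE_SHARE
        return bill - developer_share, developer_share

    bill = monthly_bill(calls=50_000, tokens=12_000_000)    # 50k calls, 12M tokens -> 220.0 USD
    provider_share, developer_share = revenue_split(bill)   # 154.0 USD kept, 66.0 USD shared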

Ready to unlock the full potential of gen AI?

Unlock the potential of generative AI for your solutions with Apigee. Its robust operational capabilities help you build secure, scalable, and optimized deployments with confidence.

Ready to begin? Explore our Apigee generative AI samples page.

