
Rapidly build an application in Gradio powered by a Generative AI Agent

October 16, 2023
Carolina Hernandez

Cloud Engineer

Pushpdeep Gupta

Cloud Engineer

In our new age of low-code and no-code application development, AI has become the tool of choice for rapidly extending, powering, and modernizing applications. With our ever-shifting technology landscape bringing new potential and opportunity to connect and engage with customers, or to optimize and infuse insights and experiences, leading organizations are racing to build new applications faster. Whether it’s to embrace generative AI technologies or to maintain a competitive advantage, AI-infused application development is quickly becoming a necessity to make it in today’s market.

In this blog, we will discuss how to use Gradio, an open source frontend framework, with Vertex AI Conversation. Vertex AI Conversation allows developers with limited machine learning skills to tap into the power of conversational AI technologies, and seamlessly develop gen AI proof-of-concept applications. With these two tools, organizations can deploy a PoC with an engaging, low-lift generative AI experience that wows your customers and inspires your development team.

Gen AI-powered chatbots can provide powerful and relevant conversations by learning from your company’s own unstructured data. The Gradio frontend framework provides an intuitive interface for building custom, interactive applications that let developers easily share and demo ML models.

Vertex AI Conversation

One of the main capabilities of the Gradio framework is creating demo apps on top of your models with a friendly web interface, so that anyone can use them and give your organization immediate feedback. Integrating a Gradio app with a generative AI agent built on Vertex AI Conversation unlocks key features that you can tweak and tune to your individual needs and to feedback from users. Using the power of programmability, you can drive deep personalization and contextualization into your chatbot’s conversations with your customers using your organization’s data, and demo the results rapidly.

Gradio

With the unprecedented boom in generative AI, businesses need an accessible and seamless interface to validate their machine learning models, APIs, or data science workflows. Chatbots are a popular application of large language models (LLMs). Because interaction with LLMs feels natural and intuitive, businesses are turning to conversational interfaces such as voice-activated chatbots, or voice bots. Voice bots are gaining popularity because of the convenience they bring; it’s much easier to speak than to type.

Gradio is an open-source Python framework that makes it easy to build quick interfaces like chatbots, voice-activated bots, and even full-fledged web applications to share your machine learning model, API, or data science workflow with clients or collaborators. With Gradio, you can build quick demos and share them, all in Python with just a few lines of code. You can learn more about Gradio at gradio.app.

Introducing a Gradio application that integrates with Vertex AI Conversation

Vertex AI Conversation’s data ingestion tools parse your content to create a virtual agent powered by LLMs. Your agent can then generate conversations using your organization’s data to provide a contextual and personal interaction with end-users. Seamless deployment through a web browser means demonstrating your application is easier than ever with the Gradio framework.

How it works

Gradio can be used to build chatbots that can answer user questions using a variety of data sources. To do this, you can build a middleware that uses Vertex AI Conversation to process the user's input and generate a response from an agent. The agent can then search for answers in a data store of documents, such as your company's knowledge base.

When the agent finds an answer, it can summarize it and present it to the user in the Gradio app. The agent can also provide links to the sources of the answer so that the user can learn more.

Here is a more detailed explanation of each step:

  1. The user asks the chatbot a question.
  2. The middleware sends the question to the gen AI agent via the Dialogflow API.
  3. The gen AI agent searches for answers in the data store.
  4. If the agent finds an answer, it summarizes it and provides links to the sources.
  5. The middleware sends the summary and links to the Gradio app via the Dialogflow API.
  6. The Gradio app displays the summary and links to the user.
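The middleware in steps 2 and 5 can be sketched as a thin wrapper around the Dialogflow CX `detectIntent` REST endpoint. The project, location, and agent IDs below are placeholders, and the sketch uses only the Python standard library; a production app would more likely use the `google-cloud-dialogflow-cx` client library with proper credential handling.

```python
import json
import urllib.request

# Hypothetical identifiers -- replace with your own project and agent values.
PROJECT_ID = "my-project"
LOCATION = "global"
AGENT_ID = "my-agent-id"

def detect_intent_url(project, location, agent, session):
    """Builds the Dialogflow CX v3 detectIntent endpoint for a session."""
    host = ("dialogflow.googleapis.com" if location == "global"
            else f"{location}-dialogflow.googleapis.com")
    return (f"https://{host}/v3/projects/{project}/locations/{location}"
            f"/agents/{agent}/sessions/{session}:detectIntent")

def detect_intent_body(text, language_code="en"):
    """Request body: the user's question as a text query input."""
    return {"queryInput": {"text": {"text": text},
                           "languageCode": language_code}}

def ask_agent(question, session_id, access_token):
    """Sends the question to the agent and returns the parsed JSON response."""
    url = detect_intent_url(PROJECT_ID, LOCATION, AGENT_ID, session_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(detect_intent_body(question)).encode("utf-8"),
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Reusing the same `session_id` across calls lets the agent keep conversational context between questions.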

The following diagram describes a high-level architecture that can be used as a foundational building block for an MVP with core functionalities.

https://storage.googleapis.com/gweb-cloudblog-publish/images/Blog_GenAI_Gradio_-_Blog_2_HighLevel.max-2200x2200.jpg

The following is a description of the components of the chatbot architecture:

  • Backend
    • Authentication: Verifies the user’s identity
    • Middleware: Orchestrates all requests and responses to generate answers
  • Generate Answer: Generates responses from a virtual agent grounded in the enterprise data. The underlying components or products are:
    • Vertex AI
      • Vertex AI Conversation: Creates a generative AI agent capable of understanding and responding to natural language.
      • Dialogflow CX: Conversations are handled via Dialogflow.
    • Cloud Storage: Storage of the enterprise data
    • Data Store: Stores index data created automatically by Vertex AI Conversation to index the enterprise data and allow Dialogflow to query it.
  • Speech to Text: Converts voice recordings from the user to text to be passed to Generate Answer.
  • Gradio Frontend
    • Chatbot: Provides a voice-activated chatbot that can understand both keyboard inputs and voice-based messages. The bot’s interface is built using the Gradio framework.
    • Speech Recording: Enables users to send voice-based messages.
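To make the Generate Answer component concrete, here is a minimal sketch of how the middleware might pull the summary text and source links out of a `detectIntent` response. The exact response shape depends on how the agent is configured; the `richContent`/`actionLink` payload convention below is an assumption for illustration.

```python
def extract_answer(query_result):
    """Extracts summary text and source links from a detectIntent
    queryResult dict. Assumes plain text responseMessages plus an
    (assumed) rich-content payload that carries source links."""
    texts, links = [], []
    for msg in query_result.get("responseMessages", []):
        # Plain text fragments form the summarized answer.
        if "text" in msg:
            texts.extend(msg["text"].get("text", []))
        # Assumed payload convention:
        # {"richContent": [[{"type": "info", "actionLink": "..."}]]}
        for row in msg.get("payload", {}).get("richContent", []):
            for item in row:
                if "actionLink" in item:
                    links.append(item["actionLink"])
    return " ".join(texts), links
```

The Gradio frontend can then render the joined summary in the chat window and the links in a separate "Source" area.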

User Interface

Once the application is launched, the interface will look like the image below.

https://storage.googleapis.com/gweb-cloudblog-publish/images/2-bot.max-2200x2200.jpg

Record from microphone: Allows users to send voice-based messages.
Start a new conversation: Erases chat history and initiates a new conversation.
Source: Displays links to the sources of the response, such as the user manual.

You can find examples of the implementation in the GitHub repo genai-gradio-example.

The code illustrates a deployable PoC application that demonstrates basic functionality implemented through Vertex AI and complemented by a custom Gradio UI/UX portal. As next steps, we recommend exploring and generating ideas for user-centric products powered by Vertex AI in your company. Google can help.

Conclusion

In this post, we have discussed how to integrate your Gradio conversations with a generative AI agent using Vertex AI Conversation. This can be used to build rapid generative AI PoC applications, and begin the discussions within your organization for how you can harness the power of generative AI. We are also providing you with the high-level architecture for your application and sample code to get you started right away. We hope that this information will be helpful to developers who are looking to rapidly build gen AI-powered applications. While still in its early stages of development, gen AI is already changing how organizations connect, engage, and support their customers, and with such fast-shifting technologies, fortune favors the bold.
