Develop an application

An example of a small application that you can create using LangChain on Vertex AI is one that returns the exchange rate between two currencies on a specified date. The following steps show you how to create this application:

  1. Define and configure a model
  2. Define a function
  3. Use a LangChain agent to connect the model to the function
  4. Test the application

Before you begin

Before you run this tutorial, make sure your environment is set up by following the steps in Set up your environment.

Step 1. Define and configure a model

Run the following steps to define and configure your model:

  1. To create your application, you need to define the model that you want to use. To learn more, see Model versions and lifecycle. Run the following command to use the Gemini 1.0 Pro model:

    model = "gemini-1.0-pro"
  2. (Optional) You can configure the safety settings of the model. To learn more about the options available for configuring safety settings in Gemini, see Configure safety attributes.

    The following is an example of how you can configure the safety settings:

    from langchain_google_vertexai import HarmBlockThreshold, HarmCategory
    safety_settings = {
        HarmCategory.HARM_CATEGORY_UNSPECIFIED: HarmBlockThreshold.BLOCK_NONE,
        HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_ONLY_HIGH,
        HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
        HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: HarmBlockThreshold.BLOCK_NONE,
    }
  3. (Optional) You can specify the following model parameters.

    • Temperature
    • Maximum output tokens
    • TopP
    • TopK
    • Safety settings (you must first create your safety settings, as shown in the previous step)

    To learn more about the options available for model parameter settings in Gemini, see Set model parameters. The following is an example of how you can specify model parameters:

model_kwargs = {
    # temperature (float): The sampling temperature controls the degree of
    # randomness in token selection.
    "temperature": 0.28,
    # max_output_tokens (int): The token limit determines the maximum amount of
    # text output from one prompt.
    "max_output_tokens": 1000,
    # top_p (float): Tokens are selected from most probable to least until
    # the sum of their probabilities equals the top-p value.
    "top_p": 0.95,
    # top_k (int): The next token is selected from among the top-k most
    # probable tokens.
    "top_k": 40,
    # safety_settings (Dict[HarmCategory, HarmBlockThreshold]): The safety
    # settings to use for generating content.
    "safety_settings": safety_settings,
}
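As a quick local sanity check, you can verify that the sampling parameters fall within their expected ranges before passing the dictionary to the agent. The following is a hypothetical helper, not part of the SDK; it assumes temperature and top_p lie in [0, 1] and that the token counts are positive integers (see Set model parameters for the authoritative ranges for each model version):

```python
def validate_model_kwargs(kwargs: dict) -> None:
    """Raises ValueError if a common Gemini sampling parameter is out of range.

    A hypothetical sanity check; the authoritative ranges are documented
    per model version in "Set model parameters".
    """
    if not 0.0 <= kwargs.get("temperature", 0.0) <= 1.0:
        raise ValueError("temperature must be between 0.0 and 1.0")
    if not 0.0 <= kwargs.get("top_p", 1.0) <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
    if kwargs.get("top_k", 1) < 1:
        raise ValueError("top_k must be a positive integer")
    if kwargs.get("max_output_tokens", 1) < 1:
        raise ValueError("max_output_tokens must be a positive integer")


# Passes silently for the values used in this tutorial.
validate_model_kwargs({"temperature": 0.28, "top_p": 0.95, "top_k": 40})
```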

Step 2. Define a Python function

After you define your model, the next step is to define the tools that your model uses for reasoning. A tool can be a LangChain tool or a Python function. You can also convert a defined Python function to a LangChain Tool. This application uses a function definition.

When you define your function, it's important to include comments that fully and clearly describe the function's parameters, what the function does, and what the function returns. This information is used by the model to determine which function to use. You must also test your function locally to confirm that it works.

Use the following code to define a function that returns an exchange rate:

def get_exchange_rate(
    currency_from: str = "USD",
    currency_to: str = "EUR",
    currency_date: str = "latest",
):
    """Retrieves the exchange rate between two currencies on a specified date.

    Uses the Frankfurter API (https://api.frankfurter.app/) to obtain
    exchange rate data.

    Args:
        currency_from: The base currency (3-letter currency code).
            Defaults to "USD" (US Dollar).
        currency_to: The target currency (3-letter currency code).
            Defaults to "EUR" (Euro).
        currency_date: The date for which to retrieve the exchange rate.
            Defaults to "latest" for the most recent exchange rate data.
            Can be specified in YYYY-MM-DD format for historical rates.

    Returns:
        dict: A dictionary containing the exchange rate information.
            Example: {"amount": 1.0, "base": "USD", "date": "2023-11-24",
                "rates": {"EUR": 0.95534}}
    """
    import requests
    response = requests.get(
        f"https://api.frankfurter.app/{currency_date}",
        params={"from": currency_from, "to": currency_to},
    )
    return response.json()

To test the function before you use it in your application, run the following:

get_exchange_rate(currency_from="USD", currency_to="SEK")

The response should be similar to the following:

{'amount': 1.0, 'base': 'USD', 'date': '2024-02-22', 'rates': {'SEK': 10.3043}}
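The function expects 3-letter ISO currency codes such as "USD" and "SEK". As an illustration, you could validate a code's shape before calling the API; the following helper is hypothetical and not part of this application:

```python
def is_valid_currency_code(code: str) -> bool:
    """Returns True if the string looks like a 3-letter ISO 4217 currency code.

    This checks only the shape of the string (three uppercase letters),
    not whether the code is actually assigned to a currency.
    """
    return len(code) == 3 and code.isalpha() and code.isupper()


print(is_valid_currency_code("SEK"))     # True
print(is_valid_currency_code("dollar"))  # False
```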

Step 3. Use a LangChain orchestration template

An orchestration framework organizes the application code into one or more functions that specify application configuration parameters, application initialization logic, and runtime logic.

You can define your own Python class (see Customize an application template), or you can use the LangchainAgent class in the Vertex AI SDK for Python for your agent.
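For context, a custom application template is a plain Python class that separates configuration, initialization, and runtime logic. The following minimal sketch is illustrative only; the class name and echo behavior are stand-ins, and the real interface is described in Customize an application template:

```python
class SimpleApp:
    """A minimal sketch of a custom application template."""

    def __init__(self, model: str):
        # Configuration parameters are captured at construction time.
        self.model = model

    def set_up(self) -> None:
        # Initialization logic (e.g. building a LangChain agent) runs here.
        # Kept as a no-op flag so the sketch stays self-contained.
        self.ready = True

    def query(self, input: str) -> str:
        # Runtime logic. A real template would invoke the model; this
        # sketch just echoes the input.
        return f"{self.model} received: {input}"


app = SimpleApp(model="gemini-1.0-pro")
app.set_up()
print(app.query("hello"))
```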

To use the LangchainAgent class, specify your model, defined function, and model parameters to instantiate a LangchainAgent object:

from vertexai.preview import reasoning_engines

agent = reasoning_engines.LangchainAgent(
    model=model,  # Required.
    tools=[get_exchange_rate],  # Optional.
    model_kwargs=model_kwargs,  # Optional.
)

Step 4. Test the application

Now that you've created your application, it's time to test it. You can test the application by performing test queries against it. Run the following command to test the application using US dollars and Swedish Krona:

response = agent.query(
    input="What is the exchange rate from US dollars to Swedish currency?"
)

The response is a dictionary that's similar to the following:

{"input": "What is the exchange rate from US dollars to Swedish currency?",
 # ...
 "output": "For 1 US dollar you will get 10.7345 Swedish Krona."}
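Because the response is a plain dictionary, you can extract just the final answer from its "output" key. The sketch below uses a hard-coded sample with the same shape; the actual values depend on the model and on live exchange rates:

```python
# A sample response with the same shape as the agent's return value.
response = {
    "input": "What is the exchange rate from US dollars to Swedish currency?",
    "output": "For 1 US dollar you will get 10.7345 Swedish Krona.",
}

# The agent's final answer is stored under the "output" key.
answer = response["output"]
print(answer)
```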

What's next