Trace the execution of a deployed agent

This page shows you how to enable Cloud Trace on your agent and view traces to analyze query response times and executed operations.

Enable tracing for LangchainAgent

To enable tracing for LangchainAgent, specify enable_tracing=True when you develop an agent. For example:

from vertexai import agent_engines  # Vertex AI SDK module for Agent Engine.

agent = agent_engines.LangchainAgent(
    model=model,                # Required.
    tools=[get_exchange_rate],  # Optional.
    enable_tracing=True,        # [New] Optional.
)
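
Traces are generated once the agent is deployed and handling queries. As a rough sketch of that flow, assuming the agent is deployed with agent_engines.create (see the documentation on deploying an agent); the requirements list and query text below are illustrative placeholders:

remote_agent = agent_engines.create(
    agent,  # The LangchainAgent created above with enable_tracing=True.
    requirements=["google-cloud-aiplatform[agent_engines,langchain]"],
)

# Each query handled by the deployed agent produces a trace in Cloud Trace.
remote_agent.query(input="What is the exchange rate from US dollars to Swedish krona?")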

You can also enable tracing by updating a deployed agent. For example:

agent_engines.update(
    resource_name=RESOURCE_NAME,
    agent_engine=agent_engines.LangchainAgent(
        model=model,
        tools=[get_exchange_rate],
        enable_tracing=True,  # Enable tracing if it wasn't enabled for the agent deployed at resource_name.
    ),
)

This exports traces to Cloud Trace under the project that you configured in Set up your Google Cloud project.
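
That project is the one the Vertex AI SDK was initialized with. As a minimal sketch of that initialization, assuming placeholder values for the project ID, location, and staging bucket:

import vertexai

vertexai.init(
    project="PROJECT_ID",               # Placeholder: your Google Cloud project ID.
    location="us-central1",             # Placeholder: a supported region.
    staging_bucket="gs://BUCKET_NAME",  # Placeholder: your staging bucket.
)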

Enable tracing for a custom agent

To enable tracing for custom agents, you can use the OpenTelemetry Google Cloud integration in combination with an instrumentation framework such as OpenInference or OpenLLMetry.
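
For example, a custom agent can register a Cloud Trace span exporter with the OpenTelemetry SDK and then apply OpenInference's LangChain instrumentation. The following is a minimal sketch, assuming the opentelemetry-sdk, opentelemetry-exporter-gcp-trace, and openinference-instrumentation-langchain packages are installed; set_up_tracing is a hypothetical helper name:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.cloud_trace import CloudTraceSpanExporter
from openinference.instrumentation.langchain import LangChainInstrumentor

def set_up_tracing(project_id: str) -> None:
    # Export spans to Cloud Trace in the given project.
    provider = TracerProvider()
    provider.add_span_processor(
        BatchSpanProcessor(CloudTraceSpanExporter(project_id=project_id))
    )
    trace.set_tracer_provider(provider)
    # Instrument LangChain so each chain and tool invocation becomes a span.
    LangChainInstrumentor().instrument()

Call the helper once at startup, for example from your agent's set_up method, before the agent handles any queries.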

View traces

After tracing is enabled, you can find the traces in the Trace Explorer in the Google Cloud console.

The following Gantt chart shows a sample trace from a LangChain agent:

[Figure: Sample trace for a query]

The first row in the Gantt chart represents the trace as a whole. A trace is composed of spans, and each span records a single operation, such as a function call or other sub-operation.

To learn more, see the documentation for Trace Explorer.

Quotas and limits

Some attribute values might get truncated when they reach quota limits. For more information, see Cloud Trace Quota.

Pricing

Cloud Trace has a free tier. For more information, see Cloud Trace Pricing.