Conversational product filtering

When conversational product filtering is enabled, Vertex AI Search for commerce guides shoppers through their product search on merchandiser sites using conversation. After an initial text query in Vertex AI Search for commerce, the online shopper gets a relevant follow-up question and multiple choice options. The follow-up question can either be answered by the user in free text or by clicking on a conversational multiple choice option.

If conversational product filtering is enabled, follow-up questions on the site drive a conversation that continues until one of the following three scenarios occurs:

  • A preconfigured minimum product count is reached (a conversation is not useful when only two products show up).
  • The user clicks on a product and adds it to their cart (the objective).
  • Vertex AI Search for commerce runs out of AI-generated questions.
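The three stop conditions can be sketched as a simple loop guard. This is a minimal illustration; all names here are made up for the sketch and are not part of the product surface.

```python
# Minimal sketch of the three stop conditions described above; all names
# are illustrative, not part of the actual product surface.
def conversation_continues(product_count, min_product_count,
                           added_to_cart, questions_remaining):
    """Return True while none of the three end conditions has occurred."""
    return (product_count > min_product_count   # enough products left to refine
            and not added_to_cart               # objective not yet reached
            and questions_remaining > 0)        # AI questions still available

# The conversation ends once the result set shrinks to the minimum:
conversation_continues(2, 2, False, 3)  # False
```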

Figure 1. Conversational search user journey.

Alternative to dynamic facets

Dynamic facets are associated with broad queries, which have unusually low revenue per query. Customers can become overwhelmed when they see tens of thousands of results, creating the risk that they abandon their search. Conversational search refines these queries and can be used in conjunction with dynamic facets. Conversational product filtering offers some advantages over dynamic facets: it is more human, more interactive, and uses less on-page real estate.

Customizable generative questions adapted to preferences

Conversational product filtering encourages a human-in-the-loop interaction with the generative AI questions: retailers can edit, overwrite, or deselect AI-generated questions in advance, according to their preferences and based on the uploaded catalog. Questions can be edited or disabled individually or in bulk in the Search for commerce console or through the API, letting retailers tailor which questions appear in search.

Admin experience

Manage the generative questions and conversational product filtering directly in the API or in the conversational commerce console, and set the feature up in the Data quality and Evaluate sections of the Search for commerce console.

Cloud console

The console allows retailers to manage generative questions in a conversational Vertex AI Search for commerce experience. Learn more about using generative questions in conversational product filtering.

Steps to use the generative question service

  1. Satisfy data requirements.

  2. Configure manual overrides.

  3. Turn the feature on.

  4. Preview and test.

Data requirements

To find out if your search data is ready for conversational product filtering, go to the Coverage checks tab in the console, under Conversational product filtering and browse or under Data quality > Conversation.

To enable conversational product filtering, you need to meet the following data requirements:

  • 1,000 queries per day: After you reach this first threshold, a conversation plan is generated that evaluates your inputs and outputs:
    • Inputs: filter count in user events
    • Outputs: conversational coverage
  • 25% conversational coverage: Calculated by Vertex AI Search for commerce models, conversational coverage is the percentage of queries that have at least one follow-up question. A frequency-weighted 25% of queries (by volume) should have at least a first question that matches.

If you don't yet have 25% conversational coverage but do meet the first prerequisite of 1,000 queries per day, blocking checks are applied to your outputs and advisory checks to your inputs. Vertex AI Search for commerce then calculates by what percentage your user-event-applied filters have to increase to reach the 25% conversational coverage threshold. The more filters that are uploaded, the higher the coverage.
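As a rough illustration, frequency-weighted coverage could be computed as below. The data shape and function are assumptions for the sketch; the real calculation is performed by Vertex AI Search for commerce models.

```python
# Illustrative sketch of frequency-weighted conversational coverage.
# The (daily_count, has_followup_question) shape is an assumption; the
# real calculation is done by Vertex AI Search for commerce models.
def conversational_coverage(query_stats):
    total = sum(count for count, _ in query_stats)
    covered = sum(count for count, has_q in query_stats if has_q)
    return covered / total if total else 0.0

# One high-volume query with a follow-up question can outweigh several
# low-volume queries without one:
stats = [(600, True), (300, False), (100, False)]
conversational_coverage(stats)  # 0.6, above the 25% threshold
```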

To view your conversational readiness:

  1. Go to the Conversation tab in the Data quality page in the Search for commerce console. This provides you with a critical check of whether a minimum of 25% of search queries have at least one follow-up question, as well as advisory checks as to what percentage of user events with valid filters is needed to reach that conversational coverage goal.

  2. If you pass the critical check, with sufficient user events with valid filters, proceed to the next step.

  3. To control how generative questions are served, go to the Conversational product filtering and browse page in the Vertex AI Search for commerce console.

Generative question controls

The generative AI writes a question for every indexable attribute in the catalog, using both the names and values of system and custom attributes. These questions are generated by an LLM and aim to enhance the search experience. For example, if a furniture type attribute has the values indoor and outdoor, the AI synthesizes a question asking what type of furniture you are looking for.

Each facet has one generated question. Based on historic user events and facet engagement from past search event data, the questions are sorted by the expected frequency with which they will appear. The AI first looks at the top-ranked questions, then finds what is relevant by attribute. The list of questions is generated once; if a new attribute is added, it is reflected in the list within two hours.

  1. Go to the Conversational search and browse page in the Search for commerce console.


  2. Under the Manage AI generated questions tab, view all the questions sorted by query-weighted frequency, that is, how often they are served with common queries. The ranking uses the frequency field in the GenerativeQuestionConfig configuration.

  3. You can use the filter option to filter the questions.

  4. Check the box to enable question visibility for each attribute.

  5. Click at the end of each row to open an edit panel for each question.
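The frequency-based ordering in step 2 can be sketched as follows. The dictionaries are a simplified stand-in for GenerativeQuestionConfig entries, not its verified schema, and the values are made up.

```python
# Simplified stand-in for GenerativeQuestionConfig entries; only the
# fields relevant to sorting are modeled, with made-up values.
questions = [
    {"facet": "material", "question": "What material do you prefer?",   "frequency": 0.12},
    {"facet": "color",    "question": "What color are you looking for?", "frequency": 0.41},
    {"facet": "size",     "question": "What size do you need?",          "frequency": 0.33},
]

# Sort by query-weighted frequency, most frequently served first.
by_frequency = sorted(questions, key=lambda q: q["frequency"], reverse=True)
[q["facet"] for q in by_frequency]  # ['color', 'size', 'material']
```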

To make bulk edits, follow these steps:

  1. Select or clear the boxes next to the questions that you want to include or exclude in conversation.

  2. Click either the Allow in conversation or the Disallow in conversation button that appears at the top of the list. Alternatively, to edit an individual question, click it, then clear or recheck the box next to Allowed in conversation in the pane that opens.
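A bulk edit like this maps naturally onto a batch update request. The JSON shape below is a hedged sketch based on the object names in this document (BatchUpdateGenerativeQuestionConfigs, GenerativeQuestionConfig); the resource path, field names, and casing are assumptions, not a verified request schema.

```python
# Hedged sketch of a batch request that disallows several questions at
# once; the resource path, field names, and casing are assumptions.
CATALOG = "projects/PROJECT_ID/locations/global/catalogs/default_catalog"

def build_batch_update(facets, allowed):
    return {
        "parent": CATALOG,
        "requests": [
            {
                "generativeQuestionConfig": {
                    "catalog": CATALOG,
                    "facet": facet,
                    "allowedInConversation": allowed,
                },
                "updateMask": "allowedInConversation",
            }
            for facet in facets
        ],
    }

request = build_batch_update(["color", "size"], allowed=False)  # disallow both
```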


Use generative questions in conversational product filtering

The generative question service API provides controls to mitigate potential inconsistencies in the LLM output. These can be managed from the console. Here, retailers can also configure conversational product filtering by toggling its enabled state and setting the minimum number of products required to trigger it.

You can define the questions, specifying the question itself, potential answers, and whether the question is allowed in the conversation. Individual questions can be generated by an LLM or overridden by the retailer. The console supports reviewing AI-generated questions, allowing retailers to override them or toggle their conversational status. Questions can also be bulk edited.

Edit individual questions

You can also use controls to curate the individual questions. It is recommended to do this before you turn conversational product filtering on.

For each question, there are two options. Click in the last column to open the panel of questions visible to users:

  1. Turn off a question for all queries: The question is enabled by default. Clear (or check again) the box next to Allowed in conversation. This option skips the question altogether. A retailer may opt to disable a question entirely if it does not relate to the queried attributes or could be misconstrued as inappropriate (for example, "What dress size are you looking for?" may be perceived as prying about a shopper's weight).
  2. Rewrite a question: In the pane, you can see the AI-generated question, the attribute it is attached to, and the attribute's values. Click the pencil to rewrite it.
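A rewrite can be thought of as a partial update of the question config: the LLM-generated text stays, and an override field carries the retailer's wording. The field names below are assumptions for illustration, not verified API surface.

```python
# Hedged sketch of rewriting a single question; field names are
# illustrative, not verified API surface.
def override_question(config, new_text):
    updated = dict(config)
    updated["finalQuestion"] = new_text  # retailer override of the LLM text
    return updated

config = {
    "facet": "dressSize",
    "generatedQuestion": "What dress size are you looking for?",
}
edited = override_question(config, "Which size works best for you?")
# Serving would prefer the override when present:
served = edited.get("finalQuestion") or edited["generatedQuestion"]
```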

Turn on conversational filtering

After you have edited your generative AI questions in the console, you are ready to turn on conversational product filtering.


  1. Go to the Conversational search and browse page in the Search for commerce console.


  2. In the Conversation section, go to the system-wide settings under the Configure and enable tab. This tab lets you configure the minimum number of products that must match the query before a conversation can happen, which is when questions are generated. This minimum is 2; it can be configured higher but never lower. Consider how many products in your catalog you want returned in search results before users begin a conversation. For example, a sweet spot for this number is one row of results per page as the minimum to trigger a conversation.

  3. Switch the toggle to on. This page also shows the status of your blocking and advisory checks. If enough of your search queries have at least one follow-up question, your site is now enabled for conversational search.
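The system-wide settings in this tab correspond to a feature-level configuration. The sketch below assumes field names derived from the GenerativeQuestionsFeatureConfig object named later in this document, and enforces the minimum of 2; it is illustrative, not a verified payload.

```python
# Hedged sketch of a feature-level configuration payload; field names are
# inferred from GenerativeQuestionsFeatureConfig as named in this document.
def feature_config(enabled, minimum_products):
    if minimum_products < 2:
        raise ValueError("minimum products can never be lower than 2")
    return {
        "catalog": "projects/PROJECT_ID/locations/global/catalogs/default_catalog",
        "featureEnabled": enabled,
        "minimumProducts": minimum_products,
    }

config = feature_config(enabled=True, minimum_products=4)  # e.g. one row of results
```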

Evaluate and test

The Evaluate section of the console lets you preview the serving experience with conversational product filtering by running a test search and testing your questions against the displayed facets.

To evaluate and test, follow these steps on the Search or Browse tabs of the Evaluate page in the Search for commerce console:

  1. Go to the Evaluate page in the Search for commerce console.


  2. Click Search or Browse.

  3. In the Search Evaluation field, enter a test query that makes sense for the catalog you have uploaded, such as shoes if your catalog consists of clothing items. Click Search preview to see search results. If conversational product filtering is enabled, generative questions appear in the right panel.

  4. A list of test questions is available on the right panel.

Generative Question API

This section describes how to use the generative question API to integrate the conversational search API into your UI, manage the generative questions, and serve the feature on your site.

API integration

Objects:

  • GenerativeQuestionsFeatureConfig
  • GenerativeQuestionConfig
  • GenerativeQuestionService
    • UpdateGenerativeQuestionsFeatureConfiguration
    • UpdateGenerativeQuestionConfig
    • ListGenerativeQuestionConfigs
    • GetGenerativeQuestionFeatureConfig
    • BatchUpdateGenerativeQuestionConfigs

Core to integrating this feature is defining the question resource. This includes the question itself and whether the question is allowed in the conversation. The question is generated by an LLM by default but can be overridden by the administrator.
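The question resource described above might look like the following sketch. Field names are inferred from this document and may differ from the actual GenerativeQuestionConfig schema.

```python
# Illustrative question resource: the LLM-generated default, an optional
# administrator override, and the conversation toggle. Field names are
# assumptions, not the verified schema.
question_resource = {
    "catalog": "projects/PROJECT_ID/locations/global/catalogs/default_catalog",
    "facet": "color",
    "generatedQuestion": "What color are you looking for?",  # LLM default
    "finalQuestion": "",                                     # empty: no override
    "allowedInConversation": True,
}

# Serving uses the override when set, otherwise the generated default.
served = question_resource["finalQuestion"] or question_resource["generatedQuestion"]
```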

Enable conversational product filtering

Object:

  • GenerativeQuestionsFeatureConfig

This object is the control configuration that enables generative questions and manages the overall serving experience of conversational product filtering. GenerativeQuestionsFeatureConfig uses a GET method to obtain attribute information, including whether each attribute is indexable, from the catalog associated with the project.

The feature_enabled switch determines whether questions are used at serving time. It manages the top-level toggles in the console.

Serving experience

Conversational product filtering is based on engaging the user in an ongoing conversation over multiple turns, so at least a second turn is required for conversational product filtering to work. The user is presented with a follow-up question and suggested answers in the response, and can respond either by entering a free-text answer or by clicking a suggested answer (multiple choice option).

  • Multiple choice: Behind the scenes, the multiple choice option functions like a facet (an event type filter) that narrows the query using filtering. When the user clicks a multiple choice response, a filter is applied to the query. Applying a filter through conversational multiple choice is identical to applying the same filter using dynamic facets or tiles.

  • Free text: If the user responds in free text, a new and narrower query is generated. Learn more about how conversational product filtering enriches filter and user event capturing at the API level.
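The equivalence between a multiple-choice selection and a facet filter can be sketched with the Retail search filter syntax. The `attribute: ANY("value")` form follows that syntax; the colorFamilies attribute here is an example, not a claim about your catalog.

```python
# A multiple-choice click behaves like applying a facet filter. The
# 'attribute: ANY("value")' form follows the Retail search filter syntax;
# colorFamilies is an example attribute.
def answer_to_filter(attribute, value):
    return f'{attribute}: ANY("{value}")'

answer_to_filter("colorFamilies", "yellow")  # 'colorFamilies: ANY("yellow")'
```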

Service enabled by the feature

The generative questions service (service GenerativeQuestionService{...}) is used to manage LLM-generated questions. Its parent object is the catalog, from which it retrieves information to return questions for a given catalog. The service is used to manage the overall generative question feature state, make individual or batch changes, and toggle questions on or off. Data requirements must be met to interface with the service API, and the questions must first be initialized before they can be managed.

The service interacts with the feature-level and question-level configs through two sets of handlers:

  • GenerativeQuestionsFeatureConfig handlers (feature-level):

    1. Update lets you change the minimum products and enabled fields.
    2. Get returns the configuration object.
  • GenerativeQuestionConfig handlers (question-level):

    1. List returns all questions for a given catalog.
    2. Update performs individual question management.
    3. BatchUpdate performs grouped question management.

The service returns a semantically appropriate question based on the initial query.

A follow-up question is generated by the LLM and can be overridden. Questions are displayed based on how likely customers are to use them, determined from the search event history. If there is no search event history, the feature falls back to the commerce search logs.

Different questions are generated based on the previous query. There are no fixed weights: the AI that drives the LLM-generated questions learns from the queries and changes the weighting for every query, so that "shirt", for example, weights the category very heavily, while "XL red shirt" weights category, size, and color.

Configure the serving experience

Configure the serving experience by integrating the conversational filtering configuration API with the Search API.

User journey in the API

The conversational flow works as follows: The user initiates a search with an initial query and sets the conversational filtering mode flag in the new API. The user then selects an answer or provides free-text input, which is sent back to the API in the user_answer field. The API returns an additional_filter in the response, and these filters must be applied to the follow-up Search API request. The API then refines the search results based on the user's input and provides a new follow-up question, prompting a follow-up query and continuing the conversation over multiple turns until the user finds what they're looking for on the retailer website.

Assuming conversational product filtering is enabled on the website, the user journey and subsequent interaction with Vertex AI Search for commerce follows this path:

  • Scenario 1: The first query comes from the user and goes to both the Search and Conversation APIs. The Search API returns only search results; the Conversation API returns the suggested answers and follow-up questions. Call the Search API with the same query or page_category to fetch the search results.
    • Step 1a. A follow-up conversation request is sent to Conversational Search. Call the Conversational API with the right conversational filtering mode.
    • Step 1b. The initial Search response contains search results only. The Conversation API refines the query by returning the suggested answers and follow-up questions.
  • Scenario 2: The user selects a multiple choice option.
    • Step 2a. The selected answer filter is sent to the Conversation API.
    • Step 2b. Both the Conversation and Search APIs run with the filter applied.
  • Scenario 3: The user enters free text.
    • Step 3a. The text answer is sent to the Conversation API. Use the Conversational API to send the user's answer.
    • Step 3b. The user gets a conversational follow-up question with suggested answers in the Conversational response. Search runs again with a modified query. The Conversational API sends another question and an additional_filter; this filter must be applied to the search results fetched from the Search API in the first step.

Scenario 1. First query comes from user

Conversational commerce is now supported only by the Conversational API. The conversationalFilteringMode in the Conversation API distinguishes between conversational commerce and conversational product filtering.

First, create the search request by setting the product or item as the query; in this example, "dress".

Additional actions on the client side to enable conversationally filtered searches:

  • Developers must also create a conversational search request by setting "dress" as the query.

  • Developers must set mode to "CONVERSATIONAL_FILTER_ONLY" in order to get a conversational response. Otherwise, if it's set to "DISABLED", no follow-up question is supplied.
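Put together, the initial turn could look like the following request bodies. The placement path, field names, and casing are hedged assumptions based on the objects and flags named in this document, not a verified request schema.

```python
# Hedged sketch of Scenario 1: a plain Search API request for results,
# plus a conversational request with the filtering mode set. Everything
# except the query value is an illustrative assumption.
PLACEMENT = ("projects/PROJECT_ID/locations/global/catalogs/"
             "default_catalog/placements/default_search")

search_request = {
    "placement": PLACEMENT,
    "query": "dress",
    "visitorId": "visitor-123",
}

conversational_request = {
    "placement": PLACEMENT,
    "query": "dress",
    "visitorId": "visitor-123",
    "conversationalFilteringSpec": {
        # DISABLED would suppress the follow-up question entirely.
        "conversationalFilteringMode": "CONVERSATIONAL_FILTER_ONLY",
    },
}
```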

Step 1a. Retailer → search: Initial query with conversation enabled

Step 1b. Search → retailer: conversation ID, refined query, follow-up question, suggested answers

Conversational product filtering serves these options for continued conversational engagement, leading to faster search refinement:

Scenario 2: User selects a multiple choice option

If the user selects the multiple choice answer yellow:

  • Developers must restore the conversation_id from session storage.
  • Set mode to CONVERSATIONAL_FILTER_ONLY.
  • Set user_answer to the user's selection.
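For this turn, the conversational request carries the restored conversation ID and the selected answer. The shape below is a hedged sketch; the nested field names and the colorFamilies attribute are illustrative assumptions, not verified schema.

```python
# Hedged sketch of Scenario 2: the user picked "yellow" from the
# suggested answers. All field names are illustrative assumptions.
conversation_id = "restored-from-session-storage"

followup_request = {
    "query": "dress",
    "visitorId": "visitor-123",
    "conversationId": conversation_id,
    "conversationalFilteringSpec": {
        "conversationalFilteringMode": "CONVERSATIONAL_FILTER_ONLY",
        "userAnswer": {
            # The selection maps to a facet value, applied as a filter.
            "selectedAnswer": {"productAttributeValue": {"name": "colorFamilies",
                                                         "value": "yellow"}},
        },
    },
}
```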

Step 2a. Retailer → search: selected answer filter

Step 2b. Search → retailer: filters applied

Scenario 3: User selects a free text input

If the user types in lavender:

  • Developers should restore the conversation_id from session storage.
  • Set followup_conversation_requested to true.
  • Set user_answer to the user's input (with the "text_answer:" prefix).
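The free-text turn can be sketched like this. Field names and casing are illustrative assumptions; the "text_answer:" prefix on the answer string follows the step above, though the exact wire format is not specified here.

```python
# Hedged sketch of Scenario 3: the user typed "lavender" instead of
# choosing a suggested answer. Field names are illustrative assumptions;
# the "text_answer:" prefix follows the step described above.
conversation_id = "restored-from-session-storage"

freetext_request = {
    "query": "dress",
    "visitorId": "visitor-123",
    "conversationId": conversation_id,
    "followupConversationRequested": True,
    "userAnswer": "text_answer:" + "lavender",
}
```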

Step 3a. Retailer → search: text answer

Step 3b. Search → retailer: run with modified query