This is the unified documentation for the Retail API, which includes Recommendations AI, Retail Search, and the unified Retail console (applicable to both Recommendations AI and Retail Search users). To use the new console or Retail Search while they are in the restricted GA phase, submit a form to contact Cloud sales. If you are using the v1beta version of Recommendations AI, migrate to the GA version: Migrating to the Retail API from beta.

To see documentation for only Recommendations AI and the Recommendations AI-only console, go to the How-to guides for Recommendations AI and the API reference documentation for Recommendations AI.

Importing catalog information

This page describes how to import your catalog information to the Retail API and keep it up to date.

The import procedures on this page apply to both Recommendations AI and Retail Search. After you import data to the Retail API, both services can use that data, so you don't need to import the same data twice if you use both services.

Before you begin

Before you can import your catalog information, you must have completed the instructions in Before you begin, specifically setting up your project, creating a service account, and adding the service account to your local environment.

You must have the Retail Admin IAM role to perform the import.

Catalog import best practices

The Retail API requires high-quality data to generate high-quality results. If your data is missing fields or has placeholder values instead of actual values, the quality of your predictions and search results suffers.

When you import catalog data, ensure that you implement the following best practices:

  • Make sure you review the information about product levels before uploading any data.

    Changing product levels after you have imported any data requires a significant effort.

  • Observe the product item import limits.

    For bulk import from Cloud Storage, the size of each file must be 2 GB or smaller. You can include up to 100 files at a time in a single bulk import request.

    For inline import, import no more than 100 product items at a time.

  • Make sure that all required catalog information is included and correct.

    Do not use dummy or placeholder values.

  • Include as much optional catalog information as possible.

  • Make sure your events all use a single currency, especially if you plan to use Cloud Console to get revenue metrics. The Retail API does not support using multiple currencies per catalog.

  • Keep your catalog up to date.

    Ideally, update your catalog daily. Scheduling periodic catalog imports prevents model quality from degrading over time. You can schedule automatic, recurring imports when you import your catalog using the Cloud Console, or use Cloud Scheduler to automate imports.

  • Do not record user events for product items that have not been imported yet.

  • After importing catalog information, review the error reporting and logging information for your project.

    A few errors are expected, but if you have a large number of errors, you should review them and fix any process issues that led to the errors.

About importing catalog data

You can import your product data from Merchant Center, Cloud Storage, or BigQuery, or specify the data inline in the request. Each of these procedures is a one-time import, with the exception of linking Merchant Center to the Retail API. Schedule regular catalog imports (ideally, daily) to ensure that your catalog is current. See Keeping your catalog up to date.

You can also import individual product items. For more information, see Uploading a product item.

Catalog import considerations

This section describes the methods that can be used for batch importing of your catalog data, when you might use each method, and some of their limitations.

Cloud Storage
    Description: Import data in JSON format from files loaded into a Cloud Storage bucket. Each file must be 2 GB or smaller, and up to 100 files can be imported at a time. The import can be done using the Cloud Console or curl. Uses the Product JSON data format, which allows custom attributes.
    When to use: If you need to load a large amount of data in a single step.
    Limitations: Not ideal for catalogs with frequent inventory and pricing updates, because changes are not reflected immediately.

BigQuery
    Description: Import data from a previously loaded BigQuery table that uses the Retail schema. Can be performed using the Cloud Console or curl.
    When to use: If you have product catalogs with many attributes. BigQuery import uses the Retail schema, which has more product attributes than other import options, including key/value custom attributes. Also useful if you have large volumes of data (BigQuery import has no data limit) or if you already use BigQuery.
    Limitations: Requires the extra step of creating a BigQuery table that maps to the Retail schema.

Merchant Center Syncing
    Description: Imports catalog data through Merchant Center by linking the account with the Retail API. After linking, updates to catalog data in Merchant Center are synced to the Retail API in real time.
    When to use: If you have an existing integration with Merchant Center.
    Limitations: Limited schema support; for example, product collections are not supported by Merchant Center. Merchant Center becomes the source of truth for the data until it is unlinked, so any custom attributes you need must be added to the Merchant Center data. Limited control; you cannot specify particular fields or sets of items to import from Merchant Center: all items and fields that exist in Merchant Center are imported.

Inline import
    Description: Import using a call to the products.import method. Uses the ProductInlineSource object, which has fewer product catalog attributes than the Retail schema but supports custom attributes.
    When to use: If you have flat, non-relational catalog data or a high frequency of quantity or price updates.
    Limitations: No more than 100 catalog items can be imported in a single request. However, you can perform many load steps; there is no overall item limit.

Importing catalog data from Merchant Center

Merchant Center is a tool you can use to make your store and product data available for Shopping ads and other Google services.

You can import catalog data from Merchant Center in the following ways:

  • Bulk importing as a one-time procedure (Recommendations AI only).

  • Linking your Merchant Center account to the Retail API. Once linked, changes in your Merchant Center account are continually synced to the Retail API.

Bulk importing from Merchant Center

You can import catalog data from Merchant Center using the Retail Cloud Console or the products.import method. Bulk importing is a one-time procedure, and is only supported for Recommendations AI.

To import your catalog from Merchant Center, complete the following steps:

  1. Using the instructions in Merchant Center transfers, set up a transfer from Merchant Center into BigQuery.

    Use the Google Merchant Center products table schema. Configure your transfer to repeat daily, and set your dataset expiration time to 2 days.

  2. If your BigQuery dataset is in another project, configure the required permissions so that the Retail API can access the BigQuery dataset. Learn more.

  3. Import your catalog data from BigQuery into the Retail API.

    Console

    1. Go to the Retail Data page in the Google Cloud Console.

      Go to the Data page

    2. Click Import to open the Import panel.

    3. Choose Product catalog.

    4. Select the branch you will upload your catalog to.

    5. Select Merchant Center as your data source and BigQuery as the upload method.

    6. Enter the BigQuery table where your data is located.

    7. (Optional) Enter the location of a Cloud Storage bucket in your project as a temporary location for your data.

      If not specified, a default location is used. If specified, the Cloud Storage bucket must be in the same region as your BigQuery dataset.

    8. Choose whether to schedule a recurring upload of your catalog data.

    9. If this is the first time you are importing your catalog, select the product levels. Learn more about product levels.

      Changing product levels after you have imported any data requires a significant effort.

    10. Click Import.

    curl

    1. If this is the first time you are uploading your catalog, set your product levels by using the Catalog.patch method. This operation requires the Retail Admin role. Learn more about product levels.

      curl -X PATCH \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" \
      --data '{
      "productLevelConfig": {
        "ingestionProductType": "PRODUCT_TYPE",
        "merchantCenterProductIdField": "PRODUCT_ID_FIELD"
      }
      }' \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog"
    2. Import your catalog using the products.import method.

      • DATASET_ID: The ID of the BigQuery dataset.
      • TABLE_ID: The ID of the BigQuery table holding your data.
      • STAGING_DIRECTORY: Optional. A Cloud Storage directory that is used as an interim location for your data before it is imported into the Retail API. Leave this field empty to let the Retail API automatically create a temporary directory (recommended).
      • ERROR_DIRECTORY: Optional. A Cloud Storage directory for error information about the import. Leave this field empty to let the Retail API automatically create a temporary directory (recommended).
      • dataSchema: For the dataSchema property, use the value product_merchant_center. See the Merchant Center products table schema.

      We recommend you don't specify staging or error directories so that the Retail API can automatically create a Cloud Storage bucket with new staging and error directories. These are created in the same region as the BigQuery dataset, and are unique to each import (which prevents multiple import jobs from staging data to the same directory, and potentially re-importing the same data). After three days, the bucket and directories are automatically deleted to reduce storage costs.

      An automatically created bucket name includes the project ID, bucket region, and data schema name, separated by underscores (for example, 4321_us_catalog_retail). The automatically created directories are named staging or errors with a number appended (for example, staging2345 or errors5678).

      If you specify directories, the Cloud Storage bucket must be in the same region as the BigQuery dataset, or the import fails. Provide the staging and error directories in the format gs://<bucket>/<folder>/; they must be different from each other.

      curl -X POST \
           -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
           -H "Content-Type: application/json; charset=utf-8" \
           --data '{
             "inputConfig":{
                "bigQuerySource": {
                  "datasetId":"DATASET_ID",
                  "tableId":"TABLE_ID",
                  "dataSchema":"product_merchant_center"
                }
              }
          }' \
         "https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products:import"
    

Syncing Merchant Center to the Retail API

For continuous synchronization between Merchant Center and the Retail API, you can link your Merchant Center account to the Retail API. After linking, the catalog information in your Merchant Center account is immediately imported to the Retail API.

While the Retail API is linked to the Merchant Center account, changes to your product data in the Merchant Center account are automatically updated within minutes in the Retail API. If you want to prevent Merchant Center changes from being synced to the Retail API, you can unlink your Merchant Center account.

Unlinking your Merchant Center account does not delete any products in the Retail API. To delete imported products, see Deleting product information.

To sync your Merchant Center account, complete the following steps.

Console

  1. Go to the Retail Data page in the Google Cloud Console.

    Go to the Data page

  2. Click Import to open the Import panel.

  3. Choose Product catalog.

  4. Select Merchant Center Sync as your data source, and select or add your Merchant Center account.

  5. Select the branch you will upload your catalog to.

  6. Click Import.

curl

  1. Check that the service account in your local environment has access to both the Merchant Center account and the Retail API. To check which accounts have access to your Merchant Center account, see User access for Merchant Center.

  2. Use the Catalog.patch method to establish the link:

    curl -X PATCH \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    -H "Content-Type: application/json; charset=utf-8" \
     --data '{
        "merchantCenterLinkingConfig": {
          "links": {
            merchantCenterAccountId: MERCHANT_CENTER_ID,
            branchId: "BRANCH_ID"
          }
        }
     }' \
     "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog?updateMask=merchantCenterLinkingConfig"
    
    • MERCHANT_CENTER_ID: The ID of the Merchant Center account.
    • BRANCH_ID: The ID of the branch to establish the link with. Accepts values '0', '1', or '2'.
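
To confirm the link from the command line, you can list your catalogs and check that the merchantCenterLinkingConfig field is populated. A minimal sketch using the catalogs.list REST method:

 curl -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
 "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs"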

To view your linked Merchant Center accounts, go to the Cloud Console Data page and click the Merchant Center button at the top right of the page. This opens the Linked Merchant Center Accounts panel, where you can also add additional Merchant Center accounts.

See Viewing aggregated information about your catalog for instructions on how to view the products that have been imported into the Retail API.

Unlinking your Merchant Center account stops that account from syncing catalog data to the Retail API. This procedure does not delete any products in the Retail API that have already been uploaded.

Console

  1. Go to the Retail Data page in the Google Cloud Console.

    Go to the Data page

  2. Click the Merchant Center button on the top right of the page to open a list of your linked Merchant Center accounts.

  3. Click Unlink next to the Merchant Center account you're unlinking, and confirm your choice in the dialog that appears.

curl

Use the Catalog.patch method to remove the linking configuration from the Catalog resource.

curl -X PATCH \
 -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
 -H "Content-Type: application/json; charset=utf-8" \
 --data '{
    "merchantCenterLinkingConfig": {}
 }' \
 "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog?updateMask=merchantCenterLinkingConfig"

Limitations on linking to Merchant Center

  • A Merchant Center account can be linked to any number of catalog branches, but a single catalog branch can only be linked to one Merchant Center account.

  • The first import after linking your Merchant Center account may take hours to finish. The amount of time depends on the number of offers in the Merchant Center account.

  • Any product modifications using the Retail API methods are disabled for branches linked to a Merchant Center account. Any changes to the product catalog data in those branches have to be made using Merchant Center. Those changes are then automatically synced to the Retail API.

  • The collection product type isn't supported for branches that use Merchant Center linking.

  • To ensure data correctness, your Merchant Center account can be linked only to empty catalog branches. To delete products from a catalog branch, see Deleting product information.

Importing catalog data from BigQuery

To import catalog data in the correct format from BigQuery, use the Retail schema to create a BigQuery table, load the table with your catalog data, and then upload your data to the Retail API.

For more help with BigQuery tables, see Introduction to tables. For help with BigQuery queries, see Overview of querying BigQuery data.
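
For example, assuming you have saved the Retail schema locally as retail_schema.json and your catalog data as newline-delimited JSON in Cloud Storage, you can create and load the table with the bq command-line tool. The dataset, table, and file names here are placeholders:

 # Create a dataset and an empty table that uses the Retail schema.
 bq mk retail_dataset
 bq mk --table retail_dataset.products retail_schema.json

 # Load newline-delimited JSON catalog data into the table.
 bq load --source_format=NEWLINE_DELIMITED_JSON \
   retail_dataset.products gs://my-bucket/products.json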

To import your catalog:

  1. If your BigQuery dataset is in another project, configure the required permissions so that the Retail API can access the BigQuery dataset. Learn more.

  2. Import your catalog data to the Retail API.

    Console

    1. Go to the Retail Data page in the Google Cloud Console.

      Go to the Data page

    2. Click Import to open the Import panel.

    3. Choose Product catalog.

    4. Select the branch you will upload your catalog to.

    5. Select BigQuery as the data source and Retail Product Catalogs Schema as the schema.

    6. Enter the BigQuery table where your data is located.

    7. (Optional) Enter the location of a Cloud Storage bucket in your project as a temporary location for your data.

      If not specified, a default location is used. If specified, the Cloud Storage bucket must be in the same region as your BigQuery dataset.

    8. (Optional) Choose whether to schedule a recurring upload of your catalog.

      This option is only available if you have successfully imported your catalog to the Retail API at least once before.

    9. If this is the first time you are importing your catalog, select product levels. Learn more about product levels.

      Changing product levels after you have imported any data requires a significant effort.

    10. Click Import.

    curl

    1. If this is the first time you are uploading your catalog, set your product levels by using the Catalog.patch method. This operation requires the Retail Admin role.

      curl -X PATCH \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" \
       --data '{
         "productLevelConfig": {
           "ingestionProductType": "PRODUCT_TYPE",
           "merchantCenterProductIdField": "PRODUCT_ID_FIELD"
         }
       }' \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog"
      
    2. Create a data file for the input parameters for the import.

      Use the BigQuerySource object to point to your BigQuery dataset.

      • DATASET_ID: The ID of the BigQuery dataset.
      • TABLE_ID: The ID of the BigQuery table holding your data.
      • STAGING_DIRECTORY: Optional. A Cloud Storage directory that is used as an interim location for your data before it is imported into the Retail API. Leave this field empty to let the Retail API automatically create a temporary directory (recommended).
      • ERROR_DIRECTORY: Optional. A Cloud Storage directory for error information about the import. Leave this field empty to let the Retail API automatically create a temporary directory (recommended).
      • dataSchema: For the dataSchema property, use the value product (the default), which uses the Retail schema.

      We recommend you don't specify staging or error directories so that the Retail API can automatically create a Cloud Storage bucket with new staging and error directories. These are created in the same region as the BigQuery dataset, and are unique to each import (which prevents multiple import jobs from staging data to the same directory, and potentially re-importing the same data). After three days, the bucket and directories are automatically deleted to reduce storage costs.

      An automatically created bucket name includes the project ID, bucket region, and data schema name, separated by underscores (for example, 4321_us_catalog_retail). The automatically created directories are named staging or errors with a number appended (for example, staging2345 or errors5678).

      If you specify directories, the Cloud Storage bucket must be in the same region as the BigQuery dataset, or the import fails. Provide the staging and error directories in the format gs://<bucket>/<folder>/; they must be different from each other. A sketch with explicitly specified directories follows the example below.

      {
        "inputConfig": {
          "bigQuerySource": {
            "datasetId": "DATASET_ID",
            "tableId": "TABLE_ID",
            "dataSchema": "product"
          }
        }
      }
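
      If you do specify directories, the request body can carry them explicitly. A minimal sketch, assuming the v2 BigQuerySource.gcsStagingDir field for the staging location and the request-level errorsConfig field for the error location:

      {
        "inputConfig": {
          "bigQuerySource": {
            "datasetId": "DATASET_ID",
            "tableId": "TABLE_ID",
            "dataSchema": "product",
            "gcsStagingDir": "STAGING_DIRECTORY"
          }
        },
        "errorsConfig": {"gcsPrefix": "ERROR_DIRECTORY"}
      }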
      
    3. Import your catalog information to the Retail API by making a POST request to the Products:import REST method, providing the name of the data file (here, shown as input.json).

      curl -X POST \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" -d @./input.json \
      "https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products:import"
      

      You can check the status programmatically using the API. You should receive a response object that looks something like this:

      {
      "name": "projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/import-products-123456",
      "done": false
      }
      

      The name field is the ID of the operation object. To request the status of the operation, use the name value returned by the import method and poll until the done field returns true:

      curl -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/import-products-123456"
      

      When the operation completes, the returned object has a done value of true, and includes a Status object similar to the following example:

      { "name": "projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/import-products-123456",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.retail.v2.ImportMetadata",
        "createTime": "2020-01-01T03:33:33.000001Z",
        "updateTime": "2020-01-01T03:34:33.000001Z",
        "successCount": "2",
        "failureCount": "1"
      },
      "done": true
      "response": {
      "@type": "type.googleapis.com/google.cloud.retail.v2.ImportProductsResponse",
      },
      "errorsConfig": {
        "gcsPrefix": "gs://error-bucket/error-directory"
      }
      }
      

      You can inspect the files in the error directory in Cloud Storage to see if errors occurred during the import.

Setting up access to your BigQuery dataset

To set up access when your BigQuery dataset is in a different project than your Retail service, complete the following steps.

  1. Open the IAM page in the Cloud Console.

    Open the IAM page

  2. Select your Retail project.

  3. Find the service account with the name Retail Service Account.

    If you have not previously initiated an import operation with the Retail API, this service account might not be listed. If you do not see this service account, return to the import task and initiate the import. When it fails due to permission errors, return here and complete this task.

  4. Copy the identifier for the service account, which looks like an email address (for example, service-525@gcp-sa-retail.iam.gserviceaccount.com).

  5. Switch to your BigQuery project (on the same IAM & Admin page) and click Add.

  6. Enter the identifier for the Retail service account and select the BigQuery > BigQuery User role.

  7. Click Add another role and select BigQuery > BigQuery Data Editor.

    If you do not want to provide the Data Editor role to the entire project, you can add this role directly to the dataset. Learn more.

  8. Click Save.
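
Equivalently, you can grant these roles from the command line. A sketch using gcloud, with BIGQUERY_PROJECT_ID standing in for your BigQuery project and the service account address taken from step 4:

 gcloud projects add-iam-policy-binding BIGQUERY_PROJECT_ID \
   --member="serviceAccount:service-525@gcp-sa-retail.iam.gserviceaccount.com" \
   --role="roles/bigquery.user"

 gcloud projects add-iam-policy-binding BIGQUERY_PROJECT_ID \
   --member="serviceAccount:service-525@gcp-sa-retail.iam.gserviceaccount.com" \
   --role="roles/bigquery.dataEditor"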

Importing catalog data from Cloud Storage

To import catalog data in JSON format, you create one or more JSON files that contain the catalog data you want to import, and upload them to Cloud Storage. From there, you can import the data to the Retail API.

For an example of the JSON product item format, see Product item JSON data format.

For help with uploading files to Cloud Storage, see Uploading objects.
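
For example, you can copy your JSON files to a bucket with gsutil; the bucket and file names here are placeholders:

 gsutil cp products_0001.json products_0002.json gs://my-catalog-bucket/import/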

  1. Make sure the Retail service account has permission to read and write to the bucket.

    The Retail service account is listed on the IAM page in the Cloud Console with the name Retail Service Account. Use the service account's identifier, which looks like an email address (for example, service-525@gcp-sa-retail.iam.gserviceaccount.com), when adding the account to your bucket permissions.

  2. Import your catalog data to the Retail API.

    Console

    1. Go to the Retail Data page in the Google Cloud Console.

      Go to the Data page

    2. Click Import to open the Import panel.

    3. Choose Product catalog.

    4. Select the branch you will upload your catalog to.

    5. Select Cloud Storage as your data source and Retail Product Catalogs Schema as the schema.

    6. Enter the Cloud Storage location of your data.

    7. (Optional) Enter the location of a Cloud Storage bucket in your project as a temporary location for your data.

      If not specified, a default location is used. If you specify a location, it must be a Cloud Storage bucket in the same region as the bucket that contains your catalog data.

    8. (Optional) Choose whether to schedule a recurring upload of your catalog.

      This option is only available if you have successfully imported your catalog to the Retail API at least once before.

    9. If this is the first time you are importing your catalog, select your product levels. Learn more about product levels.

      Changing product levels after you have imported any data requires a significant effort.

    10. Click Import.

    curl

    1. If this is the first time you are uploading your catalog, set your product levels by using the Catalog.patch method. Learn more about product levels.

      curl -X PATCH \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" \
       --data '{
         "productLevelConfig": {
           "ingestionProductType": "PRODUCT_TYPE",
           "merchantCenterProductIdField": "PRODUCT_ID_FIELD"
         }
       }' \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog"
      
    2. Create a data file for the input parameters for the import. Use the GcsSource object to point to your Cloud Storage bucket.

      You can provide multiple files, or just one; this example uses two files.

      • INPUT_FILE: A file or files in Cloud Storage containing your catalog data.
      • ERROR_DIRECTORY: A Cloud Storage directory for error information about the import.

      The input file fields must be in the format gs://<bucket>/<path-to-file>. The error directory must be in the format gs://<bucket>/<folder>/. If the error directory does not exist, the Retail API creates it. The bucket must already exist.

      {
        "inputConfig": {
          "gcsSource": {
            "inputUris": ["INPUT_FILE_1", "INPUT_FILE_2"]
          }
        },
        "errorsConfig": {"gcsPrefix": "ERROR_DIRECTORY"}
      }
      
    3. Import your catalog information to the Retail API by making a POST request to the Products:import REST method, providing the name of the data file (here, shown as input.json).

      curl -X POST \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      -H "Content-Type: application/json; charset=utf-8" -d @./input.json \
      "https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products:import"
      

      The easiest way to check the status of your import operation is to use the Cloud Console. For more information, see Seeing status for a specific integration operation.

      You can also check the status programmatically using the API. You should receive a response object that looks something like this:

      {
      "name": "projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/import-products-123456",
      "done": false
      }
      

      The name field is the ID of the operation object. Request the status of the operation by replacing [OPERATION_NAME] with the name value returned by the import method, and poll until the done field returns true:

      curl -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/[OPERATION_NAME]"
      

      When the operation completes, the returned object has a done value of true, and includes a Status object similar to the following example:

      { "name": "projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/operations/import-products-123456",
      "metadata": {
        "@type": "type.googleapis.com/google.cloud.retail.v2.ImportMetadata",
        "createTime": "2020-01-01T03:33:33.000001Z",
        "updateTime": "2020-01-01T03:34:33.000001Z",
        "successCount": "2",
        "failureCount": "1"
      },
      "done": true
      "response": {
      "@type": "type.googleapis.com/google.cloud.retail.v2.ImportProductsResponse",
      },
      "errorsConfig": {
        "gcsPrefix": "gs://error-bucket/error-directory"
      }
      }
      

      You can inspect the files in the error directory in Cloud Storage to see what kind of errors occurred during the import.

Importing catalog data inline

curl

You import your catalog information to the Retail API inline by making a POST request to the Products:import REST method, using the productInlineSource object to specify your catalog data.

For an example of the JSON product item format, see Product item JSON data format.

  1. Create the JSON file for your product and call it ./data.json:

    {
      "inputConfig": {
        "productInlineSource": {
          "products": [
            {
              <product1>
            },
            {
              <product2>
            },
            ....
          ]
        }
      }
    }
    
  2. Call the POST method:

    curl -X POST \
     -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
     -H "Content-Type: application/json; charset=utf-8" \
     --data @./data.json \
    "https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products:import"
    

Java

// Imports products inline and returns the name of the long-running
// import operation.
public static String importProductsFromInlineSource(
    List<Product> productsToImport)
    throws IOException, InterruptedException, ExecutionException {
  // getProductServiceClient() is a helper defined elsewhere in this sample.
  ProductServiceClient productClient = getProductServiceClient();

  // Wrap the products in an inline source instead of pointing at
  // BigQuery or Cloud Storage.
  ProductInlineSource inlineSource = ProductInlineSource.newBuilder()
      .addAllProducts(productsToImport)
      .build();

  ProductInputConfig inputConfig = ProductInputConfig.newBuilder()
      .setProductInlineSource(inlineSource)
      .build();

  // INCREMENTAL mode creates new products and updates existing ones,
  // but never deletes products that are absent from the request.
  ImportProductsRequest importRequest = ImportProductsRequest.newBuilder()
      .setParent(IMPORT_PARENT)
      .setRequestId(REQUEST_ID)
      .setReconciliationMode(ReconciliationMode.INCREMENTAL)
      .setInputConfig(inputConfig)
      .build();

  String operationName = productClient
      .importProductsAsync(importRequest).getName();

  productClient.shutdownNow();
  productClient.awaitTermination(2, TimeUnit.SECONDS);

  return operationName;
}

Product item JSON data format

Your JSON file should look like the following examples. The line breaks are for readability; in your import file, each product item must appear on a single line of its own.

Minimum required fields:

{
  "id": "1234",
  "categories": ["Apparel & Accessories > Shoes"],
  "title": "ABC sneakers"
}
{
  "id": "5839",
  "categories": ["casual attire > t-shirts"],
  "title": "Crew t-shirt"
}

Complete object:

{
  "name": "projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products/1234",
  "id": "1234",
  "categories": "Apparel & Accessories > Shoes",
  "title": "ABC sneakers",
  "description": "Sneakers for the rest of us",
  "attributes": { "vendor": {"text": ["vendor123", "vendor456"]} },
  "language_code": "en",
  "tags": [ "black-friday" ],
  "priceInfo": {"currencyCode": "USD", "price":100, "originalPrice":200, "cost": 50},
  "availableTime": "2020-01-01T03:33:33.000001Z",
  "availableQuantity": "1",
  "uri":"http://example.com",
  "images": [{"uri": "http://example.com/img1", "height": 320, "width": 320 }]
}
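
Before importing, it can be worth checking that every product entry in your file parses cleanly. A minimal sketch, assuming the jq tool is installed and the file is named products.json; the command exits non-zero on malformed JSON and also when any entry is missing the required id field:

jq -e -c '.id' products.json > /dev/null && echo "products.json parses cleanly"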

Historical catalog data

The Retail API supports importing and managing historical catalog data. Historical catalog data can be helpful when you use historical user events for model training. The Retail API can use past product information to enrich historical user event data and improve model accuracy.

Historical products are stored as expired products. They are not returned in search responses, but are visible to the Update, List, and Delete API calls.

Import historical catalog data

When a product's expireTime field is set to a past timestamp, the product is considered a historical product. Set the product's availability to OUT_OF_STOCK to avoid impacting Recommendations AI.

We recommend using the following methods for importing historical catalog data:

Calling the Product.Create method

Use the Product.Create method to create a Product entry with the expireTime field set to a past timestamp.
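
A minimal sketch of such a call using the products.create REST method (timestamps in the REST API are RFC 3339 strings); the product ID, title, category, and expiry are placeholder values:

 curl -X POST \
 -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
 -H "Content-Type: application/json; charset=utf-8" \
 --data '{
    "title": "A historical product",
    "categories": ["Apparel & Accessories > Shoes"],
    "availability": "OUT_OF_STOCK",
    "expireTime": "2020-01-01T00:00:00Z"
 }' \
 "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/branches/0/products?productId=historical_product_001"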

Inline import expired products

The steps are identical to those for regular inline import, except that the products have their expireTime field set to a past timestamp.

An example of the ./data.json used in the inline import request:

{
  "inputConfig": {
    "productInlineSource": {
      "products": [
        {
          "id": "historical_product_001",
          "title": "A historical product",
          "expire_time": {
            "second": 1000000000  // a past timestamp
          }
        },
        {
          <Another product>
        },
        ....
      ]
    }
  }
}

Import expired products from BigQuery or Cloud Storage

This procedure is similar to importing regular products from BigQuery or Cloud Storage. However, make sure to set the expireTime field to a past timestamp.

Keeping your catalog up to date

The Retail API relies on having current product information to provide you with the best results. We recommend that you import your catalog on a daily basis to ensure that your catalog is current. You can use Google Cloud Scheduler to schedule imports, or choose an automatic scheduling option when you import data using the Cloud Console.
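
As a sketch of the Cloud Scheduler approach, the following gcloud command creates a job that triggers a BigQuery import every day at 02:00. The schedule, service account, and request body are placeholders to replace with your own values:

 gcloud scheduler jobs create http import_catalog \
   --schedule="0 2 * * *" \
   --http-method=POST \
   --uri="https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/branches/0/products:import" \
   --oauth-service-account-email="[SERVICE_ACCOUNT_EMAIL]" \
   --message-body='{"inputConfig":{"bigQuerySource":{"datasetId":"DATASET_ID","tableId":"TABLE_ID","dataSchema":"product"}}}'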

You can update only new or changed product items, or you can import the entire catalog. If you import products that are already in your catalog, they are not added again. Any item that has changed is updated.

To update a single item, see Updating catalog information.

Batch updating

You can use the import method to batch update your catalog. You do this the same way you do the initial import; follow the steps in Importing catalog data.
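
The Java sample earlier on this page sets ReconciliationMode.INCREMENTAL explicitly; in a REST request body, the equivalent field is reconciliationMode. Incremental mode creates new products and updates existing ones without deleting anything. A sketch of an import request body that makes the mode explicit:

 {
   "inputConfig": {
     "bigQuerySource": {
       "datasetId": "DATASET_ID",
       "tableId": "TABLE_ID",
       "dataSchema": "product"
     }
   },
   "reconciliationMode": "INCREMENTAL"
 }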

Monitoring import health

To ensure there are no errors for your imported data, you can check the data load metrics for your catalog on the Retail Data page. The Retail Data page also shows quality metrics for the product data in your catalog.

Keeping your catalog up to date is important for getting high-quality results. You should monitor the import error rates and take action if needed. For more information, see Setting up alerts.

What's next