This is the documentation for Recommendations AI only. To try Retail Search and the unified Retail console in the restricted GA phase, contact Cloud sales. If you are not planning to use Retail Search, remain on the Recommendations console until further notice.

If you are using the v1beta version of Recommendations AI, migrate to the Retail API version.

Importing historical user events

When you create a new recommendation model, that model requires user event data for training. The amount of data required depends on the recommendation type and the optimization objective. See User event data requirements.

After you've set up real-time event recording, it can take a considerable amount of time to record sufficient user event data to train your models. You can accelerate initial model training by importing user event data from past events in bulk. Before doing so, review the best practices for recording user events.

You can:

  • Import user events from Cloud Storage.
  • Import user events inline.
  • Import user events from BigQuery.
  • Import Google Analytics 360 user events with BigQuery.

Importing user events from Cloud Storage

To import user events from Cloud Storage, create a file containing JSON user event data (a sketch of such a file and the corresponding import request follows the list below).

Make sure your JSON file:

  • Is formatted according to its user event type. See User events for examples of each user event type format.
  • Provides an entire user event on a single line, with no line breaks.
  • Has each user event on its own line.
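
For illustration, here is a rough sketch of such a file and of an import request that reads it from Cloud Storage. The bucket path, visitor IDs, product ID, and timestamps are placeholders, and the request assumes the gcsSource input configuration with the user_event data schema; adapt it to your own project and bucket.

A placeholder events file such as gs://YOUR_BUCKET/user_events.json contains one event per line:

{"eventType": "detail-page-view", "visitorId": "visitor-123", "eventTime": "2021-06-15T08:27:45Z", "productDetails": [{"product": {"id": "product-123"}}]}
{"eventType": "home-page-view", "visitorId": "visitor-456", "eventTime": "2021-06-15T08:29:02Z"}

A sketch of the corresponding import request:

# Placeholder bucket and project values; adjust before running.
curl -X POST \
     -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
     -H "Content-Type: application/json; charset=utf-8" \
     --data '{
       "inputConfig": {
         "gcsSource": {
           "inputUris": ["gs://YOUR_BUCKET/user_events.json"],
           "dataSchema": "user_event"
         }
       }
     }' \
  "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/userEvents:import"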

Importing user events inline


You import user events inline by including the data for the events in your call to the userEvents.import method.

The easiest way to do this is to put your user event data into a JSON file and provide the file to cURL (a filled-in example appears after the steps below).

For the formats of the user event types, see User events.

  1. Create the JSON file:

    {
      "inputConfig": {
        "userEventInlineSource": {
          "userEvents": [
            {
              <userEvent1>
            },
            {
              <userEvent2>
            },
            ...
          ]
        }
      }
    }
    
  2. Call the POST method:

    curl -X POST \
         -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
         -H "Content-Type: application/json; charset=utf-8" \
         --data @./data.json \
      "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/userEvents:import"
    

Importing user events from BigQuery

Import user events from BigQuery using the userEvents.import method.

Your user events must be in BigQuery tables that are correctly formatted for Recommendations AI ingestion. The table schema differs depending on the user event type. See User event types and example schemas for the JSON schema to use when creating BigQuery tables for each event type.

When importing your events, use the value user_event for dataSchema.

export GOOGLE_APPLICATION_CREDENTIALS=/tmp/my-key.json

curl \
  -v \
  -X POST \
  -H "Content-Type: application/json; charset=utf-8" \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/userEvents:import" \
  --data '{
    "inputConfig": {
      "bigQuerySource": {
        "datasetId": "DATASET-ID",
        "tableId": "TABLE-ID",
        "dataSchema": "user_event"
      }
    }
  }'

If your BigQuery dataset belongs to a different project from your Recommendations AI project, then follow the instructions in Setting up access to your BigQuery dataset to give your Recommendations AI service account the BigQuery Data Editor role for your BigQuery project. Modify the import request to specify the BigQuery project ID:

"bigQuerySource": {
     "projectId": "BQ_PROJECT_ID",
   }
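
Put together, a cross-project import request might look like the following sketch, where BQ_PROJECT_ID, DATASET-ID, and TABLE-ID are placeholders for your own values:

# Placeholder project, dataset, and table values; adjust before running.
curl -X POST \
  -H "Content-Type: application/json; charset=utf-8" \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/userEvents:import" \
  --data '{
    "inputConfig": {
      "bigQuerySource": {
        "projectId": "BQ_PROJECT_ID",
        "datasetId": "DATASET-ID",
        "tableId": "TABLE-ID",
        "dataSchema": "user_event"
      }
    }
  }'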

Importing Google Analytics 360 user events with BigQuery

You can import Google Analytics 360 user events if you have integrated Google Analytics 360 with BigQuery and use Enhanced Ecommerce.

The following procedures assume you are familiar with using BigQuery and Google Analytics 360.

Before you begin the next steps, make sure:

  • You have integrated Google Analytics 360 with BigQuery.
  • You are using Enhanced Ecommerce.

Check your data source

  1. Make sure the user event data you will import is correctly formatted in a BigQuery table you have access to.

    The table should have the format project_id:ga360_export_dataset.ga_sessions_YYYYMMDD.

    See the Google Analytics documentation for more about the table format.

  2. In the BigQuery Google Cloud Console, select the table from Resources to preview the table, and check that:

    • The clientId column has a valid value (for example, 123456789.123456789).

      Note that this value is different from the full _ga cookie value (which has a format such as GA1.3.123456789.123456789).

    • The hits.transaction.currencyCode column has a valid currency code. (A command-line query for spot-checking these columns appears after this procedure.)

  3. Check the consistency of item IDs between the uploaded catalog and the Analytics 360 user event table.

    Using any product ID from the hits.product.productSKU column in the BigQuery table preview, use the product.get method to make sure the same product is in your uploaded catalog.

    export GOOGLE_APPLICATION_CREDENTIALS=/tmp/my-key.json

    curl \
      -v \
      -X GET \
      -H "Content-Type: application/json; charset=utf-8" \
      -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
      "https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/default_branch/products/PRODUCT_ID"
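
If you prefer the command line to the console preview in step 2, a query along the following lines can spot-check the clientId and currencyCode columns. This is a sketch that assumes the bq command-line tool, standard SQL, and placeholder project, dataset, and table names:

# Sketch: spot-check clientId and currencyCode in a GA360 export table.
# PROJECT_ID, the dataset name, and the table date suffix are placeholders.
bq query --use_legacy_sql=false '
  SELECT
    s.clientId,
    h.transaction.currencyCode AS currencyCode
  FROM
    `PROJECT_ID.ga360_export_dataset.ga_sessions_20210101` AS s,
    UNNEST(s.hits) AS h
  LIMIT 10'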

Import your Google Analytics 360 events

Import your user events by including the data for the events in your call to the userEvents.import method.

For dataSchema, use the value user_event_ga360.

export GOOGLE_APPLICATION_CREDENTIALS=/tmp/my-key.json

curl \
  -v \
  -X POST \
  -H "Content-Type: application/json; charset=utf-8" \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  "https://retail.googleapis.com/v2/projects/[PROJECT_ID]/locations/global/catalogs/default_catalog/userEvents:import" \
  --data '{
    "inputConfig": {
      "bigQuerySource": {
        "datasetId": "some_ga360_export_dataset",
        "tableId": "ga_sessions_YYYYMMDD",
        "dataSchema": "user_event_ga360"
      }
    }
  }'

If your BigQuery dataset belongs to a different project from your Recommendations AI project, then follow the instructions in Setting up access to your BigQuery dataset to give your Recommendations AI service account the BigQuery Data Editor role for your BigQuery project. Modify the import request to specify the BigQuery project ID:

"bigQuerySource": {
     "projectId": "GA360_BQ_PROJECT_ID",
   }

What's next