When you create a new recommendation model, that model requires user event data for training. The amount of data required depends on the recommendation type and the optimization objective. See User event data requirements.
After you've set up event recording, it can take a considerable amount of time to record sufficient user event data to train your models. You can accelerate initial model training by importing user event data from past events. Before doing so, review the best practices for recording user events.
You can import your user events:
- From Cloud Storage.
- From BigQuery, if you have Enhanced Ecommerce and Google Analytics 360.
- Inline, with the eventStores.userEvents.import method.
Importing user events from Cloud Storage
To import user events from Cloud Storage, create a file containing JSON data.
Make sure your JSON file:
- Is formatted according to its user event type. See User events for examples of each user event type format.
- Provides an entire user event on a single line, with no line breaks.
- Has each user event on its own line.
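Before uploading, you can sanity-check the one-event-per-line requirement locally. The following is a minimal sketch, not part of the API; the `eventType` check reflects the user event formats described in User events, and the function name is illustrative:

```python
import json

def validate_user_events_file(path):
    """Check that every line of a user events file is a complete JSON object.

    Returns a list of (line_number, error) tuples; an empty list means the
    file satisfies the one-event-per-line format described above.
    """
    errors = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                errors.append((lineno, "blank line"))
                continue
            try:
                event = json.loads(line)
            except json.JSONDecodeError as exc:
                errors.append((lineno, f"invalid JSON: {exc}"))
                continue
            if "eventType" not in event:
                errors.append((lineno, "missing eventType field"))
    return errors
```

A file that fails this check would also fail the Cloud Storage import, so catching formatting problems locally saves a round trip.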
Importing user events inline
You import user events inline by including the data for the events in your call to the eventStores.userEvents.import method.
The easiest way to do this is to put your user event data into a JSON file and provide the file to cURL.
For the formats of the user event types, see User events.
Create the JSON file:
{
  "inputConfig": {
    "userEventInlineSource": {
      "userEvents": [
        { <userEvent1> },
        { <userEvent2> },
        ....
      ]
    }
  }
}
Call the POST method:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  --data @./data.json \
  "https://recommendationengine.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/catalogs/default_catalog/eventStores/default_event_store/userEvents:import"
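If you assemble the inline payload programmatically, the envelope is the same `inputConfig` structure shown above. A hedged sketch; the event contents here are illustrative only (see User events for the real per-type formats):

```python
import json

def build_inline_import_payload(user_events):
    """Wrap a list of user event dicts in the inputConfig envelope
    expected by the userEvents:import method."""
    return {
        "inputConfig": {
            "userEventInlineSource": {
                "userEvents": list(user_events)
            }
        }
    }

# Illustrative events only, not complete user event objects.
events = [
    {"eventType": "detail-page-view", "userInfo": {"visitorId": "visitor-1"}},
    {"eventType": "home-page-view", "userInfo": {"visitorId": "visitor-2"}},
]
payload = build_inline_import_payload(events)
print(json.dumps(payload, indent=2))
```

Writing `payload` to `data.json` gives you the file that the curl command above sends with `--data @./data.json`.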
Importing Google Analytics 360 user events with BigQuery
You can import user events with BigQuery if you have integrated Google Analytics 360 with BigQuery and use Enhanced Ecommerce.
The following procedures assume you are familiar with using BigQuery and Google Analytics 360.
Before you begin the next steps, make sure:
- You're using Enhanced Ecommerce.
- You have BigQuery linked to Analytics 360.
Check your data source
Make sure the user event data you will import is correctly formatted in a BigQuery table you have access to.
The table should have the format project_id:ga360_export_dataset.ga_sessions_YYYYMMDD. See the Google Analytics documentation for more about the table format.
In the BigQuery Google Cloud console, select the table from Resources to preview the table, and check that:
- The clientId column has a valid value (for example, 123456789.123456789). Note that this value is different from the full _ga cookie value (which has a format such as GA1.3.123456789.123456789).
- The hits.transaction.currencyCode column has a valid currency code.
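The clientId check above can be expressed as a simple pattern test. A hedged sketch; the patterns below reflect the formats described above (two dot-separated numeric components for clientId, a GA1.3.-style prefix on the full _ga cookie), not an official Analytics specification:

```python
import re

# clientId as exported to BigQuery: two dot-separated numeric components.
CLIENT_ID_RE = re.compile(r"^\d+\.\d+$")
# Full _ga cookie value: version/domain-depth prefix plus the client ID.
GA_COOKIE_RE = re.compile(r"^GA\d+\.\d+\.\d+\.\d+$")

def looks_like_client_id(value):
    return bool(CLIENT_ID_RE.match(value))

def client_id_from_ga_cookie(cookie):
    """Strip the GA1.3.-style prefix from a _ga cookie value
    to recover the clientId portion."""
    if not GA_COOKIE_RE.match(cookie):
        raise ValueError("not a _ga cookie value")
    return ".".join(cookie.split(".")[-2:])
```

This makes the distinction concrete: a full cookie value in the clientId column is a sign the export is not formatted as expected.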
Check the consistency of item IDs between the uploaded catalog and the Analytics 360 user event table:
- Use the catalogItems.list method to get catalog items. Check the id field of the returned catalog items. (Ignore the itemGroupId field.)

  export GOOGLE_APPLICATION_CREDENTIALS=/tmp/my-key.json
  curl \
    -v \
    -X GET \
    -H "Content-Type: application/json; charset=utf-8" \
    -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
    "https://recommendationengine.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/catalogs/default_catalog/catalogItems"

- Compare the IDs in the list results to the hits.product.productSKU column in the BigQuery table preview.
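The comparison step can be automated once you have both ID sets in hand. A hedged sketch that assumes you have already fetched the catalog items (as returned by catalogItems.list) and the productSKU values into Python lists; the function name is illustrative:

```python
def find_unmatched_skus(catalog_items, product_skus):
    """Return the productSKU values from the Analytics 360 table that do not
    appear as an id in the uploaded catalog. Events referencing unmatched
    IDs cannot be joined to catalog items during import."""
    catalog_ids = {item["id"] for item in catalog_items}
    return sorted(set(product_skus) - catalog_ids)
```

An empty result means every SKU in the event table has a matching catalog item; a large result suggests an ID mismatch between the two sources that is worth resolving before importing.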
Import your events
Import your user events by including the data for the events in your call
to the eventStores.userEvents.import
method.
For dataSchema, use the value user_events_ga360.
export GOOGLE_APPLICATION_CREDENTIALS=/tmp/my-key.json
curl \
-v \
-X POST \
-H "Content-Type: application/json; charset=utf-8" \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
"https://recommendationengine.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/catalogs/default_catalog/eventStores/default_event_store/userEvents:import" \
--data '{
"inputConfig": {
"bigQuerySource": {
"datasetId": "some_ga360_export_dataset",
"tableId": "ga_sessions_YYYYMMDD",
"dataSchema": "user_events_ga360"
}
}
}'
If your BigQuery dataset belongs to a different project from your Recommendations AI project, follow the instructions in Setting up access to your BigQuery dataset to give your Recommendations AI service account the BigQuery Data Editor role for your BigQuery project. Then modify the import request to specify the BigQuery project ID:
"bigQuerySource": {
  "projectId": "[GA360_BQ_PROJECT_ID, if different from PROJECT_ID]",
  ...
}