Configuration for how a human labeling task should be done.
Required. A human-readable name for the AnnotatedDataset, defined
by users. Maximum of 64 characters.
Optional. A human-readable label used to logically group
labeling tasks. This string must match the regular expression
[a-zA-Z\d_-]{0,128}.
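The label group constraint above can be checked client-side before submitting a request. The following is a minimal sketch of such a validation; the function name is illustrative, not part of any API.

```python
import re

# Pattern from the field description: letters, digits, underscores,
# and hyphens only, with a maximum length of 128 characters.
LABEL_GROUP_RE = re.compile(r"[a-zA-Z\d_-]{0,128}")


def is_valid_label_group(label_group: str) -> bool:
    """Return True if the string may be used as a label group."""
    return LABEL_GROUP_RE.fullmatch(label_group) is not None


print(is_valid_label_group("traffic-signs_v2"))  # True
print(is_valid_label_group("has spaces"))        # False
print(is_valid_label_group("x" * 129))           # False (too long)
```

An empty string also matches, since the field is optional and the pattern allows zero characters.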
Optional. Replication of questions. Each question will be sent
to up to this number of contributors to label. Aggregated
answers will be returned. Defaults to 1. For image-related
labeling, valid values are 1, 3, and 5.
Optional. If you want your own labeling contributors to manage
and work on this labeling request, you can set those
contributors here. We will give them access to the question
types in crowdcompute. Note that these emails must be
registered in the crowdcompute worker UI:
https://crowd-compute.appspot.com/
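Putting the fields above together, a labeling task configuration might look like the following sketch. The key names here are illustrative assumptions for this example, not confirmed API identifiers; the constraints in the comments come from the field descriptions above.

```python
# Hypothetical human labeling task configuration, expressed as a plain
# dict. Key names are assumptions; values illustrate the documented rules.
human_annotation_config = {
    # Required: human-readable AnnotatedDataset name, max 64 characters.
    "annotated_dataset_display_name": "street-signs-batch-01",
    # Optional: logical grouping label, must match [a-zA-Z\d_-]{0,128}.
    "label_group": "traffic-signs_v2",
    # Optional: each question goes to up to this many contributors;
    # defaults to 1. For image-related labeling, valid values are 1, 3, 5.
    "replica_count": 3,
    # Optional: your own contributors' emails. They must already be
    # registered in the crowdcompute worker UI.
    "contributor_emails": [
        "labeler1@example.com",
        "labeler2@example.com",
    ],
}

# Sanity checks mirroring the documented constraints.
assert len(human_annotation_config["annotated_dataset_display_name"]) <= 64
assert human_annotation_config["replica_count"] in (1, 3, 5)
```

Setting the replica count to 3 means each question is labeled by up to three contributors and their answers are aggregated into one result.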
Last updated 2024-11-19 UTC.