Class HumanAnnotationConfig (1.9.0rc0)

HumanAnnotationConfig(mapping=None, *, ignore_unknown_fields=False, **kwargs)

Configuration for how a human labeling task should be done.

Attributes

Name | Description
instruction str
Required. Instruction resource name.
annotated_dataset_display_name str
Required. A human-readable name for the AnnotatedDataset, defined by users. Maximum of 64 characters.
annotated_dataset_description str
Optional. A human-readable description for the AnnotatedDataset. The description can be up to 10000 characters long.
label_group str
Optional. A human-readable label used to logically group labeling tasks. This string must match the regular expression [a-zA-Z\d_-]{0,128}.
language_code str
Optional. The language of this question, as a BCP-47 language code. Default value is en-US. Set this only when the task is language related, for example, French text classification.
replica_count int
Optional. Replication of questions. Each question will be sent to up to this number of contributors to label, and aggregated answers will be returned. Default is 1. For image-related labeling, valid values are 1, 3, and 5.
question_duration google.protobuf.duration_pb2.Duration
Optional. Maximum duration for contributors to answer a question. Maximum is 3600 seconds. Default is 3600 seconds.
contributor_emails MutableSequence[str]
Optional. If you want your own labeling contributors to manage and work on this labeling request, you can set those contributors here. We will give them access to the question types in crowdcompute. Note that these emails must be registered in the crowdcompute worker UI: https://crowd-compute.appspot.com/
user_email_address str
Email of the user who started the labeling task and should be notified by email. If empty, no notification will be sent.
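The constraints documented above can be sketched as client-side checks. The helper below is hypothetical and not part of the library; the real HumanAnnotationConfig is a protobuf message and does not perform this validation locally, so the field names are taken from the table while the function itself is an illustration:

```python
import re

# Pattern from the label_group documentation above.
LABEL_GROUP_RE = re.compile(r"[a-zA-Z\d_-]{0,128}")


def validate_config(instruction, annotated_dataset_display_name,
                    annotated_dataset_description="", label_group="",
                    replica_count=1, question_duration_seconds=3600):
    """Hypothetical pre-flight check mirroring the documented constraints."""
    if not instruction:
        raise ValueError("instruction is required")
    if not 1 <= len(annotated_dataset_display_name) <= 64:
        raise ValueError("annotated_dataset_display_name must be 1-64 characters")
    if len(annotated_dataset_description) > 10000:
        raise ValueError("annotated_dataset_description can be at most 10000 characters")
    if not LABEL_GROUP_RE.fullmatch(label_group):
        raise ValueError(r"label_group must match [a-zA-Z\d_-]{0,128}")
    if replica_count < 1:
        raise ValueError("replica_count must be at least 1")
    # For image-related labeling the valid replica counts are 1, 3, 5;
    # not enforced here because the task type is not known to this helper.
    if question_duration_seconds > 3600:
        raise ValueError("question_duration maximum is 3600 seconds")
    return True
```

A call such as `validate_config("projects/p/instructions/i", "my-dataset", replica_count=3)` would pass, while an empty instruction or a 65-character display name would raise a ValueError before any request is sent.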