You can request multiple human labelers to annotate each piece of your data. In cases where there is disagreement on labeling, we will get additional opinions from the other labelers until there is consensus or we have reached the maximum number of labelers that you have set.
For example, if you request 3 labelers:
- For image classification tasks, we will have all 3 labelers classify each image and use the majority vote as the final answer (see the sketch after this list).
- For image bounding box tasks, we will have the first labeler draw the boxes and the second labeler review them. If the second labeler disagrees and makes edits, we will send the task to the third labeler to obtain a majority opinion.
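To make the classification consensus concrete, here is a minimal sketch of majority voting over per-image answers. The function name and data shapes are illustrative assumptions, not part of the Data Labeling API:

```python
from collections import Counter

def majority_vote(labels):
    """Return the label chosen by most labelers, or None on a tie.

    `labels` is the list of answers submitted for one image,
    e.g. ["cat", "cat", "dog"] from three labelers.
    """
    if not labels:
        return None
    counts = Counter(labels).most_common(2)
    if len(counts) == 1 or counts[0][1] > counts[1][1]:
        return counts[0][0]
    return None  # tie: no consensus among the labelers

# Example: three labelers, two agree, so "cat" wins.
print(majority_vote(["cat", "dog", "cat"]))  # -> cat
```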
In addition, we encourage you to ramp up your data labeling jobs incrementally. Start your first labeling job with a small amount of data, and check whether the results are what you expect. Revise your instructions according to the feedback and results you receive, then create subsequent jobs, iterating until you are comfortable sending larger quantities of data. This will help you get high-quality results and make the best use of your budget.
While the operation is running, you see a `progressPercent` field indicating the progress (if the field is not shown, the progress is 0%). When the operation is complete, the response includes `"done": true`. You also receive an email whenever an operation completes.
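As an illustration, the sketch below polls a long-running operation over REST and reads these fields. The operation URL and access token are placeholder assumptions you would replace with your own values; treating a missing `progressPercent` as 0% follows the behavior described above:

```python
import time

import requests  # third-party HTTP client: pip install requests

# Placeholders; substitute your real operation name and access token.
OPERATION_URL = (
    "https://datalabeling.googleapis.com/v1beta1/"
    "projects/sample_project_id/operations/sample_operation_id"
)
ACCESS_TOKEN = "ya29...."  # e.g. from `gcloud auth print-access-token`

def wait_for_operation(url, token, poll_seconds=30):
    """Poll the operation until the response includes "done": true."""
    headers = {"Authorization": f"Bearer {token}"}
    while True:
        op = requests.get(url, headers=headers).json()
        # A missing progressPercent means the progress is 0%.
        percent = op.get("metadata", {}).get("progressPercent", 0)
        print(f"progress: {percent}%")
        if op.get("done"):
            return op
        time.sleep(poll_seconds)
```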
The annotated dataset's resource name looks like `projects/sample_project_id/datasets/test_dataset_id/annotatedDatasets/sample_id`; the ID is the value that appears after `annotatedDatasets/`.
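If you need the bare ID in a script, one way to extract it is to split the resource name on that path segment. This is an illustrative sketch, not an official helper:

```python
def annotated_dataset_id(resource_name: str) -> str:
    """Extract the ID that follows 'annotatedDatasets/' in a resource name."""
    prefix = "annotatedDatasets/"
    if prefix not in resource_name:
        raise ValueError(f"not an annotated dataset name: {resource_name}")
    return resource_name.split(prefix, 1)[1]

name = "projects/sample_project_id/datasets/test_dataset_id/annotatedDatasets/sample_id"
print(annotated_dataset_id(name))  # -> sample_id
```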
The progress percentage can stay at 0% for a while for either of the following reasons:
- Your task hasn't been picked up yet due to a high volume of requests. The task is queued and will start as soon as possible.
- You requested multiple labelers per item, and no data item has been labeled by all of them yet. For example, if you requested three labelers, a data item is marked complete only after all three labelers have finished labeling it. Even if every data item has been labeled by one or two labelers, the progress percentage remains at 0% (see the sketch after this list).
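The sketch below shows the completion rule described in the second bullet: an item counts toward progress only once it has been labeled by every requested labeler. The data shapes are assumptions for illustration:

```python
def progress_percent(labels_per_item, labelers_requested):
    """Percentage of items labeled by every requested labeler.

    `labels_per_item` maps each data item to how many labelers
    have finished it so far.
    """
    if not labels_per_item:
        return 0
    complete = sum(
        1 for n in labels_per_item.values() if n >= labelers_requested
    )
    return 100 * complete // len(labels_per_item)

# Every item has 2 of 3 requested labels, so progress is still 0%.
print(progress_percent({"img1": 2, "img2": 2, "img3": 2}, 3))  # -> 0
```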