[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-04 (世界標準時間)。"],[[["\u003cp\u003eDocument AI Human-in-the-Loop (HITL) is being deprecated and will no longer be available on Google Cloud after January 16, 2025, and new customers are not currently allowed access, so consider alternative solutions with Google Cloud certified partners.\u003c/p\u003e\n"],["\u003cp\u003eAfter configuring Human Review, managers will receive an email to add labelers to a specialist pool, which involves clicking a link to access the Manager page and manually assigning tasks to the individual labelers.\u003c/p\u003e\n"],["\u003cp\u003eTo initiate the review process, send a document through the created Processor, ensuring that the document either has poor extraction quality or the confidence threshold is set high, as uploading a document through the Google Cloud console will not trigger the review process.\u003c/p\u003e\n"],["\u003cp\u003eLabelers receive an email with a link to access the Interactive AI Human Review User Interface, where they can review, edit, or reject documents based on validation results, and then submit the corrected information, and all corrections will be recorded in the JSON file of the respective document.\u003c/p\u003e\n"],["\u003cp\u003eManagers can review documents either by directly accessing the Cloud Storage location or by using the URI provided in the prediction response, and can also re-allocate routing of future documents.\u003c/p\u003e\n"]]],[],null,["# Quickstart: Complete a Review Task\n==================================\n\n\n| **Caution** : Document AI Human-in-the-Loop is deprecated and will no longer be available on Google Cloud after January 16, 2025. New customers are not allowlisted. If you want to use (HITL) but don't see the option available, contact your Google Account team. \n|\n| To implement a human review and correction solution that meets your requirements, we recommend working with a Google Cloud certified partner like Devoteam, Searce, or Quantiphi. See [Deprecations](/document-ai/docs/deprecation) for details.\n\n\u003cbr /\u003e\n\nEach processor creates a \"HITL task\" that is assigned to a pool of human labelers (called \"Labeler Pool\") that review the documents processed by the processor. Once processed, these documents are queued up in the task for HITL review by the assigned Labeler Pool.\n\nBefore you begin\n----------------\n\nComplete the previous [Quickstart: Configure Human Review](/document-ai/docs/hitl/quickstart) before proceeding.\n\n\u003cbr /\u003e\n\nAdd Labelers\n------------\n\n1. After configuring Human Review, you should receive an email similar to the following:\n\n From: AI Platform \u003cnoreply-aiplatform@google.com\u003e\\\n Subject: Added to AI Platform SpecialistPool Test Labeler Group\n\n Hello AI Platform Customer,\n\n You are added as a manager to specialist pool\n cloudml_data_specialists_us_central1_785484070008756xxxx.\n To view details, visit Manager console\n https://datacompute.google.com/cm/\n\n Sincerely,\n The Google Cloud AI Team\n\n1. Click on the link in the email (or cut and paste into your browser) to navigate to the Manager page:\n\n2. 
3. The task created by the newly created Processor has the same name as the Processor (for example, "Department A Invoices") and is listed in the **Tasks** tab.

4. Click the **Labelers** tab.

5. Click **Add Labeler** and add at least one labeler (email address). A manager can also be a labeler, so you can add yourself.

6. For a newly created Labeler Pool, the task must be assigned to each individual labeler explicitly in the **Assignments** tab. That is, adding the Labeler Pool doesn't automatically assign the task to these new labelers. If the Labeler Pool already has provisioned labelers, the pool is automatically assigned to the task.

Post an Initial Document to Review
----------------------------------

1. Send a document for extraction through the created Processor. Refer to the [How-To Guides](/document-ai/docs/how-to) as needed. **Both online (sync) and batch (async) calls currently support Human Review routing for supported processors.** A minimal code sketch follows this list.

    For test purposes, you can either use a document with known poor extraction quality, so that a confidence score below the threshold triggers Human Review, or set the threshold to 100%. Document limits are generally 5 pages and 20 MB maximum, but check the limits for your specific Processor.
    | **Note:** Uploading a document through the Google Cloud console to test the processor **won't** trigger a Review process.

    There is an API to track each document routed to Human Review.
    As part of the [response from Processors](/document-ai/docs/reference/rest/v1/projects.locations.processors/process), there is a `humanReviewOperation` string that is the Operation ID (job name) of the document within Human Review. This [Long Running Operation](/document-ai/docs/long-running-operations) (also referred to as an LRO) can be queried for status.

    You can query the status of a document routed to Human Review with the [`projects.locations.operations.get`](/document-ai/docs/reference/rest/v1/projects.locations.operations/get) method.
    | **Note:** The [`batchProcess` method](/document-ai/docs/reference/rest/v1/projects.locations.processors/batchProcess#BatchInputConfig) includes a `skipHumanReview` boolean that controls whether the Human Review feature is skipped for the request. It defaults to `false`.

2. You can force a document to be Human Reviewed with the [`reviewDocument`](/document-ai/docs/reference/rest/v1/projects.locations.processors.humanReviewConfig/reviewDocument) method (note that it is Processor-specific); see the second sketch following this list.

    Using the `reviewDocument` method requires that the **Human Review ENABLED** checkbox be selected. Using this API doesn't override the config settings.

    You can follow the [Request Human Review](/document-ai/docs/hitl/request-review) guide to use this method.
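As an illustration, the following minimal Python sketch sends a local PDF through a processor and prints the Human Review status returned in the response. It assumes the `google-cloud-documentai` client library; the project ID, location, processor ID, and file path are placeholder values you would replace with your own.

```python
# Minimal sketch: online (sync) processing, then inspect the Human Review
# status carried in the response. Placeholder values are assumptions.
from google.api_core.client_options import ClientOptions
from google.cloud import documentai_v1 as documentai

PROJECT_ID = "your-project-id"      # placeholder
LOCATION = "us"                     # processor region, for example "us" or "eu"
PROCESSOR_ID = "your-processor-id"  # placeholder
FILE_PATH = "invoice.pdf"           # placeholder

# Use the regional endpoint that matches the processor location.
client = documentai.DocumentProcessorServiceClient(
    client_options=ClientOptions(api_endpoint=f"{LOCATION}-documentai.googleapis.com")
)
name = client.processor_path(PROJECT_ID, LOCATION, PROCESSOR_ID)

with open(FILE_PATH, "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

request = documentai.ProcessRequest(
    name=name,
    raw_document=raw_document,
    # Set skip_human_review=True to bypass Human Review for this request.
    skip_human_review=False,
)
result = client.process_document(request=request)

# If the document was routed to Human Review, the response carries the name of
# the long-running operation, which can be polled with
# projects.locations.operations.get.
status = result.human_review_status
print("Human review state:", status.state.name)
print("Human review operation:", status.human_review_operation)
```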
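The second sketch below forces a review of the processed document with the `reviewDocument` method. It continues from the previous snippet (reusing `client`, `name`, and `result`), assumes Human Review is enabled on the processor, and builds the `humanReviewConfig` resource name from the REST resource path; treat it as an illustration rather than a complete workflow.

```python
# Minimal sketch (continuing from the previous snippet): explicitly send the
# processed document to Human Review. Assumes the Human Review ENABLED
# checkbox is selected on the processor.
review_request = documentai.ReviewDocumentRequest(
    # humanReviewConfig is a child resource of the processor.
    human_review_config=f"{name}/humanReviewConfig",
    inline_document=result.document,
)

operation = client.review_document(request=review_request)

# reviewDocument returns a long-running operation; its name can be polled with
# projects.locations.operations.get until the review completes.
print("Review operation:", operation.operation.name)
```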
Review the document
-------------------

1. If the document triggers the human review process but there are no Labelers in the pool, the manager receives an email similar to the following:

    > Hello AI Platform Customer,
    >
    > There is a running task for Project with Job ID 404298342016955xxxx,
    > Labeling Task ID cloudml_data_us_central1_prod_11b273b4dabdxxxx,
    > Display name
    > datalabeling-job-8005676103232389120-for-hitl-340697311810578xxxx,
    > replication count 1
    > Please see attachment for instruction.
    > Please add labelers and have them finish the task at Labeler console
    > https://datacompute.google.com/w/cloudml_data_specialists_us_central1_7854840700087566336.
    > To view details, visit Manager console
    > https://datacompute.google.com/cm/cloudml_data_specialists_us_central1_7854840700087566336/tasks.
    >
    > Sincerely,
    > The Google Cloud AI Team

    These links are for the Labeling Manager to allocate labeling work.

2. If the labelers are newly added and aren't yet enrolled in the Labeling Group, the Labeling Manager must send a similar email with instructions to the added Labelers.

    | **Note:** The Manager may manually re-allocate routing of future new documents to labeling pools (tasks). See [Manage Labeling Tasks](/document-ai/docs/hitl/manage-labeling-tasks).

Labeler Steps
-------------

### Labeler Procedure

1. Labeler email

    If a labeler is already enrolled in a Labeling Group (see "Add Labelers" above), the system auto-allocates the work and the labeler receives an email. The link in this email is the mechanism for the selected Labeler to do labeling work on the document.

2. Labeling actions

    When a Labeler clicks the link to do the labeling, they see the interactive AI Human Review user interface. The rows highlighted in yellow are failing validation (that is, they are below the configured confidence score threshold) and need review.

3. Review

    Hovering the cursor over a recognized text field shows the recognized text and, in a smaller font below it, the name of the field (schema). The corresponding row is also highlighted in the left panel.

    You can zoom in and out with the magnifier icons at the top left of the toolbar.
    | **Note:** To scroll horizontally, hold down the Shift key and use the mouse scroll wheel. This is not documented elsewhere.

4. Change a field

    Left-click a field to edit it. You may change the value or label in the left panel, then click **Apply**.

    Alternatively, click one of the resize corner icons on the document image to resize the bounding box around the document text. If different text is selected, the text in the field's Value in the left panel changes accordingly. You can further edit this text if needed, then click **Apply**.

    Be sure to review all pages, using the page control in the upper right corner.

After making all needed label changes, click **SUBMIT** (at bottom left). The result JSON will be saved to the "Results location" folder.

If the document cannot be satisfactorily labeled, you may click **REJECT** (at bottom left) and then select a reason for rejecting the document:

- Select a reason, then click **Reject document**.

Labelers may also click the following icons in the upper right corner:

- **Analytics** - shows how many documents they have reviewed (Answers) and their total time.
- **Notifications** - shows any notifications they have received.
- **More (3 dots)** - lets them **Skip** the document, **Find Answers**, or provide **Feedback** (if enabled).
- **Toggle title bar (diagonal arrows)** - hides (or shows) the DataCompute title bar for more room.

Labeling Manager
----------------

### Review Document from Cloud Storage

If you want to check the reviewed document, there are two options:

- Option 1:

  1. Locate the previously configured Cloud Storage location.
  2. Retrieve and download the documents.

- Option 2:

  1. Note the URI of the reviewed document from the prediction response.
  2. Call the URI to retrieve the reviewed document.

| **Note:** Human corrections are recorded in the [`TextChanges`](/document-ai/docs/reference/rest/v1/Document#textchange) construct of the Document JSON.
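For example, the following Python sketch downloads a reviewed Document JSON from Cloud Storage and prints the recorded corrections. It assumes the `google-cloud-storage` and `google-cloud-documentai` client libraries; the bucket name and object path are placeholders for your own configured results location, and the exact output layout depends on your configuration.

```python
# Minimal sketch: download a reviewed Document JSON from the configured
# results location and inspect the human corrections. Bucket and object
# names are placeholder assumptions.
from google.cloud import documentai_v1 as documentai
from google.cloud import storage

BUCKET_NAME = "your-results-bucket"                   # placeholder
OBJECT_PATH = "hitl-results/reviewed-document.json"   # hypothetical path

storage_client = storage.Client()
blob = storage_client.bucket(BUCKET_NAME).blob(OBJECT_PATH)

# Parse the downloaded JSON back into a Document message.
document = documentai.Document.from_json(
    blob.download_as_bytes(), ignore_unknown_fields=True
)

# Human corrections are recorded as TextChanges in the Document.
for change in document.text_changes:
    print("Changed text:", change.changed_text)

# The corrected entities are also available as usual.
for entity in document.entities:
    print(entity.type_, "=", entity.mention_text, f"({entity.confidence:.2f})")
```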
What's next
-----------

- [Manage labeling tasks](/document-ai/docs/hitl/manage-labeling-tasks) using the labeling manager console.
- See a list and descriptions of currently [available processors](/document-ai/docs/processor-overview).

Last updated 2025-09-04 (UTC).