[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-04 (世界標準時間)。"],[],[],null,["# Get explanations\n\nThis guide describes how to get explanations from a `Model` resource on\nVertex AI. You can get explanations in two ways:\n\n- **Online explanations:** Synchronous requests to the Vertex AI API, similar to\n [online inferences](/vertex-ai/docs/predictions/overview) that return\n inferences with feature attributions.\n\n- **Batch explanations:** Asynchronous requests to the Vertex AI API that return\n inferences with feature attributions. Batch explanations are an optional part\n of [batch inference requests](/vertex-ai/docs/predictions/overview).\n\nBefore you begin\n----------------\n\nBefore getting explanations, you must do the following:\n\n1. This step differs depending on what type of machine learning model you use:\n\n - **If you want to get explanations from a custom-trained model,** then\n follow either [Configuring example-based explanations](/vertex-ai/docs/explainable-ai/configuring-explanations-example-based) or\n [Configuring feature-based explanations](/vertex-ai/docs/explainable-ai/configuring-explanations-feature-based)\n to create a `Model` that supports Vertex Explainable AI.\n\n - **If you want to get explanations from an AutoML tabular\n classification or regression model,** then [train an AutoML\n model on a tabular dataset](/vertex-ai/docs/training/automl-console). There is no\n specific configuration required to use Vertex Explainable AI. Explanations for\n forecasting models aren't supported.\n\n - **If you want to get explanations from an AutoML image\n classification model,** then [train an AutoML\n model on an image dataset](/vertex-ai/docs/training/automl-console) and\n [enable explanations when you deploy the model](/vertex-ai/docs/predictions/deploy-model-console).\n There is no specific configuration required to use Vertex Explainable AI.\n Explanations for object detection models aren't supported.\n\n2. If you want to get online explanations, [deploy the `Model` that you\n created in the preceding step to an `Endpoint`\n resource](/vertex-ai/docs/predictions/deploy-model-console).\n\nGet online explanations\n-----------------------\n\nTo get online explanations, follow most of the same steps that you would to get\nonline inferences. 
The following guides provide detailed instructions for preparing and sending
online explanation requests:

- **For AutoML image classification models,**
  read [Getting online inferences from AutoML
  models](/vertex-ai/docs/predictions/overview#get_predictions_from_models).

- **For AutoML tabular classification and regression models,**
  read [Get inferences from AutoML
  models](/vertex-ai/docs/predictions/overview#get_predictions_from_models).

- **For custom-trained models,** read [Getting online inferences from
  custom-trained models](/vertex-ai/docs/predictions/get-online-predictions).

Get batch explanations
----------------------

Only feature-based batch explanations are supported; you cannot get
example-based batch explanations.

To get batch explanations, set the [`generateExplanation`
field](/vertex-ai/docs/reference/rest/v1/projects.locations.batchPredictionJobs#BatchPredictionJob.FIELDS.generate_explanation)
to `true` when you create a batch inference job.

For detailed instructions about preparing and creating batch inference jobs,
read [Getting batch inferences](/vertex-ai/docs/predictions/batch-predictions-automl).
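For example, with the Vertex AI SDK for Python, the corresponding
`generate_explanation` parameter is available on `Model.batch_predict`. The
following is a minimal sketch: the model ID, Cloud Storage paths, and machine
type are placeholders, and the exact parameters you need depend on your model
type.

    from google.cloud import aiplatform

    # Placeholder project, region, and model ID.
    aiplatform.init(project="my-project", location="us-central1")
    model = aiplatform.Model("9876543210")

    # generate_explanation=True corresponds to the generateExplanation field,
    # so each inference in the output includes feature attributions.
    batch_job = model.batch_predict(
        job_display_name="batch-inference-with-explanations",
        gcs_source="gs://my-bucket/input.jsonl",         # placeholder input path
        gcs_destination_prefix="gs://my-bucket/output",  # placeholder output path
        instances_format="jsonl",
        predictions_format="jsonl",
        machine_type="n1-standard-4",  # used for custom-trained models
        generate_explanation=True,
        sync=True,
    )

    print(batch_job.output_info)

With `sync=True`, the call blocks until the job finishes; the output location
then contains the inferences together with their attributions.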
Get concurrent explanations
---------------------------

Vertex Explainable AI supports concurrent explanations. Concurrent
explanations let you request both feature-based and example-based
explanations from the same deployed model endpoint without having to deploy
your model separately for each explanation method.

To get concurrent explanations, upload your model and configure either
[example-based](/vertex-ai/docs/explainable-ai/configuring-explanations-example-based)
or [feature-based](/vertex-ai/docs/explainable-ai/configuring-explanations-feature-based)
explanations. Then, deploy your model as usual.

After the model is deployed, you can request the configured explanations as usual.
Additionally, you can request concurrent explanations by specifying
[`concurrent_explanation_spec_override`](/vertex-ai/docs/reference/rest/v1/projects.locations.endpoints/explain#body.request_body.FIELDS.concurrent_explanation_spec_override).

Note the following when using concurrent explanations:

- Concurrent explanations are available only in the `v1beta1` API version. If you're using the Vertex AI SDK for Python, you need to use the `preview` model to request concurrent explanations.
- You can't request example-based explanations after deploying with feature-based explanations. If you want both example-based and feature-based explanations, deploy your model using example-based explanations and request feature-based explanations through the concurrent explanation field.
- Batch explanations aren't supported for concurrent explanations; online explanations are the only way to use this feature.

Troubleshooting
---------------

This section describes troubleshooting steps that you might find helpful if you
run into problems while getting explanations.

### Error: list index out of range

If you get the following error message when requesting explanations:

    "error": "Explainability failed with exception: list index out of range"

make sure that you aren't passing an empty array into a field that expects an
array of objects. For example, if `field1` accepts an array of objects, the
following request body might result in an error:

    {
      "instances": [
        {
          "field1": []
        }
      ]
    }

Instead, make sure that the array isn't empty, for example:

    {
      "instances": [
        {
          "field1": [
            {}
          ]
        }
      ]
    }

What's next
-----------

- Based on the explanations you receive, learn how to [adjust your `Model` to
  improve explanations](/vertex-ai/docs/explainable-ai/improving-explanations).
- [Try a sample notebook demonstrating Vertex Explainable AI
  on tabular data or image data](https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/notebooks/official/explainable_ai).