Investigate model performance across a range of features in your dataset, optimization strategies, and even manipulations to individual datapoint values using the What-If Tool integrated with Vertex AI.
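The What-If Tool lets you edit a single feature of a datapoint and observe how the prediction shifts. As a toy illustration of that idea (the linear model, weights, and feature values below are illustrative assumptions, not the What-If Tool or Vertex AI API):

```python
# Toy sketch of a What-If Tool-style datapoint manipulation: perturb one
# feature value and compare the model's prediction before and after.
# The stand-in model and its weights are hypothetical.

def predict(datapoint, weights, bias):
    """A stand-in linear scorer, clipped to [0, 1]."""
    score = bias + sum(w * x for w, x in zip(weights, datapoint))
    return max(0.0, min(1.0, score))

weights = [0.3, -0.2, 0.5]   # hypothetical learned weights
bias = 0.1

original = [1.0, 2.0, 0.5]
perturbed = list(original)
perturbed[1] = 0.0           # manipulate a single feature value

delta = predict(perturbed, weights, bias) - predict(original, weights, bias)
print(f"prediction change from editing feature 1: {delta:+.2f}")
```

In the actual tool this loop happens interactively in a notebook or the Vertex AI console, against your deployed model rather than a local function.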
Sample predictions from trained machine learning models deployed to Vertex AI, and provide ground truth labels for those prediction inputs using the continuous evaluation capability. Data Labeling Service compares the model predictions with your ground truth labels to help you improve model performance.
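Conceptually, continuous evaluation joins sampled predictions with ground truth labels supplied later and reports a quality metric. A minimal sketch of that comparison (the record shapes and field names here are assumptions for illustration, not the Data Labeling Service API):

```python
# Illustrative sketch: join sampled predictions with ground-truth labels
# by input id, then compute accuracy over the sample.

sampled = [
    {"input_id": "a1", "predicted": "cat"},
    {"input_id": "a2", "predicted": "dog"},
    {"input_id": "a3", "predicted": "cat"},
    {"input_id": "a4", "predicted": "bird"},
]
ground_truth = {"a1": "cat", "a2": "cat", "a3": "cat", "a4": "bird"}

correct = sum(
    1 for p in sampled if ground_truth[p["input_id"]] == p["predicted"]
)
accuracy = correct / len(sampled)
print(f"accuracy over sampled predictions: {accuracy:.2f}")
```

In the managed service this comparison runs on a schedule against live traffic, so metrics like the one above track model quality over time.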
"Understanding how models arrive at their decisions is critical for the use of AI in our industry. We are excited to see the progress made by Google Cloud to solve this industry challenge. With tools like What-If Tool, and feature attributions in AI Platform, our data scientists can build models with confidence, and provide human-understandable explanations."
- Stefan Hoejmose, Head of Data Journeys, Sky
"Introspection of models is essential for both model development and deployment. Oftentimes we tend to focus too much on predictive skill when in reality it’s the more explainable model that is usually the most useful, and more importantly, the most trusted. We are excited to see these new tools made by Google Cloud, supporting both our data scientists and our models’ customers."
- Erik Andrejko, Chief Technology Officer, wellio
"Model interpretability is critical to our ability to optimize AI and solve the problem in the best possible way. Google is pushing the envelope in Explainable AI through research and development. And with Google Cloud, we’re getting tried and tested technologies to solve the challenge of model interpretability and uplevel our data science capabilities."
- Aaron Davis, Chief Data Scientist, Vivint SmartHome
"We are leveraging neural networks to develop capabilities for future products. Easy-to-use, high-quality solutions that improve the training of our deep learning models are a prerogative for our efforts. We are excited to see the progress made by Google Cloud to solve the problem of feature attributions and provide human-understandable explanations of what our models are doing."
- Chris Jones, Chief Technology Officer, iRobot
Increasing transparency with Google Cloud AI Explanations
AI Explanations for Vertex AI
BigQuery ML features and capabilities
AutoML Tables features and capabilities
Explaining model predictions on image data
Explaining model predictions on structured data
Code samples for Explainable AI
AI Explainability Whitepaper
Putting AI principles into action
Explainable AI tools are provided at no extra charge to users of AutoML Tables or Vertex AI. Note that Vertex AI is billed by node-hour usage, and running AI Explanations on model predictions requires additional compute and storage. Therefore, users of Explainable AI may see their node-hour usage increase.