🧠 Explain Variable Importance
Focus question: How do different Variables—and their interactions—impact my model’s predictions?
Global Importance
Understand which variables matter most overall in the model’s predictions.
Variable Importance
(Variables Ranked by Mean Absolute SHAP Value)
Variable Impact
(Beeswarm Plot: SHAP Values Across All Predictions)
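A rough sketch of how both global views could be produced with the shap package; the diabetes dataset and RandomForestRegressor below are placeholders, not the product's actual data or model.

```python
# Sketch only: the dataset and model here are stand-ins for whatever
# the product is actually explaining.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# SHAP values for every row in the dataset.
explainer = shap.Explainer(model, X)
shap_values = explainer(X)

# Variable Importance: variables ranked by mean absolute SHAP value.
shap.plots.bar(shap_values)

# Variable Impact: beeswarm of SHAP values across all predictions,
# showing both the magnitude and the direction of each variable's effect.
shap.plots.beeswarm(shap_values)
```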
Local Importance
Explain the model’s reasoning for a specific prediction.
Single Variable Effects
(Partial Dependence Plot showing the average effect of a single variable)
Note: This plot would be the current “interaction” plot, but focused on one Variable at a time. It would also include a histogram on the x-axis showing the distribution of the Variable, helping users judge whether apparent changes in the effect rest on regions with little data.
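One way this could look, assuming scikit-learn's PartialDependenceDisplay for the curve and a plain matplotlib histogram on a twin axis for the distribution; the feature name "bmi" and the model are placeholder choices.

```python
# Sketch only: one-way partial dependence with a histogram of the variable's
# distribution underneath, so sparse regions of the x-axis are visible.
import matplotlib.pyplot as plt
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

feature = "bmi"  # placeholder; in the product this would come from user selection
fig, ax = plt.subplots()

# Average effect of the selected variable on the prediction.
PartialDependenceDisplay.from_estimator(model, X, features=[feature], ax=ax)

# Histogram on a secondary axis shows where the data actually lies,
# flagging regions where the curve is supported by only a few points.
hist_ax = ax.twinx()
hist_ax.hist(X[feature], bins=30, alpha=0.2, color="gray")
hist_ax.set_yticks([])
plt.show()
```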
Explain Single Prediction
(Waterfall Plot for a Single SHAP Explanation)
Note: This plot is already available in the SHAP package and could use an interface nearly identical to the prediction module's. It shows how each Variable contributes to the final prediction for a specific instance.
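A minimal sketch using shap.plots.waterfall; the row index 0, dataset, and model are placeholders.

```python
# Sketch only: waterfall explanation for one prediction; row 0 stands in
# for whichever instance the user selects.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

explainer = shap.Explainer(model, X)
shap_values = explainer(X)

# Waterfall plot: starts at the model's expected value and shows how each
# variable pushes this one prediction up or down to its final value.
shap.plots.waterfall(shap_values[0])
```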
Variable Interactions
Understand how variables work together to influence outcomes.
Visualize Variable Interactions
(Interaction Strength Matrix)
Note: Possibly remove; I’m not sure this plot is being used properly.
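If the plot is kept, one way to build such a matrix is from SHAP interaction values (tree models only), aggregated as the mean absolute interaction per variable pair; everything below is a placeholder sketch, not the current implementation.

```python
# Sketch only: pairwise interaction-strength matrix from SHAP interaction
# values; dataset and model are placeholders.
import matplotlib.pyplot as plt
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# (n_samples, n_features, n_features) tensor of pairwise SHAP interactions.
inter = shap.TreeExplainer(model).shap_interaction_values(X)

# Aggregate to a single matrix: mean absolute interaction per variable pair.
strength = np.abs(inter).mean(axis=0)
np.fill_diagonal(strength, 0)  # drop main effects, keep only interactions

plt.imshow(strength, cmap="viridis")
plt.xticks(range(len(X.columns)), X.columns, rotation=90)
plt.yticks(range(len(X.columns)), X.columns)
plt.colorbar(label="mean |SHAP interaction|")
plt.show()
```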
Explore Interaction Effects
(Two-Way Partial Dependence Plot showing how pairs of variables jointly affect predictions)
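A sketch of a two-way partial dependence view with scikit-learn; the feature pair ("bmi", "bp") stands in for a user-selected combination, and the dataset and model are placeholders.

```python
# Sketch only: two-way partial dependence for a pair of variables.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Contour plot of the model's average prediction over a grid of both
# variables; non-parallel contours indicate an interaction.
PartialDependenceDisplay.from_estimator(model, X, features=[("bmi", "bp")])
```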
TBD Additional Interaction Methods
(Future Plots: Accumulated Local Effects (ALE), Individual Conditional Expectation (ICE), etc.)
Note: Add further interaction-effect methods here as they become available.
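ICE is already available through scikit-learn's PartialDependenceDisplay, so a future version could start there; a hedged sketch, with placeholder data, model, and feature:

```python
# Sketch only: ICE curves alongside the averaged partial dependence curve.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# kind="both" draws one curve per instance (ICE) plus the average (PDP),
# exposing heterogeneity that the averaged curve alone would hide.
PartialDependenceDisplay.from_estimator(model, X, features=["bmi"], kind="both")
```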