shapiq.explainer.tree.treeshapiq#
This module contains the tree explainer implementation.
Classes
TreeSHAPIQ – The explainer for tree-based models using the TreeSHAP-IQ algorithm.
- class shapiq.explainer.tree.treeshapiq.TreeSHAPIQ(model, max_order=2, min_order=1, interaction_type='k-SII', verbose=False)[source]#
Bases: object
The explainer for tree-based models using the TreeSHAP-IQ algorithm. For a detailed presentation of the algorithm, see the original paper: https://arxiv.org/abs/2401.12069.
TreeSHAP-IQ is an algorithm for computing Shapley Interaction values for tree-based models. It is heavily based on the Linear TreeSHAP algorithm (outlined in https://proceedings.neurips.cc/paper_files/paper/2022/hash/a5a3b1ef79520b7cd122d888673a3ebc-Abstract-Conference.html) but extended to compute Shapley Interaction values up to a given order. TreeSHAP-IQ needs to visit each node only once and makes use of polynomial arithmetic to compute the Shapley Interaction values efficiently.
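To make concrete what TreeSHAP-IQ computes efficiently in a single traversal, here is a deliberately naive brute-force computation of order-1 Shapley values on a toy depth-2 tree. Feature absence is simulated with a single baseline point; the tree, baseline, and helper names are illustrative assumptions, not shapiq internals:

```python
from itertools import combinations
from math import factorial

def tree_predict(x):
    # A toy depth-2 decision tree over two features (illustrative only).
    if x[0] <= 0.5:
        return 10.0 if x[1] <= 0.5 else 20.0
    return 30.0 if x[1] <= 0.5 else 40.0

def value(coalition, x, baseline):
    # v(S): prediction with features outside S replaced by the baseline.
    z = [x[i] if i in coalition else baseline[i] for i in range(len(x))]
    return tree_predict(z)

def shapley_value(i, x, baseline):
    # Exact Shapley value of feature i via enumeration of all coalitions.
    n = len(x)
    others = [j for j in range(n) if j != i]
    phi = 0.0
    for size in range(n):
        for S in combinations(others, size):
            w = factorial(size) * factorial(n - size - 1) / factorial(n)
            phi += w * (value(set(S) | {i}, x, baseline) - value(set(S), x, baseline))
    return phi

x = [1.0, 1.0]         # instance to explain
baseline = [0.0, 0.0]  # reference point standing in for "absent" features

phis = [shapley_value(i, x, baseline) for i in range(2)]
# The values sum to tree_predict(x) - tree_predict(baseline) (efficiency axiom).
```

This enumeration is exponential in the number of features; the point of TreeSHAP-IQ (and Linear TreeSHAP before it) is to obtain the same quantities in time polynomial in the tree size by visiting each node once.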
- Parameters:
model (Union[dict, TreeModel, Any]) – A single tree-based model to explain. Unlike the TreeExplainer class, TreeSHAP-IQ supports only a single tree model. The tree model can be a dictionary representation of the tree, a TreeModel object, or any other tree model supported by the shapiq.explainer.tree.validation.validate_tree_model function.
max_order (int) – The maximum interaction order to be computed. An interaction order of 1 corresponds to the Shapley value; any higher value computes Shapley interaction values up to that order. Defaults to 2.
min_order (int) – The minimum interaction order to be computed. Defaults to 1.
interaction_type (str) – The type of interaction index to be computed: “k-SII” (default), “SII”, “STI”, “FSI”, or “BZF”. All indices apart from “BZF” reduce to the “SV” (Shapley value) at order 1.
verbose (bool) – Whether to print information about the tree during initialization. Defaults to False.
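As a sketch of the dictionary representation mentioned for the model parameter, the example below builds a single decision stump as arrays. The key names follow the SHAP-style tree convention and are an assumption here; consult validate_tree_model in shapiq for the authoritative format:

```python
import numpy as np

# Hypothetical dict representation of a one-split tree (decision stump).
# Key names (children_left, children_right, features, thresholds, values,
# node_sample_weight) are assumed, SHAP-style conventions -- verify against
# shapiq.explainer.tree.validation.validate_tree_model before relying on them.
tree_dict = {
    "children_left": np.array([1, -1, -1]),    # -1 marks a leaf node
    "children_right": np.array([2, -1, -1]),
    "features": np.array([0, -2, -2]),         # feature index split on at each node
    "thresholds": np.array([0.5, -2.0, -2.0]), # go left if x[feature] <= threshold
    "values": np.array([0.0, 10.0, 40.0]),     # leaf predictions
    "node_sample_weight": np.array([100.0, 60.0, 40.0]),
}

# Such a dict could then be passed as `model`, e.g.:
# explainer = TreeSHAPIQ(model=tree_dict, max_order=2)
```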
Note
This class is not intended to be used directly. Instead, use the TreeExplainer class to explain tree-based models, which internally uses the TreeSHAP-IQ algorithm.
- explain(x)[source]#
- Computes the Shapley Interaction values for a given instance x and interaction order.
This function is the main explanation function of this class.
- Parameters:
x (np.ndarray) – Instance to be explained.
- Returns:
The computed Shapley Interaction values.
- Return type:
InteractionValues
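For intuition about the pairwise interaction scores that explain produces at max_order=2, the following brute-force sketch computes the pairwise Shapley Interaction Index (SII) on a toy tree with a non-additive interaction between its two features. The tree, baseline, and helper names are illustrative assumptions, not shapiq's implementation:

```python
from itertools import combinations
from math import factorial

def tree_predict(x):
    # Toy depth-2 tree whose leaf values make the features interact non-additively.
    if x[0] <= 0.5:
        return 10.0 if x[1] <= 0.5 else 20.0
    return 30.0 if x[1] <= 0.5 else 100.0

def value(coalition, x, baseline):
    # v(S): prediction with features outside S replaced by the baseline.
    z = [x[i] if i in coalition else baseline[i] for i in range(len(x))]
    return tree_predict(z)

def sii_pair(i, j, x, baseline):
    # Pairwise SII: a weighted average over coalitions S of the discrete
    # derivative of v with respect to the pair {i, j}.
    n = len(x)
    rest = [k for k in range(n) if k not in (i, j)]
    sii = 0.0
    for size in range(len(rest) + 1):
        for S in combinations(rest, size):
            S = set(S)
            w = factorial(size) * factorial(n - size - 2) / factorial(n - 1)
            delta = (value(S | {i, j}, x, baseline) - value(S | {i}, x, baseline)
                     - value(S | {j}, x, baseline) + value(S, x, baseline))
            sii += w * delta
    return sii

x, baseline = [1.0, 1.0], [0.0, 0.0]
interaction = sii_pair(0, 1, x, baseline)  # positive: the two splits reinforce each other
```

A purely additive tree would yield a zero pairwise score; the positive value here reflects that the second split pays off far more when the first feature is also large. TreeSHAP-IQ obtains such scores without the exponential coalition enumeration.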