shapiq.explainer.tabular#

This module contains the interaction explainer for the shapiq package, the main interface for users of the package.

Classes

TabularExplainer(model, data[, ...])

The tabular explainer as the main interface for the shapiq package.

class shapiq.explainer.tabular.TabularExplainer(model, data, approximator='auto', index='k-SII', max_order=2, random_state=None, **kwargs)[source]#

Bases: Explainer

The tabular explainer as the main interface for the shapiq package.

The TabularExplainer class is the main interface for the shapiq package. It can be used to explain the predictions of a model by estimating Shapley interaction values.

Parameters:
  • model – The model to explain as a callable function expecting data points as input and returning the model’s predictions.

  • data (ndarray) – A background dataset to use for the explainer.

  • approximator (Union[str, Approximator]) – An approximator to use for the explainer. Defaults to “auto”, which automatically chooses an approximator based on the number of features and the number of samples in the background data.

  • index (str) – Type of Shapley interaction index to use. Must be one of “SII” (Shapley Interaction Index), “k-SII” (k-Shapley Interaction Index), “STI” (Shapley-Taylor Interaction Index), or “FSI” (Faithful Shapley Interaction Index). Defaults to “k-SII”.

  • max_order (int) – The maximum interaction order to compute. Defaults to 2.

  • random_state (Optional[int]) – The random state to initialize the approximator with. Defaults to None.

index#

Type of Shapley interaction index to use.

data#

The background dataset used by the explainer.

baseline_value#

The baseline value of the explainer.

explain(x, budget=None)[source]#

Explains the model’s prediction for a single data point.

Parameters:
  • x (ndarray) – The data point to explain as a 2-dimensional array with shape (1, n_features).

  • budget (Optional[int]) – The budget to use for the approximation. Defaults to None, which sets the budget to 2**n_features (one model evaluation per coalition of features).

Return type:

InteractionValues
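Because the default budget of 2**n_features grows exponentially with the number of features, passing an explicit budget is advisable for wider datasets. A quick sketch of the scaling:

```python
# With budget=None, the budget defaults to 2**n_features: one model
# evaluation per coalition of features. The count grows exponentially,
# so an explicit budget keeps wide datasets tractable.
for n_features in (5, 10, 20):
    print(n_features, "features ->", 2 ** n_features, "evaluations")
```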

property baseline_value: float#

Returns the baseline value of the explainer.