Guide to integrating an explanation method in FAIRXAI
This guide describes the steps required to integrate a new explanation method (explainer) into the FAIRXAI framework.
—
1. Architecture Overview
The FAIRXAI framework uses an Adapter-based system. Each explanation method must be encapsulated in a class that acts as a bridge between the specific library (e.g., SHAP, LIME, Grad-CAM) and the project’s standard interface.
The auto-discovery system automatically loads all adapters located in the
fairxai.explain.adapter package.
—
2. Implementing the Adapter
To add a new explainer, create a new Python file in the
fairxai/explain/adapter/ directory (e.g., my_explainer_adapter.py).
Step A: Inheritance and Metadata
The class must inherit from GenericExplainerAdapter
and define three key attributes:
- explainer_name: str
Unique identifier used in YAML configuration files.
- supported_datasets: List[str]
List of supported dataset types (e.g.,
["image"], ["tabular"], or ["*"] for all).
- supported_models: List[str]
List of supported model types (matching model class names, e.g.,
["Conv2d"], ["RandomForestClassifier"], or ["*"]).
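To illustrate how the "*" wildcard in these lists is typically interpreted, here is a minimal sketch of a compatibility check. The `is_compatible` helper is hypothetical, for illustration only; FAIRXAI's actual matching logic may differ.

```python
from typing import List


def is_compatible(supported: List[str], actual: str) -> bool:
    """Hypothetical matcher: True if `actual` is listed or "*" is declared."""
    return "*" in supported or actual in supported


# An adapter declaring supported_models = ["RandomForestClassifier"]
# matches that model class name exactly:
print(is_compatible(["RandomForestClassifier"], "RandomForestClassifier"))  # True
# An adapter declaring ["*"] accepts any model:
print(is_compatible(["*"], "Conv2d"))  # True
# Anything else is rejected:
print(is_compatible(["tabular"], "image"))  # False
```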
Step B: Required Methods
You must implement the following methods:
- __init__(self, model, dataset)
Initializes the explainer with the model (BBox wrapper) and the dataset.
- explain_instance(self, instance, params=None)
Generates a local explanation for a single instance.
- Parameters:
instance – The data instance to explain.
params – Optional parameters dictionary.
- Returns:
A list of GenericExplanation objects.
- explain_global(self)
Generates a global explanation for the model. If unsupported, raise
NotImplementedError.
Code Example

from typing import List, Optional, Dict, Any

from fairxai.explain.adapter.generic_explainer_adapter import GenericExplainerAdapter
from fairxai.explain.explaination.feature_importance_explanation import FeatureImportanceExplanation


class MyExplainerAdapter(GenericExplainerAdapter):
    explainer_name: str = "my_method"
    supported_datasets: List[str] = ["tabular"]
    supported_models: List[str] = ["*"]

    def __init__(self, model, dataset):
        super().__init__(model, dataset)
        # Specific initialization (e.g., loading the external library)

    def explain_instance(self, instance, params: Optional[Dict[str, Any]] = None) -> List[FeatureImportanceExplanation]:
        # 1. Explanation computation logic
        # 2. Format importance data (e.g., {"feature_1": 0.8, "feature_2": 0.1})
        importances = {"age": 0.7, "income": 0.3}
        # 3. Return a standard Explanation object
        return [
            FeatureImportanceExplanation(
                explainer_name=self.explainer_name,
                data=importances,
                global_scope=False,
            )
        ]

    def explain_global(self):
        raise NotImplementedError("Global explanation not supported.")
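The adapter above could be exercised as shown below. The base classes here are simplified stand-ins written only so the snippet runs on its own; the real classes live in the fairxai package and may have different signatures.

```python
from typing import Any, Dict, List, Optional


class GenericExplainerAdapter:  # stand-in for the FAIRXAI base class
    def __init__(self, model, dataset):
        self.model = model
        self.dataset = dataset


class FeatureImportanceExplanation:  # stand-in for the FAIRXAI class
    def __init__(self, explainer_name: str, data: Dict[str, float], global_scope: bool):
        self.explainer_name = explainer_name
        self.data = data
        self.global_scope = global_scope


class MyExplainerAdapter(GenericExplainerAdapter):
    explainer_name: str = "my_method"
    supported_datasets: List[str] = ["tabular"]
    supported_models: List[str] = ["*"]

    def explain_instance(self, instance, params: Optional[Dict[str, Any]] = None):
        importances = {"age": 0.7, "income": 0.3}
        return [FeatureImportanceExplanation(self.explainer_name, importances, False)]


# A toy instance stands in for a real dataset row.
adapter = MyExplainerAdapter(model=None, dataset=None)
explanations = adapter.explain_instance({"age": 42, "income": 30_000})
print(explanations[0].data)  # {'age': 0.7, 'income': 0.3}
```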
—
3. Handling Results (Explanation)
The framework expects the explain_* methods to return a list of objects inheriting from
GenericExplanation.
Available subclasses

- FeatureImportanceExplanation
  For explanations based on feature importance scores (pixels, tabular columns, etc.). Optionally supports a visualization dictionary for visual assets (e.g., Grad-CAM overlays).
- RuleExplanation
  For rule-based explanations (e.g., LORE).
- CounterfactualExplanation
  For counterfactual-based explanations.
If your method produces a radically different output, you should create a new subclass of
GenericExplanation in
fairxai/explain/explaination/.
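A new subclass could look like the sketch below. Both the `GenericExplanation` stand-in and the `PrototypeExplanation` name are assumptions made so the example is self-contained; match the real base class's constructor when writing yours.

```python
from typing import Dict, List


class GenericExplanation:  # simplified stand-in for the FAIRXAI base class
    def __init__(self, explainer_name: str, data, global_scope: bool):
        self.explainer_name = explainer_name
        self.data = data
        self.global_scope = global_scope


class PrototypeExplanation(GenericExplanation):
    """Hypothetical subclass for prototype-based explanations."""

    def __init__(self, explainer_name: str, prototypes: Dict[str, List], global_scope: bool = True):
        super().__init__(explainer_name, prototypes, global_scope)
        self.prototypes = prototypes  # e.g., representative instances per class


exp = PrototypeExplanation("proto_method", {"class_0": [[1.0, 2.0]]})
print(exp.global_scope)  # True
```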
—
4. Registration and Verification
Manual class registration is not required. Thanks to the dynamic discovery mechanism:
- Place your file in fairxai/explain/adapter/.
- When the Project starts, the framework will automatically load your adapter.
- You can verify the loading by checking the logs:

INFO: Found X compatible explainers.
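For intuition, here is a self-contained sketch of how such auto-discovery typically works: import every *_adapter.py module in a directory and collect classes that declare an explainer_name. This is only similar in spirit to FAIRXAI's mechanism, not its actual implementation; the temp directory and stub adapter exist solely to make the example runnable.

```python
import importlib.util
import pathlib
import tempfile

# Throwaway "adapter" directory with one stub adapter module in it.
pkg_dir = pathlib.Path(tempfile.mkdtemp())
(pkg_dir / "my_explainer_adapter.py").write_text(
    "class MyExplainerAdapter:\n"
    "    explainer_name = 'my_method'\n"
)

# Discover and import every *_adapter.py module in the directory.
discovered = {}
for path in pkg_dir.glob("*_adapter.py"):
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    # Register classes that declare an explainer_name attribute.
    for obj in vars(module).values():
        if isinstance(obj, type) and hasattr(obj, "explainer_name"):
            discovered[obj.explainer_name] = obj

print(f"INFO: Found {len(discovered)} compatible explainers.")
```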
—
5. Usage via Configuration
Once integrated, the new explainer can be invoked through a YAML pipeline file:
pipeline:
  - explainer: "my_method"
    mode: "local"
    params:
      instance_index: 0
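Conceptually, dispatching such a pipeline entry might look like the sketch below. The registry structure, the stub adapter, and the dispatch loop are all assumptions for illustration, not the real FAIRXAI API; the config dict is what a YAML parser would produce for the snippet above.

```python
from typing import Any, Dict, List

# Equivalent of parsing the YAML pipeline above (e.g., via yaml.safe_load).
config: Dict[str, Any] = {
    "pipeline": [
        {"explainer": "my_method", "mode": "local", "params": {"instance_index": 0}}
    ]
}


class MyExplainer:  # stand-in for a registered adapter
    def explain_instance(self, instance, params=None):
        return [{"explainer_name": "my_method", "data": {"age": 0.7}}]


registry = {"my_method": MyExplainer()}   # explainer_name -> adapter instance
dataset = [{"age": 42, "income": 30_000}]  # toy dataset

results: List[Any] = []
for step in config["pipeline"]:
    adapter = registry[step["explainer"]]  # look up by explainer_name
    if step["mode"] == "local":
        instance = dataset[step["params"]["instance_index"]]
        results.extend(adapter.explain_instance(instance, step["params"]))

print(len(results))  # 1
```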
—
Developer Checklist
[ ] Does the class inherit from GenericExplainerAdapter?
[ ] Is explainer_name unique and descriptive?
[ ] Are supported_datasets and supported_models correctly set?
[ ] Is the file saved in the adapter folder?
[ ] Are the results encapsulated in GenericExplanation (or derived) objects?