ORCID: https://orcid.org/0000-0003-3955-3510; Muschalik, Maximilian (ORCID: https://orcid.org/0000-0002-6921-0204); Hüllermeier, Eyke (ORCID: https://orcid.org/0000-0002-9944-4108) and Hammer, Barbara (ORCID: https://orcid.org/0000-0002-0935-5591)
(4 October 2023):
On Feature Removal for Explainability in Dynamic Environments.
ESANN 2023 - European Symposium on Artificial Neural Networks, Bruges, Belgium, 4-6 October 2023.
In: Proceedings of ESANN 2023,
pp. 83-88
Abstract
Removal-based explanations are a general framework for providing feature importance scores, in which feature removal, i.e., restricting a model to a subset of features, is a central component. While many machine learning applications require dynamic modeling environments, where distributions and models change over time, removal-based explanations and feature removal have mainly been considered in a static batch learning setting. Recently, interventional and observational perturbation methods were presented that allow features to be removed efficiently in dynamic learning environments with concept drift. In this paper, we compare these two algorithms on two synthetic data streams. We show how the two yield substantially different explanations when features are correlated, and provide guidance on choosing between them based on the application.
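The interventional/observational distinction discussed in the abstract can be illustrated with a minimal sketch. This is not the paper's incremental algorithms; it is a hypothetical batch example in which a feature is "removed" either by imputing marginal background samples (interventional, breaking correlations) or by averaging only over background points consistent with the retained features (a crude conditional, i.e., observational, estimate). All names (`restrict_interventional`, `restrict_observational`, the toy `model`) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two strongly correlated features (x2 is almost equal to x1).
n = 10_000
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)
background = np.column_stack([x1, x2])


def model(X):
    # Hypothetical additive model over both features.
    return X[:, 0] + X[:, 1]


def restrict_interventional(model, x, keep, background):
    """Remove features not in `keep` by imputing marginal background
    samples; feature correlations are deliberately ignored."""
    X = np.tile(x, (len(background), 1))
    removed = [j for j in range(x.shape[0]) if j not in keep]
    X[:, removed] = background[:, removed]
    return model(X).mean()


def restrict_observational(model, x, keep, background, bandwidth=0.05):
    """Remove features not in `keep` by averaging over background points
    whose kept features lie close to x (a crude conditional estimate)."""
    keep = list(keep)
    mask = np.all(np.abs(background[:, keep] - x[keep]) < bandwidth, axis=1)
    X = np.tile(x, (mask.sum(), 1))
    removed = [j for j in range(x.shape[0]) if j not in keep]
    X[:, removed] = background[mask][:, removed]
    return model(X).mean()


# Remove feature 1 (keep only feature 0) at the point x = (2, 2).
x = np.array([2.0, 2.0])
f_int = restrict_interventional(model, x, keep={0}, background=background)
f_obs = restrict_observational(model, x, keep={0}, background=background)
# Interventional: E[x2] ≈ 0 under the marginal, so f_int ≈ 2.
# Observational: x2 | x1 ≈ 2 is itself ≈ 2, so f_obs ≈ 4.
```

The gap between `f_int` and `f_obs` illustrates the abstract's point: with correlated features, the two removal semantics produce substantially different restricted predictions, and hence different importance scores.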
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Faculties: | Mathematics, Computer Science and Statistics > Computer Science > Artificial Intelligence and Machine Learning |
| Subjects: | 000 Computer science, information and general works > 004 Data processing computer science |
| URN: | urn:nbn:de:bvb:19-epub-118345-6 |
| Language: | English |
| Item ID: | 118345 |
| Date Deposited: | 25 Jun 2024 05:55 |
| Last Modified: | 22 Nov 2024 08:55 |
| DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 38445824 |
