Abstract
Explainable Artificial Intelligence (XAI) has so far focused mainly on static learning tasks. In this paper, we consider XAI in the context of online learning in dynamic environments, such as learning from real-time data streams, where models are learned incrementally and continuously adapted over the course of time. More specifically, we motivate the problem of explaining model change, i.e., explaining the difference between models before and after adaptation, rather than the models themselves. To this end, we provide the first efficient model-agnostic approach to dynamically detecting, quantifying, and explaining significant model changes. Our approach is based on an adaptation of the well-known Permutation Feature Importance (PFI) measure. It includes two hyperparameters that control the sensitivity and directly influence explanation frequency, so that a human user can adjust the method to individual requirements and application needs. We assess and validate our method's efficacy on illustrative synthetic data streams with three popular model classes.
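The paper's approach builds on the standard Permutation Feature Importance (PFI) measure. As a point of reference, the classic PFI computation can be sketched as follows; this is a minimal illustration of plain PFI, not the paper's change-detection method, and the toy data, function names, and the stand-in `predict` function are assumptions for the example.

```python
import numpy as np

def permutation_feature_importance(predict, X, y, rng=None):
    """Classic PFI: a feature's importance is the drop in accuracy
    observed when that feature's column is randomly permuted,
    breaking its association with the target."""
    rng = np.random.default_rng(rng)
    baseline = np.mean(predict(X) == y)  # accuracy on intact data
    importances = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        X_perm = X.copy()
        X_perm[:, j] = rng.permutation(X_perm[:, j])  # shuffle feature j only
        importances[j] = baseline - np.mean(predict(X_perm) == y)
    return importances

# Toy batch from a hypothetical stream: only feature 0 drives the label.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] > 0).astype(int)
predict = lambda Z: (Z[:, 0] > 0).astype(int)  # stand-in for a fitted model
pfi = permutation_feature_importance(predict, X, y, rng=1)
```

Here `pfi[0]` comes out clearly largest, while the irrelevant features score near zero; in the paper's setting, one would track how such importance profiles shift between the model before and after an adaptation step.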
| Document type | Journal article |
|---|---|
| Publication form | Publisher's Version |
| Faculty | Mathematics, Computer Science and Statistics > Computer Science > Artificial Intelligence and Machine Learning |
| Subject areas | 000 Computer science, information and general works > 000 Computer science, knowledge, systems |
| URN | urn:nbn:de:bvb:19-epub-94663-6 |
| ISSN | 0933-1875 |
| Language | English |
| Document ID | 94663 |
| Date published on Open Access LMU | 16 Feb 2023, 14:35 |
| Last modified | 11 Oct 2024, 14:08 |
| DFG | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 491502892 |