Abstract
Explainable artificial intelligence has so far mainly focused on static learning scenarios. We are interested in dynamic scenarios where data is sampled progressively and learning is done incrementally rather than in batch mode. We seek efficient incremental algorithms for computing feature importance (FI). Permutation feature importance (PFI) is a well-established, model-agnostic measure of global FI based on marginalizing absent features. We propose an efficient, model-agnostic algorithm called iPFI to estimate this measure incrementally and under dynamic modeling conditions, including concept drift. We prove theoretical guarantees on the approximation quality in terms of expectation and variance. To validate our theoretical findings and the efficacy of our approaches in incremental scenarios dealing with streaming data rather than traditional batch settings, we conduct multiple experimental studies on benchmark data with and without concept drift.
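To illustrate the general idea behind incremental PFI estimation, the following Python sketch tracks, for each feature, the loss increase caused by replacing that feature's value with one drawn from a reservoir of recently seen samples, and smooths this difference with an exponential moving average so the estimate can follow concept drift. This is a minimal illustration under stated assumptions, not the authors' exact algorithm: the class name `IncrementalPFI`, the single-sample `predict` interface, and the parameters `alpha` and `reservoir_size` are illustrative; the paper's precise sampling strategies and guarantees are given in the full text.

```python
import random
from collections import deque


class IncrementalPFI:
    """Illustrative incremental estimator of permutation feature importance.

    For each incoming (x, y) pair, the model's loss on the original sample is
    compared with its loss on a copy in which one feature is replaced by a
    value borrowed from a reservoir of recent observations (marginalizing the
    "absent" feature). The per-feature loss difference is tracked with an
    exponential moving average.
    """

    def __init__(self, model, loss_fn, feature_names, alpha=0.001, reservoir_size=100):
        self.model = model              # assumed: any model with a single-sample predict(x)
        self.loss_fn = loss_fn          # e.g. squared error or 0/1 loss
        self.features = feature_names
        self.alpha = alpha              # smoothing factor of the moving average
        self.reservoir = deque(maxlen=reservoir_size)  # recent samples for marginalization
        self.pfi = {f: 0.0 for f in feature_names}

    def update(self, x, y):
        """Process one observation x (dict: feature name -> value) with label y."""
        if not self.reservoir:
            self.reservoir.append(dict(x))
            return self.pfi
        base_loss = self.loss_fn(y, self.model.predict(x))
        for f in self.features:
            # Marginalize feature f by borrowing its value from a stored sample.
            donor = random.choice(self.reservoir)
            x_perm = dict(x)
            x_perm[f] = donor[f]
            perm_loss = self.loss_fn(y, self.model.predict(x_perm))
            # Exponentially weighted average of the loss increase caused by permuting f.
            self.pfi[f] = (1 - self.alpha) * self.pfi[f] + self.alpha * (perm_loss - base_loss)
        self.reservoir.append(dict(x))
        return self.pfi


# Illustrative usage in a streaming loop (model and stream are hypothetical):
#   pfi = IncrementalPFI(model, lambda y, p: (y - p) ** 2, ["x1", "x2"])
#   for x, y in stream:
#       model.learn(x, y)          # the model adapts online
#       scores = pfi.update(x, y)  # the PFI estimate adapts alongside it
```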
| Item Type: | Journal article |
| --- | --- |
| Faculties: | Mathematics, Computer Science and Statistics > Computer Science > Artificial Intelligence and Machine Learning |
| Subjects: | 000 Computer science, information and general works > 000 Computer science, knowledge, and systems |
| ISSN: | 0885-6125 |
| Language: | English |
| Item ID: | 108404 |
| Date Deposited: | 13 Dec 2023, 14:49 |
| Last Modified: | 13 Dec 2023, 14:49 |
| DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 438445824 |