
Muschalik, Maximilian (ORCID: https://orcid.org/0000-0002-6921-0204); Fumagalli, Fabian (ORCID: https://orcid.org/0000-0003-3955-3510); Jagtani, Rohit; Hammer, Barbara (ORCID: https://orcid.org/0000-0002-0935-5591) and Hüllermeier, Eyke (ORCID: https://orcid.org/0000-0002-9944-4108) (July 2023): iPDP: On Partial Dependence Plots in Dynamic Modeling Scenarios. World Conference on Explainable Artificial Intelligence (xAI 2023), Lisboa, Portugal, 26-28 July 2023. Longo, Luca (Ed.). Cham: Springer Nature Switzerland, pp. 177-194.

Full text not available on 'Open Access LMU'.

Abstract

Post-hoc explanation techniques such as the well-established partial dependence plot (PDP), which investigates feature dependencies, are used in explainable artificial intelligence (XAI) to understand black-box machine learning models. While many real-world applications require dynamic models that constantly adapt over time and react to changes in the underlying distribution, XAI has so far primarily considered static learning environments, where models are trained in batch mode and remain unchanged. We thus propose a novel model-agnostic XAI framework called incremental PDP (iPDP) that extends the PDP to extract time-dependent feature effects in non-stationary learning environments. We formally analyze iPDP and show that it approximates a time-dependent variant of the PDP that properly reacts to real and virtual concept drift. The time-sensitivity of iPDP is controlled by a single smoothing parameter, which directly corresponds to the variance and the approximation error of iPDP in a static learning environment. We illustrate the efficacy of iPDP by showcasing an example application for drift detection and conducting multiple experiments on real-world and synthetic data sets and streams.
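To illustrate the idea sketched in the abstract, the following Python snippet maintains an exponentially smoothed PDP estimate over a data stream. It is a minimal sketch, not the authors' implementation: the grid points, the smoothing parameter alpha, and the helper names are assumptions chosen for illustration only.

    import numpy as np

    def ipdp_sketch(model, stream, feature_idx, grid, alpha=0.01):
        """Illustrative incremental PDP (assumption, not the paper's code):
        for each arriving instance, evaluate the model on copies with the
        feature of interest replaced by each grid value, then exponentially
        smooth these per-grid-point effects over time."""
        pdp = np.zeros(len(grid))              # running smoothed effect estimate
        initialized = False
        for x in stream:                       # x: 1-D feature vector from the stream
            X_grid = np.tile(x, (len(grid), 1))
            X_grid[:, feature_idx] = grid      # substitute each grid value
            effects = model.predict(X_grid)    # model response along the grid
            if not initialized:
                pdp, initialized = effects.astype(float), True
            else:
                # alpha controls time-sensitivity: larger alpha reacts faster
                # to drift but increases the variance of the estimate
                pdp = (1 - alpha) * pdp + alpha * effects
            yield pdp.copy()                   # current time-dependent PDP estimate

In this sketch, a larger alpha makes the estimate track concept drift more quickly at the cost of higher variance, mirroring the trade-off described in the abstract.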
