
Friedlaender, Felix, Bohmann, Ferdinand, Brunkhorst, Max, Chae, Ju-Hee, Devraj, Kavi, Köhler, Yvette, Kraft, Peter, Kuhn, Hannah, Lucaciu, Alexandra, Luger, Sebastian, Pfeilschifter, Waltraud, Sadler, Rebecca, Liesz, Arthur, Scholtyschik, Karolina, Stolz, Leonie, Vutukuri, Rajkumar and Brunkhorst, Robert (2017): Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach. In: Journal of Cerebral Blood Flow and Metabolism, Vol. 37, No. 8: pp. 3015-3026

Full text not available from 'Open Access LMU'.


Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation from bench to bedside has so far failed. One presumed reason is the low quality of preclinical study design, leading to bias and low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by free-handed planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability, with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method that takes advantage of automatic-thresholding techniques. We demonstrate the validity of this automated approach to infarct volumetry after transient middle cerebral artery occlusion (tMCAO) and the improvement in reliability it provides. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench to bedside translation.
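The paper's own software is not reproduced here; as a minimal sketch of the automatic-thresholding idea it describes, the following shows Otsu's method applied to grayscale images of serial brain sections to estimate infarct volume. All function names, the pixel-to-area scaling, and the assumption that infarcted (TTC-negative) tissue appears brighter than healthy tissue are illustrative, not taken from the published method.

```python
import numpy as np

def otsu_threshold(image: np.ndarray, bins: int = 256) -> float:
    """Return the intensity threshold maximizing between-class variance (Otsu)."""
    hist, edges = np.histogram(image, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    weights = hist / hist.sum()
    w0 = np.cumsum(weights)          # cumulative probability of the "dark" class
    w1 = 1.0 - w0                    # probability of the "bright" class
    mu = np.cumsum(weights * centers)
    mu_total = mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu_total * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return float(centers[np.argmax(between)])

def infarct_volume(sections, slice_thickness_mm=1.0, pixel_area_mm2=0.01):
    """Sum thresholded infarct areas across serial sections into a volume (mm^3).

    Assumes TTC-negative (infarcted) tissue is brighter than viable tissue,
    so pixels above the automatic threshold are counted as infarct.
    """
    volume = 0.0
    for img in sections:
        t = otsu_threshold(img)
        infarct_pixels = np.count_nonzero(img > t)
        volume += infarct_pixels * pixel_area_mm2 * slice_thickness_mm
    return volume
```

The point of the automated step is that the threshold is computed from the image histogram rather than traced by hand, which removes the rater-dependent judgment that the study identifies as the source of low inter-rater and test-retest reliability.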
