Reflective error: a metric for assessing predictive performance at extreme events
Rouse, Robert Edwin (ORCID: https://orcid.org/0009-0000-4601-0210); Moss, Henry; Hosking, Scott (ORCID: https://orcid.org/0000-0002-3646-3504); McRobie, Allan; Shuckburgh, Emily.
2025
Reflective error: a metric for assessing predictive performance at extreme events.
Environmental Data Science, 4, e26, 13 pp. doi:10.1017/eds.2025.16
Text (Open Access): © The Author(s), 2025. Published by Cambridge University Press. Published Version available under a Creative Commons Attribution 4.0 License.
Abstract/Summary
When using machine learning to model environmental systems, it is often a model’s ability to predict extreme behaviors that yields the highest practical value to policy makers. However, most existing error metrics used to evaluate the performance of environmental machine learning models weigh error equally across test data. Thus, routine performance is prioritized over a model’s ability to robustly quantify extreme behaviors. In this work, we present a new error metric, termed Reflective Error, which quantifies the degree to which model error is distributed around the extremes, in contrast to existing model evaluation methods that aggregate error over all events. The suitability of our proposed metric is demonstrated on a real-world hydrological modeling problem, where extreme values are of particular concern.
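The abstract does not reproduce the metric's definition; as a rough illustration of the contrast it describes (assessing where error falls relative to extreme events rather than weighting all test points equally), the Python sketch below compares an equal-weight MSE with a hypothetical quantile-based "extreme error share". The function names, the 0.95 quantile threshold, and the synthetic flow data are illustrative assumptions and are not the authors' Reflective Error formula.

```python
import numpy as np

def mse(y_true, y_pred):
    """Standard mean squared error: every test point is weighted equally."""
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def extreme_error_share(y_true, y_pred, quantile=0.95):
    """Illustrative only -- NOT the paper's Reflective Error formula.
    Returns the fraction of total squared error that falls on observations
    above a high quantile of the target, i.e. how error is distributed
    around extremes rather than aggregated uniformly over all events."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    sq_err = (y_true - y_pred) ** 2
    extreme = y_true >= np.quantile(y_true, quantile)  # assumed definition of "extreme"
    return float(sq_err[extreme].sum() / sq_err.sum())

# Synthetic river-flow-like example: one model spreads its error evenly,
# the other is accurate in routine conditions but misses the flood peaks.
rng = np.random.default_rng(0)
flow = rng.gamma(shape=2.0, scale=10.0, size=1000)
routine_model = flow + rng.normal(0.0, 5.0, size=flow.size)
peak_blind_model = flow.copy()
peaks = flow >= np.quantile(flow, 0.95)
peak_blind_model[peaks] -= 20.0  # underestimates only the largest flows

for name, pred in [("routine-error model", routine_model),
                   ("peak-blind model", peak_blind_model)]:
    print(f"{name}: MSE={mse(flow, pred):.1f}, "
          f"extreme error share={extreme_error_share(flow, pred):.2f}")
```

An equal-weight metric such as MSE can rate both models similarly, while an extreme-focused view separates them; this is the general motivation the abstract describes, not the paper's specific construction.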
| Item Type: | Publication - Article |
|---|---|
| Digital Object Identifier (DOI): | 10.1017/eds.2025.16 |
| ISSN: | 2634-4602 |
| Additional Keywords: | error metrics, extreme values, machine learning, natural hazards, statistics |
| Date made live: | 09 May 2025 10:03 UTC |
| URI: | https://nora.nerc.ac.uk/id/eprint/539408 |