Reflective error: a metric for assessing predictive performance at extreme events

Rouse, Robert Edwin (ORCID: https://orcid.org/0009-0000-4601-0210); Moss, Henry; Hosking, Scott (ORCID: https://orcid.org/0000-0002-3646-3504); McRobie, Allan; Shuckburgh, Emily. 2025. Reflective error: a metric for assessing predictive performance at extreme events. Environmental Data Science, 4, e26. 13 pp. https://doi.org/10.1017/eds.2025.16

Abstract
When using machine learning to model environmental systems, it is often a model's ability to predict extreme behaviors that yields the highest practical value to policy makers. However, most existing error metrics used to evaluate the performance of environmental machine learning models weight error equally across test data. Thus, routine performance is prioritized over a model's ability to robustly quantify extreme behaviors. In this work, we present a new error metric, termed Reflective Error, which quantifies the degree to which model error is distributed around the extremes, in contrast to existing model evaluation methods that aggregate error over all events. The suitability of our proposed metric is demonstrated on a real-world hydrological modeling problem, where extreme values are of particular concern.
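
The paper defines Reflective Error formally; the sketch below is only an illustration of the underlying idea, contrasting a standard, uniformly weighted error with a hypothetical extreme-focused error that evaluates a model solely on observations in the upper tail. The quantile threshold, the function names, and the synthetic river-flow example are assumptions made for illustration and are not the metric defined in the paper.

import numpy as np

def mean_squared_error(y_true, y_pred):
    # Standard MSE: every observation contributes equally.
    return np.mean((y_true - y_pred) ** 2)

def extreme_focused_error(y_true, y_pred, quantile=0.95):
    # Hypothetical illustration, not the paper's Reflective Error:
    # restrict evaluation to observations above a high quantile of
    # the observed values, so routine events cannot mask poor
    # performance at extremes.
    threshold = np.quantile(y_true, quantile)
    mask = y_true >= threshold
    return np.mean((y_true[mask] - y_pred[mask]) ** 2)

# Example: a model that fits routine flows well but underestimates peaks.
rng = np.random.default_rng(0)
y_true = rng.gamma(shape=2.0, scale=10.0, size=1000)  # skewed "river flow"
y_pred = y_true * 0.95                                # worst errors at the peaks

print("Error over all events:   ", mean_squared_error(y_true, y_pred))
print("Error at extreme events: ", extreme_focused_error(y_true, y_pred))

In this toy setup the aggregate error looks small because most events are routine, while the extreme-focused error exposes the systematic underestimation of peaks, which is the kind of behavior the paper's metric is designed to surface.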
Documents
Open Access
reflective-error-a-metric-for-assessing-predictive-performance-at-extreme-events.pdf - Published Version
Available under License Creative Commons Attribution 4.0.
Download (1MB)
Information
Programmes:
BAS Programmes 2015 > AI Lab (2022-)
