Issues in calibrating models with multiple unbalanced constraints: the significance of systematic model and data errors

Cameron, David (ORCID: https://orcid.org/0000-0001-8938-0908); Hartig, Florian; Minnuno, Francesco; Oberpriller, Johannes; Reineking, Björn; Van Oijen, Marcel; Dietze, Michael. 2022. Issues in calibrating models with multiple unbalanced constraints: the significance of systematic model and data errors. Methods in Ecology and Evolution, 13 (12), 2757-2770. doi:10.1111/2041-210X.14002

Abstract
1. Calibrating process-based models using multiple constraints often improves the identifiability of model parameters, helps to avoid several errors compensating each other and produces model predictions that are more consistent with underlying processes. However, using multiple constraints can lead to predictions for some variables getting worse. This is particularly common when combining data sources with very different sample sizes. Such unbalanced model-data fusion efforts are becoming increasingly common, for example when combining manual and automated measurements.

2. Here we use a series of simulated virtual-data experiments to demonstrate and disentangle the underlying cause of issues that can occur when calibrating models with multiple unbalanced constraints in combination with systematic errors in models and data. We propose a diagnostic tool to help identify whether a calibration is failing due to these factors. We also test the utility of adding terms representing uncertainty in systematic model/data errors to calibrations.

3. We show that unbalanced data by itself is not the problem—when fitting simulated data to the ‘true’ model, we can correctly recover model parameters and the true dynamics of latent variables. However, when there are systematic errors in the model or the data, we cannot recover the correct parameters. Consequently, the modelled dynamics of the low data volume variables depart significantly from the true values. We demonstrate the utility of the diagnostic tool and show that it can also be used to identify the extent of the imbalance before the calibration starts to ignore the sparser data. Finally, we show that representing uncertainty in model structural errors and data biases in the calibration can greatly improve the model fit to low-volume data, and improve coverage of uncertainty estimates.

4. We conclude that the underlying issue is not one of sample size or information content per se, despite the popularity of ad hoc approaches that focus on ‘weighting’ datasets to achieve balance. Our results emphasize the importance of considering model structural deficiencies and data systematic biases in the calibration of process-based models.
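The mechanism described in the abstract — a calibration dominated by a high-volume constraint, with the fit to sparse data rescued by a jointly estimated systematic-error term — can be illustrated with a minimal toy sketch. This is not the authors' setup: the linear "process model", the sample sizes, the bias value and the grid-search estimator below are all illustrative assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "true" process: y = theta * x, with theta = 2.0.
theta_true = 2.0
sigma = 0.1  # assumed known observation noise for both data streams

# High-volume "automated" constraint (n = 1000), unbiased.
x_hi = rng.uniform(0.0, 1.0, 1000)
y_hi = theta_true * x_hi + rng.normal(0.0, sigma, 1000)

# Low-volume "manual" constraint (n = 10), with a systematic +0.5 offset
# standing in for a model/data systematic error.
x_lo = rng.uniform(0.0, 1.0, 10)
y_lo = theta_true * x_lo + 0.5 + rng.normal(0.0, sigma, 10)

def neg_log_lik(theta, bias=0.0):
    """Joint Gaussian negative log-likelihood over both constraints.

    The optional `bias` parameter represents an explicit systematic-error
    term for the low-volume data stream; bias=0.0 is the naive calibration.
    """
    r_hi = y_hi - theta * x_hi
    r_lo = y_lo - (theta * x_lo + bias)
    return (np.sum(r_hi**2) + np.sum(r_lo**2)) / (2.0 * sigma**2)

# Naive calibration (no bias term): the n=1000 stream dominates, so theta
# is recovered, but the fit to the sparse, biased stream is poor.
thetas = np.linspace(1.5, 2.5, 201)
theta_hat0 = thetas[np.argmin([neg_log_lik(t) for t in thetas])]

# Calibration with a jointly estimated bias term (simple grid search).
biases = np.linspace(-1.0, 1.0, 201)
grid = np.array([[neg_log_lik(t, b) for b in biases] for t in thetas])
i, j = np.unravel_index(np.argmin(grid), grid.shape)
theta_hat, bias_hat = thetas[i], biases[j]

# Mean absolute residual on the low-volume data, with and without the term.
resid_no_bias = np.mean(np.abs(y_lo - theta_hat0 * x_lo))
resid_with_bias = np.mean(np.abs(y_lo - (theta_hat * x_lo + bias_hat)))

print(f"theta (no bias term):   {theta_hat0:.3f}")
print(f"theta (with bias term): {theta_hat:.3f}, bias: {bias_hat:.3f}")
print(f"sparse-data residual:   {resid_no_bias:.3f} -> {resid_with_bias:.3f}")
```

Consistent with point 3 of the abstract, the imbalance alone is harmless here; it is the systematic offset that degrades the sparse-data fit, and adding the bias term recovers both the parameter and a good fit to the low-volume stream.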
Documents
N536819JA.pdf - Published Version
Available under License Creative Commons Attribution 4.0.
