Issues in calibrating models with multiple unbalanced constraints: the significance of systematic model and data errors

Cameron, David; Hartig, Florian; Minnuno, Francesco; Oberpriller, Johannes; Reineking, Björn; Van Oijen, Marcel; Dietze, Michael. 2022 Issues in calibrating models with multiple unbalanced constraints: the significance of systematic model and data errors. Methods in Ecology and Evolution, 13 (12). 2757-2770.

N536819JA.pdf - Published Version
Available under License Creative Commons Attribution 4.0.



1. Calibrating process-based models using multiple constraints often improves the identifiability of model parameters, helps to avoid several errors compensating for each other and produces model predictions that are more consistent with underlying processes. However, using multiple constraints can lead to predictions for some variables getting worse. This is particularly common when combining data sources with very different sample sizes. Such unbalanced model-data fusion efforts are becoming increasingly common, for example when combining manual and automated measurements.

2. Here we use a series of simulated virtual data experiments that aim to demonstrate and disentangle the underlying cause of issues that can occur when calibrating models with multiple unbalanced constraints in combination with systematic errors in models and data. We propose a diagnostic tool to help identify whether a calibration is failing due to these factors. We also test the utility of adding terms representing uncertainty in systematic model/data errors in calibrations.

3. We show that unbalanced data by itself is not the problem: when fitting simulated data to the 'true' model, we can correctly recover model parameters and the true dynamics of latent variables. However, when there are systematic errors in the model or the data, we cannot recover the correct parameters. Consequently, the modelled dynamics of the low-data-volume variables depart significantly from the true values. We demonstrate the utility of the diagnostic tool and show that it can also be used to identify the extent of the imbalance before the calibration starts to ignore the sparser data. Finally, we show that representing uncertainty in model structural errors and data biases in the calibration can greatly improve the model fit to low-volume data and improve coverage of uncertainty estimates.

4. We conclude that the underlying issue is not one of sample size or information content per se, despite the popularity of ad hoc approaches that focus on 'weighting' datasets to achieve balance. Our results emphasize the importance of considering model structural deficiencies and systematic data biases in the calibration of process-based models.
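The effect described in the abstract can be illustrated with a minimal virtual-data experiment. This is a hedged sketch, not the paper's actual code or model: a toy linear "process model" y = a*x is calibrated jointly against a large dataset carrying a systematic observation bias and a small unbiased dataset. A naive joint fit lets the biased high-volume data dominate, while adding an explicit bias parameter for that dataset recovers the true slope. All variable names and the model form are illustrative assumptions.

```python
import numpy as np

# Toy unbalanced-calibration experiment (illustrative sketch only).
rng = np.random.default_rng(0)
a_true, bias_true, noise_sd = 2.0, 1.0, 0.1

# High-volume dataset with a systematic data bias added to the observations.
x_big = np.linspace(0.0, 1.0, 1000)
y_big = a_true * x_big + bias_true + rng.normal(0.0, noise_sd, x_big.size)

# Low-volume, unbiased dataset.
x_small = np.linspace(0.0, 1.0, 10)
y_small = a_true * x_small + rng.normal(0.0, noise_sd, x_small.size)

x_all = np.concatenate([x_big, x_small])
y_all = np.concatenate([y_big, y_small])

# 1) Naive joint calibration (no systematic-error term): least-squares
#    slope over both datasets. The biased, high-volume constraint
#    dominates and drags the slope far from a_true.
a_naive = (x_all @ y_all) / (x_all @ x_all)

# 2) Calibration with an explicit systematic-error term: a bias
#    parameter applied only to the high-volume dataset. The design
#    matrix columns are [x, bias indicator].
indicator = np.concatenate([np.ones(x_big.size), np.zeros(x_small.size)])
X = np.column_stack([x_all, indicator])
(a_fit, b_fit), *_ = np.linalg.lstsq(X, y_all, rcond=None)

print(f"naive slope: {a_naive:.2f}")          # pulled away from a_true
print(f"slope with bias term: {a_fit:.2f}")   # close to a_true = 2.0
print(f"estimated bias: {b_fit:.2f}")         # close to bias_true = 1.0
```

Note that neither dataset is down-weighted here: the imbalance alone is harmless once the systematic error is represented explicitly, which mirrors the paper's conclusion that the issue is structural error, not sample size.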

Item Type: Publication - Article
Digital Object Identifier (DOI):
UKCEH and CEH Sections/Science Areas: Atmospheric Chemistry and Effects (Science Area 2017-)
ISSN: 2041-210X
Additional Information: Open Access paper - full text available via Official URL link.
Additional Keywords: Bayesian inference, inverse modelling, model calibration, model discrepancy, multiple constraints, predictive uncertainty, structural model error, systematic data bias
NORA Subject Terms: Data and Information
Related URLs:
Date made live: 31 Jan 2024 13:41 +0 (UTC)
