
Simulation‐based study design accuracy weights are not generalisable and can still lead to biased meta‐analytic inference: comments on Christie et al. (2019)

Pescott, Oliver L. ORCID: https://orcid.org/0000-0002-0685-8046; Stewart, Gavin B. 2022 Simulation‐based study design accuracy weights are not generalisable and can still lead to biased meta‐analytic inference: comments on Christie et al. (2019). Journal of Applied Ecology, 59 (5). 1187-1190. https://doi.org/10.1111/1365-2664.14153

N532816JA.pdf - Published Version
Available under License Creative Commons Attribution 4.0.

Abstract/Summary

1. Variable study quality is a challenge for all the empirical sciences, but perhaps particularly for disciplines such as ecology where experimentation is frequently hampered by system complexity, scale and resourcing. The resulting heterogeneity, and the necessity of subsequently combining the results of different study designs, is a fundamental issue for evidence synthesis.

2. We welcome the recognition of this issue by Christie et al. (2019) and their attempt to provide a generic approach to study quality assessment and meta-analytic weighting through an extensive simulation study. However, we have reservations about the true generality and usefulness of their derived study ‘accuracy weights’.

3. First, the simulations of Christie et al. rely on a single approach to effect size calculation, resulting in the odd conclusion that before-after control-impact (BACI) designs are superior to randomised controlled trials (RCTs), which are normally considered the gold standard for causal inference. Second, so-called ‘study quality’ scores have long been criticised in the epidemiological literature for failing to accurately summarise individual, study-specific drivers of bias, and have been shown to be likely to retain bias and increase variance relative to meta-regression approaches that explicitly model such drivers.

4. Synthesis and applications. We suggest that ecological meta-analysts spend more time critically, and transparently, appraising actual studies before synthesis, rather than relying on generic weights or weighting formulas to solve assumed issues; sensitivity analyses and hierarchical meta-regression are likely to be key tools in this work.
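As a rough illustration of the kind of mixed-effects meta-regression recommended in point 4, where study design is modelled as an explicit moderator rather than folded into a generic accuracy weight, the following Python/numpy sketch fits a DerSimonian-Laird-type model with a design indicator. All effect sizes, variances and design labels are hypothetical placeholders, not data or code from the paper.

```python
import numpy as np

# Hypothetical per-study effect sizes (yi), sampling variances (vi) and a
# design indicator (1 = RCT, 0 = BACI). These are illustrative values only.
yi = np.array([0.42, 0.35, 0.51, 0.10, 0.22, 0.05, 0.18, 0.30])
vi = np.array([0.02, 0.03, 0.05, 0.04, 0.06, 0.02, 0.05, 0.03])
design_rct = np.array([1, 1, 1, 0, 0, 0, 0, 1])

# Design matrix: intercept + design moderator.
X = np.column_stack([np.ones_like(yi), design_rct])
k, p = X.shape

# Step 1: fixed-effects (inverse-variance) fit, giving the residual
# heterogeneity statistic Q_E for the moderator model.
W = np.diag(1.0 / vi)
XtWX_inv = np.linalg.inv(X.T @ W @ X)
beta_fe = XtWX_inv @ X.T @ W @ yi
resid = yi - X @ beta_fe
Q_E = float(resid @ W @ resid)

# Step 2: method-of-moments estimate of the between-study variance tau^2,
# conditional on the moderator model.
P = W - W @ X @ XtWX_inv @ X.T @ W
tau2 = max(0.0, (Q_E - (k - p)) / np.trace(P))

# Step 3: mixed-effects fit with weights 1 / (vi + tau^2).
W_star = np.diag(1.0 / (vi + tau2))
cov_b = np.linalg.inv(X.T @ W_star @ X)
beta = cov_b @ X.T @ W_star @ yi
se = np.sqrt(np.diag(cov_b))

print(f"tau^2 = {tau2:.4f}")
print(f"Intercept (BACI baseline): {beta[0]:.3f} (SE {se[0]:.3f})")
print(f"RCT vs BACI difference:    {beta[1]:.3f} (SE {se[1]:.3f})")
```

In practice such models would more usually be fitted with dedicated meta-analysis software (e.g. the metafor package in R) using REML estimation, alongside the sensitivity analyses the authors advocate; the sketch above only shows the basic structure of treating design as a modelled covariate.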

Item Type: Publication - Article
Digital Object Identifier (DOI): https://doi.org/10.1111/1365-2664.14153
UKCEH and CEH Sections/Science Areas: Biodiversity (Science Area 2017-)
ISSN: 0021-8901
Additional Information: Open Access paper - full text available via Official URL link.
Additional Keywords: causal inference, epidemiology, evidence synthesis, meta-analysis, meta-regression, multilevel modelling, quality scoring, study design
NORA Subject Terms: Ecology and Environment; Data and Information
Date made live: 28 Jun 2022 16:23 +0 (UTC)
URI: https://nora.nerc.ac.uk/id/eprint/532816
