TY - JOUR
T1 - Appraising prediction research: a guide and meta-review on bias and applicability assessment using the Prediction model Risk Of Bias ASsessment Tool (PROBAST)
AU - de Jong, Ype
AU - Ramspek, Chava L.
AU - Zoccali, Carmine
AU - Jager, Kitty J.
AU - Dekker, Friedo W.
AU - van Diepen, Merel
N1 - Funding Information: This study was supported by a grant from the Dutch Kidney Foundation (16OKG12). Publisher Copyright: © 2021 The Authors. Nephrology published by John Wiley & Sons Australia, Ltd on behalf of Asian Pacific Society of Nephrology.
PY - 2021/12
Y1 - 2021/12
N2 - Over the past few years, a large number of prediction models have been published, often of poor methodological quality. Seemingly objective and straightforward, prediction models provide a risk estimate for the outcome of interest, usually based on readily available clinical information. Yet, using models of substandard methodological rigour, especially without external validation, may result in incorrect risk estimates and consequently misclassification. To assess and combat bias in prediction research, the Prediction model Risk Of Bias ASsessment Tool (PROBAST) was published in 2019. This risk of bias (ROB) tool includes four domains and 20 signalling questions highlighting methodological flaws, and provides guidance in assessing the applicability of the model. In this paper, the PROBAST will be discussed, along with an in-depth review of two commonly encountered pitfalls in prediction modelling that may induce bias: overfitting and composite endpoints. We illustrate the prevalence of potential bias in prediction models with a meta-review of 50 systematic reviews that used the PROBAST to appraise their included studies, thus including 1510 different studies on 2104 prediction models. All domains showed an unclear or high ROB; these results were markedly stable over time, highlighting the urgent need for attention to bias in prediction research. This article aims to do just that by providing (1) the clinician with tools to evaluate the (methodological) quality of a clinical prediction model, (2) the researcher working on a review with methods to appraise the included models, and (3) the researcher developing a model with suggestions to improve model quality.
KW - clinical epidemiology
KW - epidemiology
KW - evidence-based medicine
KW - medical education
KW - meta-analysis
UR - http://www.scopus.com/inward/record.url?scp=85109391399&partnerID=8YFLogxK
DO - 10.1111/nep.13913
M3 - Review article
C2 - 34138495
SN - 1320-5358
VL - 26
SP - 939
EP - 947
JO - Nephrology (Carlton, Vic.)
JF - Nephrology (Carlton, Vic.)
IS - 12
ER -