Tuesday, October 8, 2024

An own-goal in replication science--retraction of a paper that reported high replicability

  A 2023 paper reporting high replicability of psychology experiments has been retracted from Nature Human Behaviour. The retraction notice says in part:
"The concerns relate to lack of transparency and misstatement of the hypotheses and predictions the reported meta-study was designed to test; lack of preregistration for measures and analyses supporting the titular claim (against statements asserting preregistration in the published article); selection of outcome measures and analyses with knowledge of the data; and incomplete reporting of data and analyses."

RETRACTED ARTICLE: High replicability of newly discovered social-behavioural findings is achievable

This article was retracted on 24 September 2024

Matters Arising to this article was published on 24 September 2024

This article has been updated

Abstract

Failures to replicate evidence of new discoveries have forced scientists to ask whether this unreliability is due to suboptimal implementation of methods or whether presumptively optimal methods are not, in fact, optimal. This paper reports an investigation by four coordinated laboratories of the prospective replicability of 16 novel experimental findings using rigour-enhancing practices: confirmatory tests, large sample sizes, preregistration and methodological transparency. In contrast to past systematic replication efforts that reported replication rates averaging 50%, replication attempts here produced the expected effects with significance testing (P < 0.05) in 86% of attempts, slightly exceeding the maximum expected replicability based on observed effect sizes and sample sizes. When one lab attempted to replicate an effect discovered by another lab, the effect size in the replications was 97% that in the original study. This high replication rate justifies confidence in rigour-enhancing methods to increase the replicability of new discoveries.


In general, I'm more optimistic about replications than preregistrations for identifying replicable results and testing hypotheses about them. In this case, however, preregistration apparently revealed that what was written up as a replication study had begun as something else, and that the goalposts had been moved after the fact in inappropriate ways.
Somewhat related are my posts on the Einstein Foundation Award for Promoting Quality in Research.
