Friday, October 6, 2023

Correcting science faster by making replication easier and more fun, by Brodeur, Dreber, Hoces de la Guardia & Miguel

If imitation is the sincerest form of flattery, we need to think of replication as one of the sincerest forms of inquiry in social science. (I'm more optimistic about the potential role of replication than I am about pre-registration.)

Here's a Comment in Nature that points out that we're not going to get lots of replications unless we can make them easier and more fun than they have been traditionally.

Replication games: how to make reproducibility research more systematic
In some areas of social science, around half of studies can’t be replicated. A new test-fast, fail-fast initiative aims to show what research is hot — and what’s not.
by Abel Brodeur, Anna Dreber, Fernando Hoces de la Guardia & Edward Miguel

"we decided to try to make replication efforts in our fields of economics and political science more systematic. Our virtual, non-profit organization, the Institute for Replication, now holds one-day workshops — called replication games — to validate studies.

"Since October 2022, we’ve hosted 12 workshops across Europe, North America and Australia, with 3 more scheduled this year. Each workshop has typically involved around 65 researchers in teams of 3–5 people, re-analysing about 15 papers. The teams either try to replicate papers, by generating new data and testing hypotheses afresh, or attempt to reproduce them, by testing whether the results hold if the published data are re-analysed. For many papers in our fields of study, in which the reproduction of results often involves re-running computer codes, it’s possible to do much of this work in a single day (see ‘A typical replication games project’). Each team’s findings are released as a preprint report, and these reports will be collated and published each year as a meta-paper. 

...

"To assess large numbers of papers, collaborating with research centres and universities is essential. For example, our current goal is to reproduce and replicate studies in journals that have a high impact factor — specifically, 25% of empirical studies published from 2022 onwards in 8 leading economics journals and 3 leading political science journals, totalling about 350 papers per year. Then we plan to expand into other areas of the social sciences.

...

"Broader partnerships can expand replication efforts beyond academic papers. Earlier this year, we were invited to run replication games with the International Monetary Fund (IMF) and the World Bank, to assess economics and finance papers from the two organizations. We aim to keep running these games annually, validating not only scholarly studies but also policy-oriented reports.

"Establishing these relationships need not be time consuming. We’ve found that simply tweeting about our project and speaking about it at conferences can garner interest. That, along with word of mouth after the Oslo workshop, has been sufficient to make our project well known among economists. As a result, all the organizations that we partnered with originally contacted us — rather than the other way round — asking to get involved.

"Other researchers following in our footsteps should be aware that care is needed to avoid conflicts of interest. We receive no money from the collaborations we’re involved in, because taking payment could be viewed as unethical. At the IMF and World Bank games — where people were reproducing and replicating the work of co-workers — we decided to randomly assign participants to a study, allowed them to remain anonymous and prevented participants from assessing studies authored by direct supervisors or friends.

"It is crucial to protect researchers who check papers from career threats — particularly when an effort uncovers major errors. We recommend that an organization or institute mediates communication between the original study’s authors and the replicators, allowing the latter to remain anonymous if they wish. One of us, acting as a representative for the Institute for Replication, serves in this capacity after each replication game.

"We know that receiving an e-mail to say that someone is checking your work can be stressful. So we contact the original authors only after replicators have written up their reports, to avoid causing researchers undue worry while they wait for an effort’s results. Rather than treating the discovery of errors as a ‘gotcha’ moment, which can put authors on the defensive, we acknowledge in our correspondence that all researchers make mistakes. To help make the process collegial, we allow authors to suggest edits to the report, and ask replicators to suggest changes to the authors’ responses.

...

"We think that efforts such as ours that normalize replication will ultimately put pressure on funders and journals to play their part. We are excited to see replication efforts in our fields — and others — continue to expand. Systematic replication has the potential to make correcting science faster. Let the games begin."

