The ‘replication crisis’ could be worse than we thought, new analysis reveals

The science replication crisis might be worse than we thought: new research reveals that studies whose results have been successfully replicated tend to be cited less often than studies that have failed to replicate.

That’s not to say these widely cited, non-replicated studies are necessarily wrong or misleading. But it does mean that, for one reason or another, follow-up research has failed to deliver the same result as the original study, yet the original still attracts loads of citations.

In other words, the new analysis suggests that research seen as more interesting or surprising garners more citations than research backed by plenty of corroborating evidence.

Behavioral economists Marta Serra-Garcia and Uri Gneezy from the University of California San Diego analyzed papers in top psychology, economics, and general science journals; they found that studies that had since failed to replicate accumulated, on average, 153 more citations than those that had replicated successfully – and that the influence of these papers is growing over time.

“Existing evidence also shows that experts predict well which papers will be replicated,” write the researchers in their published paper. “Given this prediction, why are non-replicable papers accepted for publication in the first place?”

“A possible answer is that the review team faces a trade-off. When the results are more ‘interesting’, they apply lower standards regarding their reproducibility.”

This replication problem has been a hot topic among scientists for several years, but this latest research introduces some interesting new figures. The team behind it looked at the citations of 139 studies that had been included in three major replication projects.

After analyzing 20,252 papers citing these studies across a variety of journals, they found that non-replicable papers receive, on average, 16 more citations per year.

In psychology journals, 39 percent of the 100 analyzed studies had been successfully replicated. In economics journals, it was 61 percent of 18 studies, and in the journals Nature and Science, it was 62 percent of 21 studies.

The differences in the prominent Nature and Science journals were the most striking: here, non-replicable papers accumulated around 300 more citations than replicable ones on average. These variations remained even after accounting for the number of authors on a paper, the number of male authors, the details (like location and language) of the experiments, and the field in which the paper was published.

Across all the journals and papers, articles citing a non-replicable study mentioned the failed replication only 12 percent of the time. However, it’s not just paper authors and scientists who need to be more aware of the problem, the researchers say.

“Interesting or appealing findings are also covered more by media or shared on platforms like Twitter, generating a lot of attention, but that does not make them true,” says Gneezy.

Problematic research can take a long time to put right, too: an infamous, now-retracted 1998 paper linking vaccines and autism turned many against the idea of vaccination as a safe and healthy option. It took 12 years for that particular paper to be retracted, and it has caused lasting damage to public perception of vaccine safety.

Retracting papers can make a difference though, the researchers point out – statistics show that citations of a retracted paper tend to drop by almost 7 percent per year. This is perhaps one way of managing the current replication crisis and making sure that our scientific methods are as thorough as possible.

The authors of the new study acknowledge that academics and journal editors alike feel pressure to publish ‘interesting’ findings that are more likely to attract attention, but want to see research into how the quality of scientific papers can be improved.

“We hope our research encourages readers to be cautious if they read something that is interesting and appealing,” says Serra-Garcia.

“Whenever researchers cite work that is more interesting or has been cited a lot, we hope they will check if replication data is available and what those findings suggest.”

The research has been published in Science Advances.
