Monday, 10 April 2017

PsychOpen uses Statcheck tool for quality checks

Since the beginning of 2017, Statcheck has been run on all research articles accepted for publication on PsychOpen, and authors are asked to correct any inconsistencies before their articles are published.

Several studies have shown that the prevalence of statistical reporting errors in psychological articles is remarkably high[1]. This may lead to incorrect research conclusions or introduce systematic reporting bias. The European Open-Access Publishing Platform for Psychology, PsychOpen, is aware of this problem and tries to reduce statistical reporting errors in articles before they are published, because, as Armin Günther of PsychOpen says, »a lot of them are slips that can easily be prevented.«

To detect these slips, PsychOpen has started to use Statcheck. Developed by Sacha Epskamp and Michèle B. Nuijten, Statcheck is a tool that searches research papers for reported statistical test results, recalculates the corresponding p-values from the reported test statistics and degrees of freedom, and determines whether they are consistent with the reported p-values. Since early 2017, every research article accepted for publication on PsychOpen has been run through this check, and authors are asked to resolve any flagged inconsistencies before publication.
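
The kind of consistency check involved can be illustrated with a short sketch. The following Python snippet is a simplified illustration only, not the actual Statcheck code (Statcheck itself is an R package), and it assumes scipy is available: it extracts an APA-style t-test report such as t(28) = 2.20, p = .036, recomputes the two-tailed p-value from the test statistic and degrees of freedom, and checks whether the recomputed value rounds to the reported one.

    # Simplified illustration of a Statcheck-style consistency check.
    # Not the actual Statcheck (an R package); covers only t-tests.
    import re
    from scipy import stats

    # APA-style t-test reports: t(df) = value, p =/</> value
    APA_T = re.compile(r"t\((\d+)\)\s*=\s*(-?\d+\.?\d*),\s*p\s*([<=>])\s*(\.\d+)")

    def check_t_reports(text):
        """Return (reported_p, recomputed_p, consistent) for every t-test found."""
        results = []
        for df, t_value, relation, reported_p in APA_T.findall(text):
            # Two-tailed p-value implied by the reported t statistic and df.
            recomputed = 2 * stats.t.sf(abs(float(t_value)), int(df))
            if relation == "=":
                # Consistent if the recomputed p rounds to the reported value.
                decimals = len(reported_p) - 1      # ".036" -> 3 decimals
                consistent = round(recomputed, decimals) == float(reported_p)
            elif relation == "<":
                consistent = recomputed < float(reported_p)
            else:  # ">"
                consistent = recomputed > float(reported_p)
            results.append((reported_p, round(recomputed, 4), consistent))
        return results

    print(check_t_reports("The effect was significant, t(28) = 2.20, p = .036."))
    # The recomputed p-value is roughly .036, so this report is flagged as consistent.

The real tool applies the same idea to other common test statistics as well (e.g., F, χ², r, z); results reported in nonstandard formats or based on one-tailed tests are a typical source of the false positives mentioned below.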

It has caused a stir

Statcheck caused a stir in the psychological science community after Chris Hartgerink, a PhD candidate at Tilburg University in the Netherlands, ran it on more than 50,000 psychological papers and posted an individual report for each article on PubPeer, a website that provides a forum for discussing scientific research[2]. Not all authors were happy with this procedure and felt publicly denounced (see the blog post on Retraction Watch), especially as some of the alleged inconsistencies proved to be false positives.

In response, the Executive Board of the German Psychological Society (DGPs) released a statement expressing its concern »about the automatic publication of alleged errors without double checking with the original authors«[3].

Beneficial for authors

However, it is widely agreed that using Statcheck to detect and eliminate statistical reporting errors before publication benefits authors as well as the research community. »I expect that more and more journals will start to use tools such as Statcheck routinely to improve the quality of their articles,« Armin Günther says. »And I am pleased that PsychOpen as a community driven, nonprofit publishing platform is able to quickly pick up these new possibilities.«


[1] Nuijten, M. B., Hartgerink, C. H. J., van Assen, M. A. L. M., Epskamp, S., & Wicherts, J. M. (2016). The prevalence of statistical reporting errors in psychology (1985–2013). Behavior Research Methods, 48(4), 1205–1226. doi:10.3758/s13428-015-0664-2

[2] Hartgerink, C. H. J. (2016). 688,112 statistical results: Content mining psychology articles for statistical test results. Data, 1(3), Article 14. doi:10.3390/data1030014

[3] Executive Board of the German Psychological Society. (2016, October 20). Statcheck scans full-text articles for statistical errors and publishes the results on PubPeer [Statement]. Retrieved from https://www.dgps.de/stellungnahme_statcheck
