Fletcher, Samuel C. (2021) How (not) to measure replication. European Journal for Philosophy of Science, 11 (57). pp. 1-27. ISSN 1879-4912
Abstract
The replicability crisis refers to the apparent failures to replicate both important and typical positive experimental claims in psychological science and biomedicine, failures which have gained increasing attention in the past decade. In order to provide evidence that there is a replicability crisis in the first place, scientists have developed various measures of replication that help quantify or "count" whether one study replicates another. In this nontechnical essay, I critically examine five types of replication measures used in the landmark article "Estimating the reproducibility of psychological science" (OSC 2015) based on the following techniques: subjective assessment, null hypothesis significance testing, comparing effect sizes, comparing the original effect size with the replication confidence interval, and meta-analysis. The first four, I argue, remain unsatisfactory for a variety of conceptual or formal reasons, even taking into account various improvements. By contrast, at least one version of the meta-analytic measure does not suffer from these problems. It differs from the others in rejecting dichotomous conclusions, the assumption that one study replicates another or not simpliciter. I defend it from other recent criticisms, concluding however that it is not a panacea for all the multifarious problems that the crisis has highlighted.
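To make two of these measures concrete, here is a minimal sketch in Python using entirely hypothetical effect sizes and standard errors (the paper itself contains no code, and the variable names and numbers below are illustrative assumptions, not OSC 2015 data). It shows the fourth measure, checking whether the original effect size falls within the replication's 95% confidence interval, and a simple fixed-effect meta-analytic pooling of the two estimates, which yields a graded summary rather than a yes/no verdict.

```python
# Illustrative sketch only: hypothetical numbers, not from the paper or OSC (2015).
import math

# Hypothetical original and replication estimates with their standard errors.
orig_effect, orig_se = 0.45, 0.10
rep_effect, rep_se = 0.20, 0.08

# Confidence-interval measure: does the original effect size lie inside the
# replication's 95% confidence interval (normal approximation)?
z = 1.96
rep_ci = (rep_effect - z * rep_se, rep_effect + z * rep_se)
ci_replicates = rep_ci[0] <= orig_effect <= rep_ci[1]
print(f"Replication 95% CI: ({rep_ci[0]:.2f}, {rep_ci[1]:.2f}); "
      f"contains original effect: {ci_replicates}")

# Meta-analytic measure: fixed-effect pooling with inverse-variance weights,
# giving a continuous estimate instead of a dichotomous replicate/fail verdict.
w_orig, w_rep = 1 / orig_se**2, 1 / rep_se**2
pooled = (w_orig * orig_effect + w_rep * rep_effect) / (w_orig + w_rep)
pooled_se = math.sqrt(1 / (w_orig + w_rep))
print(f"Fixed-effect pooled estimate: {pooled:.2f} (SE {pooled_se:.2f})")
```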
Item Type: Published Article or Volume
Creators: Fletcher, Samuel C.
Keywords: Replicability crisis, Reproducibility crisis, Null hypothesis significance testing, Effect size, Confidence interval, Meta-analysis
Subjects: General Issues > Data; General Issues > Evidence; General Issues > Experimentation; Specific Sciences > Probability/Statistics; Specific Sciences > Psychology
Depositing User: Prof. Samuel C. Fletcher
Date Deposited: 17 Jun 2021 19:28
Last Modified: 17 Jun 2021 19:28
Item ID: 19195
Journal or Publication Title: European Journal for Philosophy of Science
Publisher: Springer
Official URL: https://link.springer.com/article/10.1007%2Fs13194...
DOI or Unique Handle: 10.1007/s13194-021-00377-2
Date: 3 June 2021
Page Range: pp. 1-27
Volume: 11
Number: 57
ISSN: 1879-4912
URI: https://philsci-archive.pitt.edu/id/eprint/19195