Tracing the Footsteps of the Data Reproducibility Crisis

Have you found it challenging to replicate the results of your own or somebody else’s experiments? You are not alone. A member survey conducted by the American Society for Cell Biology (ASCB) revealed that out of 869 respondents, 72% had trouble reproducing the findings of at least one publication1. In a more comprehensive study by the Nature Publishing Group, over 60% and 70% of researchers surveyed in medicine and biology, respectively, reported failure in replicating other researchers’ results2. And out of the 1,576 scientists surveyed in various fields, 90% agreed that there is a reproducibility crisis in scientific literature2.

In case you are wondering, both surveys were conducted between 2014 and 2015, and since then a consensus has grown that data reproducibility is a serious challenge. But how did we get here?

Beginnings of a Crisis

In 2011, scientists at Bayer HealthCare in Germany published an article in Nature Reviews Drug Discovery reporting inconsistencies between published data and in-house target validation studies3. Of the 67 target identification and validation projects they analyzed for data reproducibility, 43 showed inconsistencies serious enough to result in project termination3. Through this review, the Bayer researchers sought to raise awareness of the challenges of reproducing published data and called for confirmatory studies before investing in downstream drug development projects3.

Close on the heels of Bayer’s report, researchers at Amgen described their attempts to replicate the results of published oncology studies in a 2012 Nature commentary4. Having confirmed the findings of only 6 of the 53 landmark publications they reviewed, the Amgen scientists outlined recommendations to improve the replicability of preclinical studies4.

These publications spurred data reproducibility conversations within the biomedical research community, giving rise to a wave of initiatives to analyze and address the problem.

Data Reproducibility Gaining Momentum

Reproducibility of research data depends, in part, on the specific materials used in the experiment. But how often are research reagents referenced in sufficient detail? A study found that 54% of resources reported in publications, including model organisms, antibodies, reagents, constructs, and cell lines, were not uniquely identifiable5. In order to promote proper reporting of research materials used, the NIH has recommended that journals expand or eliminate the limits on the length of methods sections17.

When you had challenges reproducing data in your lab, were you able to identify what caused them? In another publication, study design, biological reagents and reference materials, laboratory protocols, and data analysis and reporting were identified as the four primary causes of experimental irreproducibility6. As a result, an estimated 50% of the U.S. preclinical research budget, or $28 billion a year, was reportedly being spent on data that is not reproducible6.

Based on feedback from researchers in academia, biopharmaceutical companies, journal editors, and funding agency personnel, the Global Biological Standards Institute (GBSI) developed a report highlighting the need for a standards framework in life sciences7.

Changes Instituted by Granting Agencies and Policy Makers

In the face of data reproducibility challenges, government agencies that fund research, including the National Institutes of Health (NIH) and the National Science Foundation (NSF), have developed action plans to improve the reproducibility of research8,9. The NIH has also revised its criteria for grant applications8. That means researchers will need to report details of experimental design and biological variables, and authenticate research materials, when applying for grants8.

The Academy of Medical Sciences (UK), the German Research Foundation (DFG), and the InterAcademy Partnership for Health (IAP for Health) identified specific activities to improve reproducibility of published data10,11,12.

Recommendations on Use of Standards, Best Practices, and Reagent Validation

Among the organizations championing the development of standards and best practices to improve the reproducibility of biomedical research are:

  • Federation of American Societies for Experimental Biology (FASEB), with recommendations on the use of mouse models and antibodies13
  • American Statistical Association (ASA), with a report on statistical analysis best practices when publishing data14
  • Global Biological Standards Institute (GBSI), with recommendations on additional standards in life science research, and antibody validation and cell line authentication groups formed in partnership with life science vendors, academia, industry, and journal publishers15
  • Science Exchange, with efforts at independent validation of experimental results16

Changes to Publication Guidelines

Journal groups have been revising author instructions and publication policies to encourage scientists to publish data that is robust and replicable. That means important changes regarding the reporting of study design, replicates, statistical analyses, and reagent identification and validation are coming your way.

  • The NIH and journal publishing groups including Nature, Science, Cell, Journal of Biological Chemistry, Journal of Cell Biology, and the Public Library of Science (PLOS), among others, have developed and endorsed principles and guidelines for reporting preclinical research. These guidelines cover statistical analysis, transparency in reporting, data and material sharing, refutations, screening of image-based data (e.g., Western blots), and unique identification of research resources (antibodies, cell lines, animals)17
  • The Center for Open Science (COS) developed the Transparency and Openness Promotion (TOP) guidelines framework for journal publishers. Signatories include journal publication groups such as AAAS, ASCB, BioMed Central, F1000, Frontiers, Nature, PLOS, Springer, and Wiley, among others18

Emphasis on Training

To train scientists in proper study design and data analysis, the NIH has developed training courses and modules19. A number of universities also offer courses in study design and statistics20.

With revisions to grant applications and publication guidelines, the use of standards, reagent validation, and the need for consistent training in methods and technique, changes are coming your way. Is your lab prepared? Let us help you get there. See what has changed for publishing Western blot data and get your entire lab trained to generate consistent and reproducible Western blot data at Lambda U™.

References:

  1. How Can Scientists Enhance Rigor in Conducting Basic Research and Reporting Research Results? American Society for Cell Biology. Web. Accessed October 6, 2017.
  2. Baker M. 2016. 1,500 scientists lift the lid on reproducibility. Nature (News Feature) 533, 452-454. doi:10.1038/533452a
  3. Prinz F, Schlange T, Asadullah K. 2011. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov 10, 712-713. doi:10.1038/nrd3439-c1
  4. Begley CG, Ellis LM. 2012. Drug development: Raise standards for preclinical cancer research. Nature 483, 531-533. doi:10.1038/483531a
  5. Vasilevsky NA, Brush MH, Paddock H, et al. On the reproducibility of science: unique identification of research resources in the biomedical literature. Abdullah J, ed. PeerJ. 2013;1:e148. doi:10.7717/peerj.148.
  6. Freedman LP, Cockburn IM, Simcoe TS. 2015. The Economics of Reproducibility in Preclinical Research. PLOS Biology 13(6): e1002165. doi:10.1371/journal.pbio.1002165
  7. The Case for Standards in Life Science Research – Seizing Opportunities at a Time of Critical Need. The Global Biological Standards Institute. Web. Accessed November 16, 2017.
  8. Enhancing Reproducibility through Rigor and Transparency. National Institutes of Health. Web. Accessed October 6, 2017.
  9. A Framework for Ongoing and Future National Science Foundation Activities to Improve Reproducibility, Replicability, and Robustness in Funded Research. December 2014. National Science Foundation. Web. Accessed October 6, 2017.
  10. Reproducibility and Reliability of Biomedical Research. The Academy of Medical Sciences (UK). Web. Accessed October 6, 2017.
  11. DFG Statement on the Replicability of Research Results. The Deutsche Forschungsgemeinschaft (DFG – German Research Foundation). Web. Accessed October 6, 2017.
  12. A Call for Action to Improve the Reproducibility of Biomedical Research. The InterAcademy Partnership for Health. Accessed October 6, 2017.
  13. Enhancing Research Reproducibility: Recommendations from the Federation of American Societies for Experimental Biology. Federation of American Societies for Experimental Biology. Web. Accessed October 6, 2017.
  14. Recommendations to Funding Agencies for Supporting Reproducible Research. American Statistical Association. Web. Accessed October 6, 2017.
  15. Reproducibility2020. The Global Biological Standards Institute™. Web. Accessed October 6, 2017.
  16. Validation by the Science Exchange network. Science Exchange. Web. Accessed November 16, 2017.
  17. Principles and Guidelines for Reporting Preclinical Research. Rigor and Reproducibility. National Institutes of Health. Web. Accessed November 16, 2017.
  18. Transparency and Openness Promotion (TOP). Center for Open Science. Web. Accessed October 6, 2017.
  19. Training. Rigor and Reproducibility. National Institutes of Health. Web. Accessed November 16, 2017.
  20. Freedman LP, Venugopalan G, Wisman R. 2017. Reproducibility2020: Progress and priorities [version 1; referees: 2 approved]. F1000Research 6:604. doi:10.12688/f1000research.11334.1