On 11 March 2020, the World Health Organization (WHO) declared the new coronavirus outbreak a pandemic. The novel virus SARS-CoV-2, which causes coronavirus disease 2019 (Covid-19), has brought social life and most of the world economy to an abrupt halt, but it has kept health care workers and researchers extremely busy.
The outbreak triggered a fast and massive response in terms of scientific publications. More than 9,600 papers were indexed in PubMed in the first 119 days after the WHO issued its initial statement regarding a cluster of pneumonia cases in Wuhan, China; see Figure 1A. This is a 25-fold increase compared to the SARS outbreak in 2003, when 379 papers were published over the same period of time. But it is not just the sheer quantity of papers that stands out. Here, we discuss whether the pandemic can change science for the better in three key areas related to reproducibility: open science, rethinking peer-review, and better research.
FIGURE 1 Cumulative number of (A) research papers indexed on PubMed and (B) preprints indexed on Europe PMC, by days since the first WHO statement regarding a novel infectious disease outbreak on 9 January 2020. Data and analysis available at osf.io/ma2x5/.
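For readers who wish to probe counts like those in Figure 1A themselves, PubMed record counts can be retrieved programmatically. The sketch below uses the NCBI E-utilities API; the search term and date field are illustrative assumptions on our part, and the exact query behind the figure is documented at osf.io/ma2x5/.

```python
# A minimal sketch for retrieving PubMed record counts via the NCBI
# E-utilities API. The search term and date field below are assumptions
# for illustration; the exact query behind Figure 1 is at osf.io/ma2x5/.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str, mindate: str, maxdate: str) -> int:
    """Number of PubMed records matching `term`, indexed between the two dates."""
    params = {
        "db": "pubmed",
        "term": term,
        "datetype": "edat",  # Entrez date: when the record was added to PubMed
        "mindate": mindate,  # dates in YYYY/MM/DD format
        "maxdate": maxdate,
        "retmax": 0,         # we only need the total count, not the record IDs
        "retmode": "json",
    }
    response = requests.get(EUTILS, params=params, timeout=30)
    response.raise_for_status()
    return int(response.json()["esearchresult"]["count"])

# Papers indexed in the 119 days following the WHO statement of 9 January 2020
print(pubmed_count('"COVID-19" OR "SARS-CoV-2"', "2020/01/09", "2020/05/07"))
```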
1. Making open science the norm
More than 30 prominent publishing houses agreed to make research papers related to Covid-19, and the data supporting that research, freely available and reusable. Much of this content would normally be kept behind a paywall. In the midst of a global health crisis, open science is indispensable.
The latest discoveries and medical recommendations must be shared quickly and transparently. Only then can doctors and researchers around the world make informed decisions. Data sharing and crowdsourcing enabled researchers to study protein interactions and identify 69 drugs approved by the US Food and Drug Administration (FDA) that might have the potential to treat Covid-19 patients.1 Nextstrain.org is another open science project, tracking the spread of the virus on the basis of genomic data. Another open repository contains epidemiological data for modelling routes of transmission and risk factors for infection,2 and the Allen Institute for AI provides an open database of over 29,000 scholarly articles, to which artificial intelligence is applied to identify the most relevant knowledge amid an overflow of scientific information.
During the pandemic, conflicting information can undermine trust in science. Openness and the sharing of data enable researchers to collaborate, review, and reproduce findings, and such activities strengthen that trust. To move forward, journals and funders should not only continue to encourage data sharing; they must also further strengthen the principles of findable, accessible, interoperable, and reusable (FAIR) data.3 Open science should be the norm. Never before has near real-time analysis of health care data been more necessary.
2. Rethinking peer-review
Publishing scientific work is often a slow process. Studies are carefully evaluated and revised prior to publication, but this can take months. During a pandemic, when time is of the essence, peer-review finds itself under pressure.
Making a study available as a preprint is an effective way to share results quickly, and more than 3,900 Covid-19-related manuscripts have already been deposited on preprint servers, mostly on medRxiv and bioRxiv; see Figure 1B. Certainly, some of these non-peer-reviewed studies will contain methodological flaws. Thus, many preprint servers have issued warnings on their websites that preprints should be treated with caution. But, of course, some preprints will be of great importance. The question is, which ones?
To help identify the most relevant research on Covid-19, some are turning to a platform called Outbreak Science Rapid PREreview, which is dedicated to the rapid review of preprints related to emerging outbreaks.4 Publishers like PLOS, for example, advise their reviewers to use this platform to speed up the review process. The reviews on Outbreak Science Rapid PREreview are open, can be submitted anonymously, and are structured to identify issues with reproducibility. For example, the platform suggests checking whether data are available, whether code is available, and whether sufficient detail is provided to allow replication of a study.
As a consequence of the pandemic, there is now a strong desire for “rapid reviews”. However, speed is not the only issue; the timing of a review is also critical. Reviewing studies before they are conducted ensures proper study design and sound methodology. This is the idea behind registered reports, a format in which study plans are reviewed before data are collected. A number of journals supporting registered reports are now speeding up peer-review for Covid-19-related research, with the goal of completing reviews of study plans within only 7 days.
Reviewing research is time-consuming and, for the reviewers themselves, not particularly rewarding in terms of career development, where one’s own publications count the most. Yet thousands of Covid-19 preprints need to be carefully evaluated to ensure they comply with scientific rigor and best practice. Already, editors of the Journal of the American Medical Association (JAMA) have raised concern about the same patients being reported in multiple Covid-19 reports without clear indication.5 Similarly, clinical prediction models for the diagnosis and prognosis of SARS-CoV-2 infections have been criticized for a high risk of bias when non-representative control patients are used, when patients are excluded, or when models are overfitted.6 It is the job of peer-review to raise such concerns before publication.
This “epidemic” of Covid-19 publications challenges peer-review and has already produced some flawed results that have contributed to the spread of misinformation. Paradoxically, while peer-review is key, few researchers receive formal training in it. How can peer-review be strengthened so that, when the next pandemic arrives, papers can be scrutinized properly, speedily, and efficiently? In major medical journals, it is common practice for statistical reviewers to ensure that manuscripts are methodologically sound and meet guidelines for transparent and accurate reporting.7 More young researchers should be trained in methodology and peer-review to spot common statistical pitfalls and misleading interpretations, and they should start reviewing early in their careers. Free online courses are available, for example from Nature,8 but universities should also provide such courses. Ultimately, scientists should receive official recognition for peer-review activities, for example through websites such as Publons.
3. Fast science, better research?
Millions of at-risk individuals hope for an effective treatment for Covid-19, but there are no proven treatments so far. Hydroxychloroquine (a malaria drug) and azithromycin (an antibiotic) became prominent as potential treatments for Covid-19 after a study reported that the drugs reduced viral load in patients.9 However, the study in question has some notable flaws: the sample of just 36 patients is too small, there was no random allocation to treatment and placebo groups, and there was no patient-centred outcome (e.g., death, days in intensive care). Thus, researchers should not jump too quickly to conclusions.
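A rough power calculation makes the sample-size problem concrete. The sketch below assumes, purely for illustration, a viral clearance rate of 50% under control and 80% under treatment; these rates are hypothetical and not taken from the study itself.

```python
# A back-of-the-envelope power calculation for a two-arm trial of 36 patients.
# The assumed clearance rates (50% control vs 80% treatment) are hypothetical,
# chosen for illustration; they are not estimates from the study in question.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

effect = proportion_effectsize(0.80, 0.50)  # Cohen's h for the two proportions

power = NormalIndPower().solve_power(
    effect_size=effect,
    nobs1=18,     # 18 patients per arm in a 36-patient trial
    alpha=0.05,   # conventional two-sided significance level
    ratio=1.0,    # equally sized arms
)
print(f"power = {power:.2f}")  # roughly 0.5, well below the usual 0.80 target
```

Even under this generously large hypothetical effect, such a trial would detect it only about half the time; for smaller, more realistic effects, the power drops further.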
Large, multicenter randomized trials – the gold standard for testing the efficacy of treatments – are now being undertaken to collect reliable evidence, but they need more time to complete. A trial called Recovery, underway in the UK, involves thousands of patients from 132 hospitals and tests a set of promising candidate drugs, among them hydroxychloroquine. Another trial, Epicos, will study the effect of tenofovir (an HIV drug) and hydroxychloroquine for the prevention of Covid-19 in 4,000 health care workers in Spain. Yet another, Solidarity, a global trial led by the WHO, is testing hydroxychloroquine, lopinavir and ritonavir (two HIV drugs), remdesivir (an Ebola drug), and interferon-beta-1a (a multiple sclerosis drug) in thousands of patients across many countries. These large collaborative efforts will ultimately show what works and what does not, and randomized experiments are the best way to find out.
Better research must balance quality and scientific rigor against speed. If trial phases are skipped, there is a danger that any new vaccine against Covid-19 could do more harm than good. Compared to pre-existing drugs that are now being evaluated for repurposing, a new vaccine that has never been used in a large number of humans requires very careful evaluation. No quality compromises are acceptable here, but modern pathways for vaccine development, such as conditional marketing authorisation (CMA), may help to balance benefits and risks. In the case of Ebola, CMA provided early access to an urgently needed vaccine.
Governments and their advisors require high-quality summaries of “what is known”, by means of synthesized information based on systematic reviews and meta-analyses. Such reviews must avoid the misinterpretation of statistically non-significant results as evidence of no effect. For example, when a single study gives inconclusive evidence about the effect of a drug, the conclusion should be that there is “no or little evidence of an effect” rather than “evidence of no effect”.10 Recognizing that non-significant findings can still lend evidential support to a hypothesis when combined in a meta-analysis is crucial for accelerating the emergence of reliable insights.
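The distinction can be made concrete with a small numerical example. In the sketch below, the event counts are invented: a hypothetical trial whose result is non-significant, yet whose confidence interval is compatible with anything from a large benefit to no effect at all.

```python
# "Absence of evidence is not evidence of absence": a hypothetical small trial
# with invented counts, where the 95% CI for the risk ratio includes both
# "no effect" (RR = 1) and a substantial benefit.
import math

events_trt, n_trt = 6, 20    # events on treatment (hypothetical)
events_ctl, n_ctl = 10, 20   # events on control (hypothetical)

rr = (events_trt / n_trt) / (events_ctl / n_ctl)
# Standard error of log(RR), using the usual large-sample formula
se = math.sqrt(1/events_trt - 1/n_trt + 1/events_ctl - 1/n_ctl)
z = 1.96  # 97.5% quantile of the standard normal, for a 95% CI
lower = math.exp(math.log(rr) - z * se)
upper = math.exp(math.log(rr) + z * se)

print(f"RR = {rr:.2f}, 95% CI [{lower:.2f}, {upper:.2f}]")
# Output: RR = 0.60, 95% CI [0.27, 1.34]. The interval spans a 73% risk
# reduction to a 34% increase, so the honest conclusion is "little evidence
# of an effect", not "evidence of no effect".
```

A meta-analysis that pools several such inconclusive studies can narrow the combined interval and reveal a genuine effect that no single study could establish on its own.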
Faster, better, stronger?
The scientific world moves at a much faster pace now. The Covid-19 pandemic has produced a flood of academic papers, but how well does science during a pandemic align with good research practice and reproducibility? It is too early to say, but the risk of fast science is that quality suffers. To be convincing, science must be based on sound research methodology, but pressure for fast results can make it harder for scientists to adhere to good practices.
To support researchers, courses in good research practice should be widely adopted to address highly relevant topics like study design, open science, statistics, and reproducibility. Research institutions are well advised to train their next generation of young researchers, and preparation must also include the training of teams for rapid synthesis of relevant evidence. We cannot be prepared enough for the next global health crisis.
About the authors
Simon Schwab is a postdoctoral fellow in the Center for Reproducible Science at the University of Zurich (UZH).
Leonhard Held is a professor of biostatistics at UZH and director of the Center for Reproducible Science at UZH.
References
- Gordon, D. E., Jang, G. M., Bouhaddou, M., et al. (2020) A SARS-CoV-2-human protein-protein interaction map reveals drug targets and potential drug-repurposing. bioRxiv. doi:10.1101/2020.03.22.002386
- Xu, B., Kraemer, M. U. G. and Open COVID-19 Data Curation Group (2020) Open access epidemiological data from the COVID-19 outbreak. Lancet Infect Dis. doi:10.1016/S1473-3099(20)30119-5
- Sim, I., Stebbins, M., Bierer, B. E., et al. (2020) Time for NIH to lead on data sharing. Science, 367(6484), 1308–1309. doi:10.1126/science.aba4456
- Johansson, M. A. and Saderi, D. (2020) Open peer-review platform for COVID-19 preprints. Nature, 579(7797), 29. doi:10.1038/d41586-020-00613-4
- Bauchner, H., Golub, R. M. and Zylke, J. (2020) Editorial concern: possible reporting of the same patients with COVID-19 in different reports. JAMA. doi:10.1001/jama.2020.3980
- Wynants, L., Van Calster, B., Bonten, M. M. J., et al. (2020) Prediction models for diagnosis and prognosis of Covid-19 infection: systematic review and critical appraisal. BMJ, 369, m1328. doi:10.1136/bmj.m1328
- Moher, D. (2018) Reporting guidelines: doing better for readers. BMC Medicine, 16(1), 233. doi:10.1186/s12916-018-1226-0
- Glezerson, B. A. and Bryson, G. L. (2019) Focus on peer review: an online peer review course by Nature Masterclasses. Canadian Journal of Anesthesia, 66(3), 348–349. doi:10.1007/s12630-018-1254-4
- Gautret, P., Lagier, J.-C., Parola, P., et al. (2020) Hydroxychloroquine and azithromycin as a treatment of COVID-19: results of an open-label non-randomized clinical trial. Int J Antimicrob Agents, 105949. doi:10.1016/j.ijantimicag.2020.105949
- Altman, D. G. and Bland, J. M. (1995) Statistics notes: absence of evidence is not evidence of absence. BMJ, 311(7003), 485.