Abstract
An estimated 82–89% of ecological research and 85% of medical research have limited or no value to the end user because of various inefficiencies. We argue that registration and registered reports can enhance the quality and impact of ecological research. Drawing on evidence from other fields, chiefly medicine, we support our claim that registration can reduce research waste. However, increases in registration rates, quality and impact will be slow without the coordinated effort of funders, publishers and research institutions. We therefore call on them to facilitate the adoption of registration by providing adequate support. We outline several aspects to be considered when designing a registration system that would best serve the field of ecology. To further inform the development of such a system, we call for more research to identify the causes of low registration rates in ecology. We suggest short- and long-term actions to bolster registration and reduce research waste.
Main
The extent of avoidable waste in ecological research is alarmingly high (estimated to be 82–89%1 on the basis of 10,464 ecological studies). This wasted research encompasses all research with limited or no value to end users (Box 1). Consequently, valuable information that could otherwise advance knowledge, guide future research, and inform interventions and policies is lost. Research waste is particularly worrying in ecology, a field that has a central role in solving global challenges and reaching the Sustainable Development Goals2. Research waste has also been estimated in health research3, with 85% of research being wasted (the details are provided in Table 1).
Three main components of research waste are unpublished research (estimated at 45% of research in ecology1), low-quality studies (estimated at 67% of studies in ecology1) and under-reported results in published studies (estimated at 41% of results in ecology1). Table 1 contains details on how these estimates were obtained.
Many pathways to reducing waste involve open science practices1. For example, published research with improper analysis can be re-analysed if the data are open. Reporting guidelines (for example, PRISMA-EcoEvo4 and ROSES5) can also reduce waste, as they ensure sufficient reporting of results and methods. Many open-science-related changes in ecology have gained substantial visibility among researchers, funders and publishers (for example, increases in funders’ and publishers’ policies on sharing data6 and code7). However, another practice that can reduce waste, but that has received less attention in ecology and is rarely used, is the registration of studies.
In this Perspective, we argue that study registration (both preregistration8,9,10 and registered reports11,12) could substantially reduce research waste in ecology (and other fields). Preregistration and registered reports share the common principle that the research plan is pre-specified before the research is conducted (Fig. 1). Registration could reduce waste through several mechanisms, outlined in Table 1 and discussed later. Registration also increases transparency and facilitates the identification of justified and unjustified modifications to the original study plan and reporting. Here we examine the existing evidence (from any field) for the benefits of registration in reducing research waste. The evidence largely comes from medicine, a field where registries of clinical trials have been in substantial use since at least 200013, and from psychology, which initiated registered reports in 201311.
Fig. 1 | a, In preregistration, a study protocol is submitted to a registry before data collection or data analysis (if working with already collected data). The protocol is commonly not peer reviewed. After completion, the preregistered study can be submitted to a journal, and its results can be added to the registry (regardless of whether the study is published or not). b, In registered reports, the study protocol is submitted to a journal and peer reviewed (stage 1 peer review). Any feedback is incorporated into the revised protocol. The completed study is then submitted for stage 2 peer review. This figure is adapted from Center for Open Science (https://www.cos.io/initiatives/registered-reports), CC BY 4.0.
Preregistration and registered reports in ecology
The research plan can be specified before the research is conducted. This is commonly done via one of two related processes: (1) preregistration, where the protocol is posted in a registry (repository) independently of its eventual publication, and (2) registered reports, where the protocol is peer reviewed by the journal that will eventually publish the results (Fig. 1). Note that in medicine the term ‘registration’ denotes preregistration. In this Perspective, however, we use the term ‘registration’ to encompass both preregistration and registered reports.
Preregistration is a publicly documented research plan (for example, questions, hypotheses, data collection plan and analysis plan) that is registered before data collection starts, before viewing the data if working with pre-existing data, or before the research results are known8,9,10. This can be done by storing the study plan in a (commonly read-only) public repository, such as the Open Science Framework (OSF)14 Registries (Box 1; https://help.osf.io/article/145-preregistration) or the National Library of Medicine’s Clinical Trials Registry (https://clinicaltrials.gov/). Researchers can make a preregistration publicly accessible immediately or after an embargo period. Although preregistration should ideally be prospective, it is sometimes done retrospectively (that is, after some part of the research has already been conducted). As pointed out by Hardwicke and Wagenmakers10, preregistration “reduces the risk of bias by encouraging outcome-independent decision-making and increases transparency, enabling others to assess the risk of bias and calibrate their confidence in research outcomes”. Preregistration is probably uncommon in ecology: only 3% of systematic reviews and meta-analyses published in ecology and evolutionary biology have been preregistered4 (the registration rate of the primary literature has not been estimated).
The registered report11,12 is a publication format in which a study’s design and methods undergo stage 1 peer review (Box 1) by a journal before data collection (or before data access or analysis, if working with already collected data). Upon passing stage 1 peer review and completion of the research, the final article with results and discussion undergoes stage 2 peer review (Box 1). The acceptance of a registered report depends on the relevance of the research topic, the thorough development of the research questions or hypotheses and the robustness of the methodological approach, rather than on the results obtained. This format promotes methodological rigour, helps to reduce publication bias and enhances transparency15. An increasing number of journals publishing ecological and evolutionary biology research are introducing registered reports. As of August 2023, 24 such journals offer registered reports (dataset 1 from ref. 16), including Nature Ecology & Evolution, Ecology and Evolution (Wiley), Ecological Solutions and Evidence (Wiley) and PLoS ONE, among others. However, only a few registered reports have been published in ecology (20 between 2016 and the end of 2023; dataset 1 in ref. 16).
Decreasing research waste via registration
Registration could reduce several components of research waste and improve study quality, thereby bolstering the robustness and reliability of results (Table 1). To evaluate the evidence for and against these claims, we conducted an exploratory survey (the details are provided in the Supplementary Information) to gather meta-studies (Box 1), published in any field, that quantified the impact of preregistration or registered reports on the methodological quality of research, on the completeness of reporting or on the features of obtained results (effects). We identified 26 (refs. 17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42) such meta-studies (n = 19 from medicine, n = 6 from psychology and n = 1 covering both fields). The majority (n = 21) compared published preregistered studies with non-preregistered ones, and five compared registered reports with standard literature (the details are provided in dataset 2 from ref. 16). Given our search terms (Supplementary Information), any meta-studies we might have missed should not be biased towards studies with certain conclusions. We did not conduct a critical appraisal of the included meta-studies (assessing the risk of bias and potential confounders). The reference list we obtained could serve as a starting point for a systematic review of the topic.
Below, we discuss the potential benefits of registration practices and the evidence for these benefits. We also suggest some additional features that a registration system should offer for some of these benefits to be best realized.
Improve study planning and prevent duplication
It is estimated that 67% of studies in ecology are poorly planned1. Only one meta-study examined differences in methodological quality between registered reports and the standard literature: registered reports showed more rigorous methodology, higher methodological quality and better alignment between the research question and the methodology42 (see the study for details on how these variables were obtained). This is probably the outcome of the quality checks via stage 1 peer review and the consequent revisions to the work plan before the study is conducted. However, preregistrations are rarely reviewed (with rare exceptions such as the Australian New Zealand Clinical Trials Registry: https://www.anzctr.org.au/Default.aspx and https://www.anzctr.org.au/docs/registration%20process%20flow%20chart.pdf). We thus advocate for enabling quality checks of preregistered work by statisticians or other relevant experts.
Even without external review, preregistration can improve study design. One reason is the use of preregistration templates that outline important elements of a robust study design (for example, randomization and blinding). All the meta-studies on different aspects of quality of clinical trials show that, compared with non-registered studies, preregistered published trials had better methodological quality (one meta-study17, based on PEDro scale43), lower risk of bias (six meta-studies18,19,20,21,22,23) and larger sample sizes (nine meta-studies17,18,19,20,24,25,26,27,28; the details are provided in dataset 2 of ref. 16). For example, Won et al.18 found that prospectively registered studies had a lower risk of bias in random sequence generation, allocation concealment and selective outcome reporting.
Finally, if registries of studies were made searchable (as we discuss later), they could be used to avoid research duplication.
Reduce questionable research practices
Registration can reduce questionable research practices (QRPs) such as P-hacking (Box 1). The results of published studies that were preregistered had smaller effect sizes (found in five of five meta-studies21,24,29,30,31 on the topic), less often supported the hypothesis (found in four of five meta-studies26,27,32,33,34) and had lower statistical significance (found in one of one meta-study35) than published studies that were not preregistered (the details are provided in dataset 2 of ref. 16). For example, Schäfer and Schwarz31 found that preregistered studies in psychology (n = 93) reported smaller effects (median r = 0.16) than non-preregistered studies (n = 900, median r = 0.36). Similar trends were found in four meta-studies38,39,40,41 that compared results reported in registered reports with those in the standard literature. For instance, Brohmer et al.38 found that published studies reported larger effects (Hedges’ g = 0.42) than unpublished studies and published registered reports (Hedges’ g = −0.01).
Reduce publication bias and increase the availability of research results
An estimated 45% of ecological research is never published1. Reasons include the lack of time, low-quality work that is consequently not publishable or the publication of biased sets of results (usually those that are statistically significant)44,45. Registration could reduce waste caused by unpublished research and specifically counter publication bias. Registered reports do exactly this—the results do not influence the acceptance or publication of the manuscript, and all the pre-specified results must be reported. Indeed, Scheel et al.39 found that in psychology 96% of the standard literature (n = 152) had positive results, whereas only 44% of the registered reports (n = 71) had positive results, demonstrating the potential impact of registered reports in reducing publication bias (and QRPs).
Although it is not commonly done, registries of preregistered studies could also publish the results (those specified during registration and any additional ones) of the registered study, regardless of whether the study is published in a journal. These results could be made accessible via the registry where the study was preregistered. For clinical trials, result publication is often required by international policies, journals and others46, leading to potentially more results being reported in the registries than published via journals. For example, out of 905 preregistered trials at ClinicalTrials.gov that were later published, 72% reported their primary outcomes in the registry but only 22% did so in the published article47. Meta-studies from our exploratory survey17,18,22,48,49,50,51,52,53 (n = 9) found that many clinical trials show discrepancies between the outcomes and results reported in the published study and those in its registry entry. Where it was possible to assess, these discrepancies were in favour of statistically significant results (in eight of nine meta-studies; dataset 3 in ref. 16).
A similar approach, where the results of preregistered studies would be available via registries, could be applied in ecology, increasing the availability of results and the potential impact of studies not published in journals. We note that results in ecology come from a much larger variety of study designs than those in medicine. Thus, the reporting of results in registries could follow basic reporting guidelines and standards but otherwise take a freer format.
Reduce the issue of under-reported results
Registration could reduce issues with under-reported results (for example, reporting only a P value without an associated effect size or sample size), estimated at 41% of results in ecology1. This is because preregistration templates and guidelines clearly outline important components of the methodological process (such as sample size) that must be specified during preregistration. Indeed, medical journal articles that were preregistered reported methodological details more completely than unregistered articles25,26,27,36 (dataset 2 in ref. 16). However, none of the meta-studies obtained via our exploratory survey focused on the completeness of result reporting. We hypothesize that, if registries allowed for result reporting (with some basic standards), reporting in the published version of the study would improve as well.
Increase the availability of data and software
Although currently not commonly done, preregistration and registered reports could also include a short section on data and software management. This would probably improve the availability of data and software, which would in turn eliminate some of the research waste. First, the publication of raw data would eliminate the waste caused by studies that never publish any results, because the data could be used elsewhere. Second, the publication of raw data would reduce waste due to incorrect analysis (which occurs in an estimated 47.1% of studies in ecology1) because the correct analysis could be applied later (that is, after the paper was published). Third, raw data could be used to understand under-reported results (for example, if an effect size published in a study lacks the sample size). Data processing and analytical code would further improve the completeness of the results (for example, by reproducing an effect size for which only a P value has been supplied in the publication).
Further benefits to researchers are discussed elsewhere and include a reduced workload down the line (for example, when reporting the study methodology), greater transparency, easier searching for and refining of ideas, and greater trust within the community54,55,56,57. Registration could also facilitate sounder funding allocation as well as savings in financial, human and time resources (for example, see Wieschowski et al.54). Although registration has costs (for example, the time invested in creating the registration), the benefits should outweigh them, as found in a survey of 355 researchers58. Furthermore, and as we discuss later, funders and publishers could greatly reduce the cost of registration to researchers.
Potential issues
Study preregistration is neither a ‘magic bullet’ nor a quick solution for increasing research quality and decreasing research waste. In the decades since Simes59 argued the case for universal preregistration of clinical trials, the needed infrastructure and processes have gradually been put in place. However, progress towards all trials being preregistered has been slow. This led to the AllTrials campaign, launched in 2013 with the aim of having “All trials registered; all results reported” (https://www.alltrials.net/). Many ethics committees, funders and publishers now require trial preregistration, and this would not have been possible without that infrastructure and culture change.
However, policies do not guarantee that clinical trials will be prospectively preregistered19,60, and preregistration does not necessarily translate into publications free of selective reporting33,60,61. Although preregistration of clinical trials is getting closer to 100%, we are still a way off from all results being reported. For decades, the reporting lingered at around 50%62, but recent analyses show improvement. For example, of 1,970 trial registrations on ANZCTR, only 541 (27%) remained unpublished 10 to 14 years later, and the proportion of published trials increased by 7% from 2007 to 201163. Preregistrations are sometimes of low quality17,64, and published studies often differ from their preregistered versions in methodology65, outcome measures61 and result reporting66,67. This outcome measure and result reporting bias is often (but not always—for example, see ref. 61) in favour of statistically significant results, larger effects and effects that support hypotheses that were tested17,18,22,48,49,50,52,53 (dataset 3 from ref. 16). Although preregistration cannot eliminate publication bias53,68 or QRPs, registration will make the underlying process (planned data collection and analyses, and any deviation from these) transparent, and thus aid better interpretation and evaluation of study results10.
Developing and maintaining an efficient preregistration system will be costly. Although data on the costs and benefits of preregistration are yet to be properly collected and evaluated, we trust that the benefits should outweigh the costs. For example, the estimated annual waste of US $170 billion invested in medical research69 is much larger than the 2007 budget for ClinicalTrials.gov (US $3 million70). Other issues that require further discussion include research-field-appropriate preregistration procedures and content, uniformity of registrations (for example, does the registering authority have equal criteria for all types of studies?), and procedures to ensure the timely review of preregistration and registered reports.
Researchers have concerns regarding preregistration and registered reports, such as potential limitations on exploratory research, whether the approach will stifle innovation and creativity, and the time and effort required to complete the preregistration process11,54,57. These concerns could partly be addressed by agreed registration standards (for example, what to register), better support for registration and an appropriate set of incentives, all of which would support a change in research culture in which registration becomes the norm rather than the exception.
Supporting registration in ecology
To enable more and better-quality registration in ecology, and to boost the use of the information contained in registries (for example, the results of registered studies), funders, publishers and research institutions should aim for several long-term goals and short-term activities. In the long run, a strong registration system should involve adequate infrastructure, guidelines and templates, an adequate support system, actionable incentives and mandates, and an ongoing evaluation of the different aspects of the registration system (meta-studies on barriers to registration, evaluation of the effectiveness of registration and similar). We expand on these below and suggest discussion points on several aspects of a registration system that would best support ecological research. These include some existing aspects, such as registering the data collection design, calculating power or alternatives71 (as in medicine), and specifying the data analysis plan (as often required in psychology). However, we also suggest considering several novel aspects (for example, alternative ways of registering, adding data and software management plans to registrations, and facilitated searches for registered studies and their results).
The infrastructure for registration
We suggest several considerations to be discussed (via, for example, forums, workshops and surveys) with the ecological research community before deciding on the best features of the infrastructure for registration. First, should the infrastructure be based on the existing platforms (such as OSF) for preregistration, or should a separate registry be developed? If the latter, would it be valuable for a registry to capture a broader set of fields, such as environmental sciences (which would also probably facilitate searches for registered studies)? Second, would researchers be more likely to preregister their studies if the infrastructure supported alternative ways of registering? For example, a researcher could first register their hypothesis and data collection plan, and register the analysis plan later. We have already informally discussed such a registration format (that is, modular registration) at a Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology (SORTEE) conference in 2021, and it was favourably received by the attendees. However, we call for further evaluation of this and other options. For example, although modular registration could increase the use of preregistration, it could also decrease the impact of registration on research quality. Of relevance, platforms such as Octopus (https://www.octopus.ac/) and ResearchEquals (https://www.researchequals.com/) let researchers publish their work in a modular manner, although their focus is not registration.
Such infrastructure should enable submission of the results of preregistered studies to a registry and set standards for result reporting. For instance, the ClinicalTrials.gov Protocol Registration and Results System is a web-based data entry system that enables users to submit results information for a registered study (https://classic.clinicaltrials.gov/ct2/manage-recs/submit-study). Funders or institutions should then require the results of studies to be published in a registry (especially those results that are not published via a journal), which would increase the availability of results and counter publication bias. Furthermore, the infrastructure should provide a user-friendly search interface and expose metadata (Box 1) on registered studies to search engines and platforms, as illustrated in the sketch below. In this way, registries would enable the search of registered studies by third parties and the identification of work that has been conducted but remains unpublished via traditional routes. To our knowledge, such a system connected to search platforms has not yet been developed.
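As a concrete, purely illustrative example of the kind of machine-readable metadata such a registry could expose, the minimal sketch below serializes a hypothetical registration record as JSON-LD using schema.org-style fields. The study title, identifier and the 'resultsDepositedInRegistry' flag are our own assumptions rather than features of any existing registry.

```python
# Minimal sketch (our illustration, not an existing registry schema) of machine-readable
# metadata a registry could expose for each registered study so that search engines and
# third-party platforms can index it. Field names loosely follow schema.org conventions;
# the concrete schema an ecology registry would adopt is an open design choice.
import json

registration_record = {
    "@context": "https://schema.org",
    "@type": "ScholarlyArticle",                 # a dedicated 'StudyRegistration' type could be defined
    "name": "Effect of hedgerow restoration on pollinator diversity",  # hypothetical study title
    "identifier": "https://osf.io/abc12",        # hypothetical registration identifier
    "datePublished": "2024-03-01",               # date the protocol was registered
    "author": [{"@type": "Person", "name": "A. Researcher"}],
    "keywords": ["pollinators", "hedgerows", "study registration"],
    "additionalProperty": {                      # illustrative flag for results deposited in the registry
        "@type": "PropertyValue",
        "name": "resultsDepositedInRegistry",
        "value": False,
    },
}

# Serialized as JSON-LD and embedded in the registry's landing pages (or harvested via an
# API or sitemap), such records would let third parties find registered-but-unpublished work.
print(json.dumps(registration_record, indent=2))
```

Embedding records of this kind in registry landing pages, or exposing them through an API, is one plausible route by which third-party search platforms could harvest and index registered studies whose results never reach a journal.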
Introducing registered reports
Journals that already accept registered reports in ecology (dataset 1 from ref. 16) can be approached to share their experience. SORTEE has a journal liaison officer who can answer any questions editors or others might have. We checked (on 15 July 2023) the websites of 24 journals that offer registered reports for ecological research and found that only a few explicitly state what type of contribution they accept in this format (systematic review, empirical work and so on). Most journals have non-explicit statements that might be interpreted as the journals accepting experimental work only (dataset 1 in ref. 16). We therefore call on journals to be more explicit about the type of research they accept as registered reports.
Registration requirements and templates
Registration requirements and templates that would be implemented by the registries should be developed and promoted. The research community should discuss whether the existing solutions (for example, those available on OSF) meet the needs of ecological research, or whether new solutions are needed. Some examples of the minimum information for a registered study are the World Health Organization’s Trial Registration Data Set for clinical trials (https://www.who.int/clinical-trials-registry-platform/network/who-data-set) and Preregistration Standards for Quantitative Research in Psychology (https://prereg-psych.org/index.php/rrp/templates), created by the joint efforts of the multi-society Preregistration Task Force (https://leibniz-psychology.org/en/news/detail/internationale-zusammenarbeit-prae-registrierungsvorlage-fuer-die-quantitative-forschung-in-der-psych-1). Furthermore, we propose that the community should discuss whether such templates should include data and software management sections, as data and software are integral parts of research and are important research output. Journals can raise awareness of registration templates through editorials and special issues that would promote these templates (for example, as seen in the journal Evidence-Based Toxicology72).
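To make the idea of a minimum-information template concrete, the sketch below shows one possible machine-checkable representation. The field names are our own illustrative assumptions (not an existing ecology standard) and include the data and software management sections proposed above.

```python
# Illustrative sketch only: what a machine-checkable minimum-information template for an
# ecological preregistration might contain. Field names are assumptions, not a standard.
REQUIRED_FIELDS = [
    "research_question",
    "hypotheses",
    "study_design",              # e.g., observational, experimental, meta-analysis
    "sampling_plan",             # units, sample size justification, power analysis or alternatives
    "variables",                 # response variables, predictors, covariates
    "analysis_plan",             # statistical models, inference criteria, handling of missing data
    "data_management_plan",      # storage, licensing, intended repository
    "software_management_plan",  # code availability, versioning, intended repository
]

def missing_fields(preregistration: dict) -> list:
    """Return the template fields that are absent or left empty in a draft preregistration."""
    return [field for field in REQUIRED_FIELDS if not preregistration.get(field)]

# Example: a registry could run such a check on submission and prompt authors to complete
# any empty sections before the protocol is time-stamped.
draft = {
    "research_question": "Does mowing frequency affect grassland plant species richness?",
    "hypotheses": "Species richness is higher under reduced mowing frequency.",
}
print(missing_fields(draft))
```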
Creating a support system for implementing registration
Dedicated teams of experts could support researchers in registering their studies and could check or review preregistered studies for any design issues. This would improve the quality of the study design before the study is conducted, potentially eliminating almost 70% of research waste attributed to study-design issues in ecology1. Such teams could be established at the level of a funder (all funded work is quality-checked), a research institution (all research from an institution is checked), a country (for example, institutes that promote rigour and quality of research) or a research discipline (international or national). Input on study-design quality should be provided quickly to avoid postponing the start of the study. The support system should also include education and training of students, researchers and support staff in registration practices.
Funders can further support registered reports by providing funding either for the publication of registered reports or for research projects that aim to be published as registered reports. For example, grant programmes of Cancer Research UK and the Templeton World Charity Foundation support research that will be published as a registered report (https://www.cancerresearchuk.org/funding-for-researchers/how-we-deliver-research/positive-research-culture/registered-reports and https://www.templetonworldcharity.org/projects-database/0593).
Incentives and mandates can be introduced through changes in policies, as many have done in medicine. For example, the International Committee of Medical Journal Editors introduced a trial registration policy, which then led to the implementation of laws and policies in the USA and internationally that expanded mandatory prospective trial registration73,74. A system for checking whether policies are followed should also be established. For example, among 14 medical research funders in Europe that require prospective trial registration, only some monitor whether trials are indeed registered (9 funders) or whether results are made public (8 funders)75. Text mining and other AI-driven solutions could be used for such monitoring, as sketched below. For instance, PLoS and DataSeer have developed such a tool to monitor Open Science Indicators in PLoS journals76.
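As a purely hypothetical illustration of such monitoring (not the PLoS–DataSeer tool), the sketch below scans article text for registration identifiers and flags articles in which none is found; the identifier patterns, function name and example text are our assumptions.

```python
# Hypothetical sketch of simple text mining for policy monitoring: scan the full text of
# funded or published articles for study registration identifiers and flag articles where
# none is found. The patterns are examples (ClinicalTrials.gov NCT numbers and OSF-style
# short codes, the latter an assumed format); a real monitor would use the patterns of
# whichever registries the community adopts.
import re

REGISTRATION_PATTERNS = [
    re.compile(r"\bNCT\d{8}\b"),                # ClinicalTrials.gov identifiers
    re.compile(r"\bosf\.io/[A-Za-z0-9]{5}\b"),  # OSF registration short links (assumed format)
]

def find_registration_ids(article_text):
    """Return all registration identifiers detected in an article's text."""
    hits = []
    for pattern in REGISTRATION_PATTERNS:
        hits.extend(pattern.findall(article_text))
    return hits

# Example: a funder could run this over all articles arising from its grants and follow up
# manually on any article that returns an empty list.
sample = "The design and analysis plan were preregistered (https://osf.io/ab3de)."
print(find_registration_ids(sample))  # -> ['osf.io/ab3de']
```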
Some of the existing approaches to incentivizing registration involve awarding preregistration badges (Box 1) or giving greater weight to registered reports and preregistered studies than to non-preregistered work when making decisions about promotion, grant funding and so on. However, these incentives should be considered within a broader set of measures, and their impact and appropriateness should be further discussed, as argued elsewhere77.
Meta-research projects on preregistration and registered reports in ecology should be funded. These projects could, for example, systematically evaluate the reasons behind low registration rates in ecology, or the effectiveness of policies, mandates and incentives in increasing the quantity and quality of registration. They could also study the effectiveness of registration in decreasing research waste and increasing the robustness of ecological studies.
In the short term, we call for (1) forums and consortia to discuss the proposed goals; (2) journal editorials and series on topics of registration, including evaluations of best practices; and (3) funding for projects to improve registration and evaluate its effects. The discussion could be nucleated by existing societies with a shared agenda (for example, SORTEE and AIMOS) through dedicated actions such as workshops, round tables or seminars. Publishers, funders and research institutions should work together with the research community in defining and implementing best practices, including metrics to track progress. The effectiveness of these practices can then be quantified via meta-research (for example, what policies work, how well they work and whether registration improves study quality). In Box 2, we provide an example of the collaborative development of two reporting checklists (CONSORT and SPIRIT) that “improved clinical trial design, conduct and reporting”78.
Conclusion
Our Perspective aims to lay fertile ground for an open dialogue about the role of registration in ecology and the best ways to support it. We hope that the realization that more than 80% of ecological research is wasted provides a strong incentive for funders, publishers and research institutions to start and continue supporting registration in ecology. Although our Perspective focuses on registration, it is important that funders, publishers and research institutions also support researchers in adopting other open-science practices and principles, as these are likewise essential to increasing the quality of research. A useful overview of these can be found in Davidson et al.79. We note that many of the components of registration discussed here, such as reporting templates, conducting a power analysis (or alternatives), review of statistical approaches, and data and software management, can (and should) be implemented independently of preregistration. In summary, by working together, we can make ecological research more robust and impactful.
Data availability
The datasets referenced in this Perspective are available in the Zenodo repository (https://doi.org/10.5281/zenodo.10955469)16.
References
Purgar, M., Klanjscek, T. & Culina, A. Quantifying research waste in ecology. Nat. Ecol. Evol. 6, 1390–1397 (2022).
Transforming Our World: The 2030 Agenda for Sustainable Development (United Nations, 2015).
Chalmers, I. & Glasziou, P. Avoidable waste in the production and reporting of research evidence. Lancet 374, 86–89 (2009).
O’Dea, R. E. et al. Preferred reporting items for systematic reviews and meta‐analyses in ecology and evolutionary biology: a PRISMA extension. Biol. Rev. 96, 1695–1722 (2021).
Haddaway, N. R., Macura, B., Whaley, P. & Pullin, A. S. ROSES RepOrting standards for Systematic Evidence Syntheses: pro forma, flow-diagram and descriptive summary of the plan and conduct of environmental systematic reviews and systematic maps. Environ. Evid. 7, 7 (2018).
Berberi, I. & Roche, D. G. No evidence that mandatory open data policies increase error correction. Nat. Ecol. Evol. 6, 1630–1633 (2022).
Culina, A., van den Berg, I., Evans, S. & Sánchez-Tójar, A. Low availability of code in ecology: a call for urgent action. PLoS Biol. 18, e3000763 (2020).
Nosek, B. A., Ebersole, C. R., DeHaven, A. C. & Mellor, D. T. The preregistration revolution. Proc. Natl Acad. Sci. USA 115, 2600–2606 (2018).
Rice, D. B. & Moher, D. Curtailing the use of preregistration: a misused term. Perspect. Psychol. Sci. 14, 1105–1108 (2019).
Hardwicke, T. E. & Wagenmakers, E.-J. Reducing bias, increasing transparency and calibrating confidence with preregistration. Nat. Hum. Behav. 7, 15–26 (2023).
Chambers, C. D. & Tzavella, L. The past, present and future of registered reports. Nat. Hum. Behav. 6, 29–42 (2022).
Henderson, E. L. & Chambers, C. D. Ten simple rules for writing a Registered Report. PLoS Comput. Biol. 18, e1010571 (2022).
Zarin, D. A., Tse, T., Williams, R. J., Califf, R. M. & Ide, N. C. The ClinicalTrials.gov results database—update and key issues. N. Engl. J. Med. 364, 852–860 (2011).
OSF https://www.cos.io/products/osf (Center for Open Science, accessed 30 June 2023).
Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D. & Etchells, P. Instead of ‘playing the game’ it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neurosci. 1, 4–17 (2014).
Purgar, M. et al. Supporting study registration to reduce research waste. Zenodo https://doi.org/10.5281/zenodo.10955469 (2024).
Pinto, R. Z. et al. Many randomized trials of physical therapy interventions are not adequately registered: a survey of 200 published trials. Phys. Ther. 93, 299–309 (2013).
Won, J., Kim, S., Bae, I. & Lee, H. Trial registration as a safeguard against outcome reporting bias and spin? A case study of randomized controlled trials of acupuncture. PLoS ONE 14, e0223305 (2019).
Farquhar, C. M. et al. Clinical trial registration was not an indicator for low risk of bias. J. Clin. Epidemiol. 84, 47–53 (2017).
Tan, A. C. et al. Prevalence of trial registration varies by study characteristics and risk of bias. J. Clin. Epidemiol. 113, 64–74 (2019).
Papageorgiou, S. N., Xavier, G. M., Cobourne, M. T. & Eliades, T. Registered trials report less beneficial treatment effects than unregistered ones: a meta-epidemiological study in orthodontics. J. Clin. Epidemiol. 100, 44–52 (2018).
Riemer, M. et al. Trial registration and selective outcome reporting in 585 clinical trials investigating drugs for prevention of postoperative nausea and vomiting. BMC Anesthesiol. 21, 249 (2021).
Hamm, M. P. et al. A descriptive analysis of a representative sample of pediatric randomized controlled trials published in 2007. BMC Pediatr. 10, 96 (2010).
Dechartres, A., Ravaud, P., Atal, I., Riveros, C. & Boutron, I. Association between trial registration and treatment effect estimates: a meta-epidemiological study. BMC Med. 14, 100 (2016).
Shaw, R., Ni, M., Pillar, M. & Tejani, A. M. Are antidepressant and antipsychotic drug trials registered? A cross-sectional analysis of registration and reporting of methodologic characteristics. Account. Res. 25, 301–309 (2018).
Odutayo, A. et al. Association between trial registration and positive study findings: cross sectional study (Epidemiological Study of Randomized Trials—ESORT). Br. Med. J. 356, j917 (2017).
Emdin, C. et al. Association of cardiovascular trial registration with positive study findings: Epidemiological Study of Randomized Trials (ESORT). JAMA Intern. Med. 175, 304–307 (2015).
Trinquart, L., Dunn, A. G. & Bourgeois, F. T. Registration of published randomized trials: a systematic review and meta-analysis. BMC Med. 16, 173 (2018).
Gartlehner, G. et al. Assessing the magnitude of reporting bias in trials of homeopathy: a cross-sectional study and meta-analysis. BMJ Evid. Based Med. 27, 345–351 (2022).
Kvarven, A., Strømland, E. & Johannesson, M. Comparing meta-analyses and preregistered multiple-laboratory replication projects. Nat. Hum. Behav. 4, 423–434 (2020).
Schäfer, T. & Schwarz, M. A. The meaningfulness of effect sizes in psychological research: differences between sub-disciplines and the impact of potential biases. Front. Psychol. 10, 813 (2019).
Rasmussen, N., Lee, K. & Bero, L. Association of trial registration with the results and conclusions of published trials of new oncology drugs. Trials 10, 116 (2009).
Gopal, A. D. et al. Adherence to the International Committee of Medical Journal Editors’ (ICMJE) prospective registration policy and implications for outcome integrity: a cross-sectional analysis of trials published in high-impact specialty society journals. Trials 19, 448 (2018).
Kaplan, R. M. & Irvin, V. L. Likelihood of null effects of large NHLBI clinical trials has increased over time. PLoS ONE 10, e0132382 (2015).
Seehra, J., Khraishi, H. & Pandis, N. Studies with statistically significant effect estimates are more frequently published compared to non-significant estimates in oral health journals. BMC Med. Res. Methodol. 23, 6 (2023).
Tharyan, P., George, A. T., Kirubakaran, R. & Barnabas, J. P. Reporting of methods was better in the Clinical Trials Registry-India than in Indian journal publications. J. Clin. Epidemiol. 66, 10–22 (2013).
Ye, Q. M. et al. Quality assessment and its influencing factors of lung cancer clinical research registration: a cross-sectional analysis. J. Thorac. Dis. 14, 3471–3487 (2022).
Brohmer, H., Eckerstorfer, L. V., van Aert, R. C. M. & Corcoran, K. Do behavioral observations make people catch the goal? A meta-analysis on goal contagion. Int. Rev. Soc. Psychol. 34, 3 (2021).
Scheel, A. M., Schijen, M. R. M. J. & Lakens, D. An excess of positive results: comparing the standard psychology literature with registered reports. Adv. Methods Pract. Psychol. Sci. 4, 1–12 (2021).
Wiseman, R., Watt, C. & Kornbrot, D. Registered reports: an early example and analysis. PeerJ 7, e6232 (2019).
Allen, C. & Mehler, D. M. A. Open science challenges, benefits and tips in early career and beyond. PLoS Biol. 17, e3000246 (2019).
Soderberg, C. K. et al. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat. Hum. Behav. 5, 990–997 (2021).
Maher, C. G., Sherrington, C., Herbert, R. D., Moseley, A. M. & Elkins, M. Reliability of the PEDro scale for rating quality of randomized controlled trials. Phys. Ther. 83, 713–721 (2003).
Koricheva, J. Non-significant results in ecology: a burden or a blessing in disguise? Oikos 102, 397–401 (2003).
Brlík, V. et al. Weak effects of geolocators on small birds: a meta‐analysis controlled for phylogeny and publication bias. J. Anim. Ecol. 89, 207–220 (2020).
Why Should I Register and Submit Results? ClinicalTrials.gov and History, Policies, and Laws https://classic.clinicaltrials.gov/ct2/manage-recs/background (ClinicalTrials.gov, accessed 5 July 2023).
Williams, R. J., Tse, T., DiPiazza, K. & Zarin, D. A. Terminated trials in the ClinicalTrials.gov results database: evaluation of availability of primary outcome data and reasons for termination. PLoS ONE 10, e0127242 (2015).
Grégory, J., Créquit, P., Vilgrain, V., Boutron, I. & Ronot, M. Published trials of TACE for HCC are often not registered and subject to outcome reporting bias. JHEP Rep. 3, 100196 (2020).
Killeen, S., Sourallous, P., Hunter, I. A., Hartley, J. E. & Grady, H. L. Registration rates, adequacy of registration, and a comparison of registered and published primary outcomes in randomized controlled trials published in surgery journals. Ann. Surg. 259, 193–196 (2014).
Roest, A. M. et al. Reporting bias in clinical trials investigating the efficacy of second-generation antidepressants in the treatment of anxiety disorders: a report of 2 meta-analyses. JAMA Psychiatry 72, 500–510 (2015).
Turner, E. H., Knoepflmacher, D. & Shapley, L. Publication bias in antipsychotic trials: an analysis of efficacy comparing the published literature to the US Food and Drug Administration database. PLoS Med. 9, e1001189 (2012).
Su, C. X. et al. Empirical evidence for outcome reporting bias in randomized clinical trials of acupuncture: comparison of registered records and subsequent publications. Trials 16, 28 (2015).
Dwan, K., Gamble, C., Williamson, P. R., Kirkham, J. J. & the Reporting Bias Group. Systematic review of the empirical evidence of study publication bias and outcome reporting bias—an updated review. PLoS ONE 8, e66844 (2013).
Wieschowski, S., Silva, D. S. & Strech, D. Animal study registries: results from a stakeholder analysis on potential strengths, weaknesses, facilitators, and barriers. PLoS Biol. 14, e2000391 (2016).
Manago, B. Preregistration and registered reports in sociology: strengths, weaknesses, and other considerations. Am. Sociol. 54, 193–210 (2023).
Costa, E., Inbar, Y. & Tannenbaum, D. Do registered reports make scientific findings more believable to the public? Collabra Psychol. 8, 32607 (2022).
Spitzer, L. & Mueller, S. Registered report: survey on attitudes and experiences regarding preregistration in psychological research. PLoS ONE 18, e0281086 (2023).
Sarafoglou, A., Kovacs, M., Bakos, B., Wagenmakers, E.-J. & Aczel, B. A survey on how preregistration affects the research workflow: better science but more work. R. Soc. Open Sci. 9, 211997 (2022).
Simes, R. J. Publication bias: the case for an international registry of clinical trials. J. Clin. Oncol. 4, 1529–1541 (1986).
Mathieu, S. Comparison of registered and published primary outcomes in randomized controlled trials. JAMA 302, 977–984 (2009).
TARG Meta-Research Group & Collaborators. Estimating the prevalence of discrepancies between study registrations and publications: a systematic review and meta-analyses. BMJ Open 13, e076264 (2023).
Chan, A.-W. et al. Increasing value and reducing waste: addressing inaccessible research. Lancet 383, 257–266 (2014).
Showell, M. et al. Publication bias in trials registered in the Australian New Zealand Clinical Trials Registry: is it a problem? A cross-sectional study. PLoS ONE 18, e0279926 (2023).
Scott, A., Rucklidge, J. J. & Mulder, R. T. Is mandatory prospective trial registration working to prevent publication of unregistered trials and selective outcome reporting? An observational study of five psychiatry journals that mandate prospective clinical trial registration. PLoS ONE 10, e0133718 (2015).
Rosati, P. et al. Major discrepancies between what clinical trial registries record and paediatric randomised controlled trials publish. Trials 17, 430 (2016).
Riveros, C. et al. Timing and completeness of trial results posted at ClinicalTrials.gov and published in journals. PLoS Med. 10, e1001566 (2013).
Karimian, Z., Mavoungou, S., Salem, J.-E., Tubach, F. & Dechartres, A. The quality of reporting general safety parameters and immune-related adverse events in clinical trials of FDA-approved immune checkpoint inhibitors. BMC Cancer 20, 1128 (2020).
Liebeskind, D. S., Kidwell, C. S., Sayre, J. W. & Saver, J. L. Evidence of publication bias in reporting acute stroke clinical trials. Neurology 67, 973–979 (2006).
Glasziou, P. & Chalmers, I. Is 85% of health research really ‘wasted’? BMJ Opinion, https://blogs.bmj.com/bmj/2016/01/14/paul-glasziou-and-iain-chalmers-is-85-of-health-research-really-wasted/ (14 January 2016).
Kimmelman, J. & Anderson, J. A. Should preclinical studies be registered? Nat. Biotechnol. 30, 488–489 (2012).
Nakagawa, S., Lagisz, M., Yang, Y. & Drobniak, S. M. Finding the right power balance: better study design and collaboration can reduce dependence on statistical power. PLoS Biol. 22, e3002423 (2024).
Mellor, D., Corker, K. S. & Whaley, P. Preregistration templates as a new addition to the evidence-based toxicology toolbox. Evid. Based Toxicol. 2, 2314303 (2024).
Zarin, D. A., Tse, T., Williams, R. J. & Rajakannan, T. Update on trial registration 11 years after the ICMJE policy was established. N. Engl. J. Med. 376, 383–391 (2017).
Viergever, R. F. & Li, K. Trends in global clinical trial registration: an analysis of numbers of registered clinical trials in different parts of the world from 2004 to 2013. BMJ Open 5, e008932 (2015).
Bruckner, T., Rodgers, F., Styrmisdóttir, L. & Keestra, S. Adoption of World Health Organization best practices in clinical trial transparency among European medical research funder policies. JAMA Netw. Open 5, e2222378 (2022).
PLOS partners with DataSeer to develop Open Science Indicators. Official PLOS Blog https://theplosblog.plos.org/2022/09/plos-partners-with-dataseer-to-develop-open-science-indicators/ (2022).
De Cheveigné, A. Preregistration: the good, the bad, and the confusing. HAL https://hal.science/hal-04063123/ (2023).
Hopewell, S. et al. An update to SPIRIT and CONSORT reporting guidelines to enhance transparency in randomized trials. Nat. Med. 28, 1740–1743 (2022).
Davidson, A. R. et al. Taxonomy of interventions at academic institutions to improve research quality. Preprint at bioRxiv https://doi.org/10.1101/2022.12.08.519666 (2022).
Chalmers, I. et al. How to increase value and reduce waste when research priorities are set. Lancet 383, 156–165 (2014).
Ioannidis, J. P. A. et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet 383, 166–175 (2014).
Salman, R. A.-S. et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet 383, 176–185 (2014).
Glasziou, P. et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet 383, 267–276 (2014).
Grainger, M. J. et al. Evidence synthesis for tackling research waste. Nat. Ecol. Evol. 4, 495–497 (2020).
Tenopir, C. et al. Data sharing by scientists: practices and perceptions. PLoS ONE 6, e21101 (2011).
Walters, W. P. Code sharing in the open science era. J. Chem. Inf. Model. 60, 4417–4420 (2020).
Barnett-Page, E. & Thomas, J. Methods for the synthesis of qualitative research: a critical review. BMC Med. Res. Methodol. 9, 59 (2009).
Paterson, B. L., Thorne, S. E., Canam, C., & Jillings, C. Meta-study of Qualitative Health Research: A Practical Guide to Meta-analysis and Meta-synthesis (Sage, 2001).
Data Management General Guidance https://dmptool.org/general_guidance (DMP Tool, accessed 18 January 2024).
Badges to Acknowledge Open Practices https://osf.io/tvyxz/wiki/1.%20View%20the%20Badges/ (OSF, accessed 18 January 2024).
Fraser, H., Parker, T., Nakagawa, S., Barnett, A. & Fidler, F. Questionable research practices in ecology and evolution. PLoS ONE 13, e0200303 (2018).
Martinez-Ortiz, C. et al. Practical guide to Software Management Plans (1.0). Zenodo https://doi.org/10.5281/zenodo.7248877 (2022).
Schulz, K. F., Altman, D. G., Moher, D. & the CONSORT Group. CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMC Med. 8, 18 (2010).
Chan, A.-W. et al. SPIRIT 2013 Statement: defining standard protocol items for clinical trials. Ann. Intern. Med. 158, 200–207 (2013).
Moher, D., Schulz, K. F. & Altman, D. G. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet 357, 1191–1194 (2001).
Thibault, R. T., Pennington, C. R. & Munafò, M. R. Reflections on preregistration: core criteria, badges, complementary workflows. J. Trial Error https://doi.org/10.36850/mr6 (2023).
Aczel, B. et al. A consensus-based transparency checklist. Nat. Hum. Behav. 4, 4–6 (2020).
Acknowledgements
We thank F. Fidler and R. Thibault for feedback and discussion that improved our manuscript. This research was funded by the Croatian Science Foundation (HRZZ) project no. DOK-2021-02-6688 to T.K. for M.P. and by the Croatian Science Foundation (HRZZ) project EcoOpen no. IP-2022-10-2872 to A.C.
Author information
Contributions
Conceptualization: all authors. Data curation: M.P. and A.C. Formal analysis: M.P. and A.C. Funding acquisition: A.C. and T.K. Investigation: M.P. and A.C. Methodology: A.C. Supervision: A.C. Validation: A.C., T.K., S.N. and P.G. Visualization: M.P. and A.C. Writing—original draft: A.C. Writing—review and editing: all authors.
Ethics declarations
Competing interests
A.C., S.N. and M.P. are officers at the Society for Open, Reliable, and Transparent Ecology and Evolutionary biology (SORTEE). T.K. and P.G. declare no competing interests.
Peer review information
Nature Ecology & Evolution thanks Geoff Frampton and Alexandra Sarafoglou for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Fig. 1, Tables 1 and 2, and sections ‘Journals offering registered reports’ and ‘Exploratory survey’.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Purgar, M., Glasziou, P., Klanjscek, T. et al. Supporting study registration to reduce research waste. Nat Ecol Evol (2024). https://doi.org/10.1038/s41559-024-02433-5