
Are European clinical trial funders' policies on clinical trial registration and reporting improving? A cross-sectional study

Published online by Cambridge University Press:  14 July 2023

Marguerite O’Riordan*
Affiliation:
TranspariMED, Bristol, UK; College of Health and Life Sciences, Aston Medical School, Aston University, Birmingham, UK
Martin Haslberger
Affiliation:
Berlin Institute of Health, Berlin, Germany
Carolina Cruz
Affiliation:
University of Guadalajara, Guadalajara, Mexico
Tarik Suljic
Affiliation:
Faculty of Medicine, University of Sarajevo, Sarajevo, Bosnia and Hercegovina
Martin Ringsten
Affiliation:
Lund University, Lund, Sweden
Till Bruckner
Affiliation:
TranspariMED, Bristol, UK
*Corresponding author: M. O’Riordan; Email: 190011204@aston.ac.uk

Abstract

Objectives:

Assess the extent to which the clinical trial registration and reporting policies of 25 of the world’s largest public and philanthropic medical research funders meet best practice benchmarks as stipulated by the 2017 WHO Joint Statement, and document changes in the policies and monitoring systems of 19 European funders over the past year.

Design, Setting, Participants:

Cross-sectional study, based on assessments of each funder’s publicly available documentation plus validation of results by funders. Our cohort includes 25 of the largest medical research funders in Europe, Oceania, South Asia, and Canada.

Interventions:

Scoring all 25 funders using an 11-item assessment tool based on WHO best practice benchmarks, grouped into three primary categories: trial registries, academic publication, and monitoring, plus validation of results by funders.

Main outcome measures:

How many of the 11 WHO best practice items each of the 25 funders has put into place, and changes in the performance of 19 previously assessed funders over the preceding year.

Results:

The 25 funders we assessed had put into place an average of 5.4/11 (49%) WHO best practices. Only 6/25 funders (24%) took the PI’s past reporting record into account during grant application reviews. Funders’ performance varied widely, from 0/11 to 11/11 WHO best practices adopted. Of the 19 funders for which 2021 baseline data were available, 10/19 (53%) had strengthened their policies over the preceding year.

Conclusions:

Most medical research funders need to do more to curb research waste and publication bias by strengthening their clinical trial policies.

Type: Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Key Points

  • What is already known about this topic

Strong clinical trial registration and reporting policies coupled with monitoring and sanctions can reduce research waste, curb publication bias, and promote transparency. A 2021 assessment found that 19 European medical research funders’ policies fell short of WHO best practices.

  • What this study adds

This is the first study to assess the clinical trial registration and reporting policies of a global cohort of 25 major medical research funders against WHO best practices, identifying gaps in the research waste safeguards of key players across Europe, Oceania, South Asia, and Canada. In addition, the study assesses the progress made by 19 funders in the recent past.

  • How this study might affect research, practice, or policy

This study enables funders worldwide to identify and address gaps in their clinical trial transparency policies by pinpointing exactly where they currently fall short of WHO best practices. It also enables policymakers and citizens to assess whether public bodies tasked with furthering medical knowledge have adopted adequate safeguards against research waste and publication bias.

Introduction

Research waste and publication bias in clinical trials are widespread [1–3]. An estimated 85% of health research is wasted, with half of all waste due to non-reporting of results alone [4]. Calls to address the problem have a long history [5]. Clinical trials can only inform clinical practice and public health decision-making if and when their results have been made public [6]. However, numerous studies have consistently documented that the results of a significant proportion of clinical trials are never made public [6]. Previous research consistently shows that noncommercial trials have lower publication rates than trials run by industry [7]. Furthermore, trials with “positive” outcomes are more likely to be published, introducing systematic bias into the medical literature [8]. Incomplete reporting of clinical trials wastes taxpayer money and leaves gaps in the scientific record [9]. Current legal and regulatory frameworks provide insufficient safeguards [10,11].

Since 2013, the World Medical Association’s Declaration of Helsinki has required all clinical trials to be registered and their results to be made public [6]. Public and philanthropic bodies funding clinical trials are uniquely positioned to promote transparency, reduce research waste, and curb publication bias by adopting policies requiring trialists to preregister trials and rapidly make their results public, and by monitoring compliance with these rules. The 2017 World Health Organization (WHO) Joint Statement on public disclosure of results from clinical trials (hereafter “WHO Joint Statement”) lists 11 specific policy, monitoring, and compliance elements that funders should adopt [12].

To date, 15 funders and research bodies have formally signed up to the WHO Joint Statement and thereby committed themselves to adopting all 11 elements. Signatories pledged to require grantees to preregister trials on a WHO-linked trial registry, to make trial results public on the same registry within one year of trial completion, to publicly monitor grantees’ compliance with these policies, and to impose sanctions for noncompliance. In May 2022, a World Health Assembly resolution called on funders worldwide to mandate trial registration and reporting in line with WHO Joint Statement requirements [6].

Multiple previous studies have assessed funders’ clinical trial policies [9,13,14]. An assessment of 21 European funders conducted in 2021 used 11 items contained in the WHO Joint Statement as its benchmark [13]. It found that funders had only adopted a mean of 4/11 (36%) of WHO best practices in clinical trial transparency. There was a wide variation in performance amongst funders, and some best practice items had been more widely adopted than others. The authors included a template policy document to facilitate the adoption of WHO best practices [13].

We build on this previous work by assessing a broader cohort of 25 funders worldwide using the same methodology, including 19 of the European funders that had been assessed one year earlier.

Methods

Study design and reporting were performed in accordance with the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline for cross-sectional studies [15].

Our starting point was a cohort of 21 of the largest philanthropic and unilateral public medical research funders in Europe covered by a previous assessment [13]; by “unilateral” we mean a single entity that funds a project on its own, without contributions from other sources. We removed two funders from that earlier cohort. Bundesministerium für Gesundheit (the German Federal Ministry of Health) was removed because, even though it had funded at least one COVID-19 trial in the early stages of the pandemic, it does not routinely fund clinical trials directly. Centre National De La Recherche Scientifique (France) was removed because it does not fund any clinical trials.

We then expanded the cohort of 19 European funders by six additional funders to achieve global coverage [16]. We included funders regardless of whether they fund extramural research, intramural research, or both. We added two multilateral funders located in Europe that had been excluded from the previous assessment (Horizon Europe and the European and Developing Countries Clinical Trials Partnership); the two major funders in Oceania (the National Health and Medical Research Council of Australia and the Health Research Council of New Zealand); and the largest funder in South Asia (the Indian Council of Medical Research). We did not include funders in the United States because these were concurrently assessed by a different study team [18]. To complete coverage of major North American funders, we added Canada’s public funder, the Canadian Institutes of Health Research.

We used the same assessment tool and assessment criteria (with minor simplifications) that were used during the 2021 assessment [13]. Funders were scored using an 11-item assessment tool based on WHO Joint Statement benchmarks [12]. The 11 items fall into three broad categories: trial registries (prospective trial registration, registry records kept up-to-date, results onto registry within 12 months, protocol onto registry within 12 months); academic publication (results published in journals, trial ID included in publications, open-access publication); and monitoring and sanctions (investigator’s past reporting record taken into account, trial registration monitored, results reporting monitored, monitoring reports made public). We scored on a YES/NO basis, only awarding points to policy items that fully met the assessment criteria. Policy items that failed to cover all trials, and nonbinding “supportive” policy items, were scored as “NO” and earned no points. Possible scores across all items ranged from 0 to 11 points. The protocol, assessment tool, rater guide, adjudication tracker, individual and consolidated score sheets, aggregated data sets, archived funder policies, and correspondence with funders are available on GitHub [17].
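
For readers who wish to reproduce the scoring logic computationally, the sketch below illustrates how the 11-item YES/NO rubric translates into a 0–11 score. The item labels, dictionary format, and function name are illustrative assumptions, not the project’s actual data structures; the real scoring sheets and protocol are archived on GitHub [17].

```python
# Illustrative sketch of the 11-item YES/NO scoring described above.
# Input format and item labels are hypothetical.

ITEMS = {
    "trial registries": [
        "prospective registration",
        "registry records kept up-to-date",
        "results onto registry within 12 months",
        "protocol onto registry within 12 months",
    ],
    "academic publication": [
        "results published in journals",
        "trial ID included in publications",
        "open-access publication",
    ],
    "monitoring and sanctions": [
        "past reporting record taken into account",
        "trial registration monitored",
        "results reporting monitored",
        "monitoring reports made public",
    ],
}

def score_funder(assessment: dict[str, str]) -> dict:
    """Return per-category and total scores for one funder.

    Only items rated "YES" earn a point; nonbinding ("NO*") and
    non-comprehensive ("NO#") policies score zero, per the protocol.
    """
    per_category = {
        category: sum(assessment.get(item, "NO") == "YES" for item in items)
        for category, items in ITEMS.items()
    }
    return {"per_category": per_category, "total": sum(per_category.values()), "max": 11}

# Example: a funder mandating registration and open access only.
example = {"prospective registration": "YES",
           "open-access publication": "YES",
           "results published in journals": "NO*"}
print(score_funder(example))  # total = 2 out of 11
```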

We searched the websites of all 25 funders during August and September 2022 and filled out a scoring sheet for each funder, capturing relevant policy items. The six funders being assessed for the first time were independently assessed by two team members (CC and MR). The remaining 19 funders were assessed by the lead researcher (MOR).

The lead researcher (MOR) then contacted all funders with a copy of the assessment criteria and their scoresheet in November 2022. For funders that had responded to requests to validate the 2021 assessment results, we used the email address of the person who had sent the response in 2021. For all other funders, we used the press department’s email address. Follow-up reminders were sent one and two weeks after the initial email, copying in the press office where applicable. Funders being assessed for the first time that did not respond to our outreach were re-assessed independently a third time by additional team members (MH and TS) to ensure that no salient policy elements were overlooked.

The lead researcher (MOR) then compared and merged all assessments into a single consolidated assessment sheet for each funder. As a final quality check, we reviewed the previous year’s assessments of European funders to ensure we had captured all items.

At each stage, a team member not involved in conducting assessments (TB) reviewed all items flagged as uncertain. To ensure comparability of findings across cohorts and time, he determined final scores based on precedents set during two separate studies using the same methodology [13,18]. These decisions were documented in an “adjudication tracker” and have been archived on GitHub [17].
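
As a minimal sketch of the consolidation step, the code below merges two independent ratings item by item and flags disagreements for adjudication rather than resolving them automatically, mirroring the workflow described above. Function and field names are assumptions for illustration only; the actual adjudication tracker is on GitHub [17].

```python
# Hypothetical consolidation of two independent ratings for one funder.
# Disagreements are left unresolved and flagged for the adjudicator.

def consolidate(rater_a: dict[str, str], rater_b: dict[str, str]) -> tuple[dict, list]:
    consolidated, to_adjudicate = {}, []
    for item in set(rater_a) | set(rater_b):
        a, b = rater_a.get(item), rater_b.get(item)
        if a == b:
            consolidated[item] = a
        else:
            consolidated[item] = None           # left open until adjudicated
            to_adjudicate.append((item, a, b))  # logged in the adjudication tracker
    return consolidated, to_adjudicate

merged, flags = consolidate(
    {"prospective registration": "YES", "results reporting monitored": "NO"},
    {"prospective registration": "YES", "results reporting monitored": "NO*"},
)
print(flags)  # [('results reporting monitored', 'NO', 'NO*')] -> sent for adjudication
```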

Results

Fig. 1 shows the comparative performance of all 25 funders. Blue bars represent the number of WHO best practice policies adopted by funders in 2022, while green bars represent the 2021 scores of the 19 European funders assessed previously. Funders are ordered by the number of WHO best practices adopted in 2022, with the strongest performers at the top. On average, funders had adopted 5.4/11 WHO best practices in clinical trial transparency (49%).

Figure 1. Number of WHO best practices adopted per funder (maximum = 11).

Funders’ performance varied widely. The UK National Institute for Health Research was the only funder that had adopted all 11 policies (100%), followed closely by the Wellcome Trust with 10/11 policies (91%). In contrast, Italy’s Ministry of Health and Spain’s Instituto de Salud Carlos III (ISCIII) both failed to score any points.

Out of the previously assessed European funders, more than half (10/19, 53%) had strengthened their policies over the preceding year, in many cases substantially. The Swedish Research Council and France’s Inserm made the largest gains. The average number of policies adopted by this cohort rose from 4/11 items (36%) to 5.5/11 items (50%) during 2021–2022.
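
The year-over-year comparison reported here amounts to a simple paired calculation per funder; the sketch below shows the arithmetic with made-up placeholder scores rather than the study’s data, which are available in the GitHub repository [17].

```python
# Illustrative year-over-year comparison; funder names and scores are placeholders.
scores_2021 = {"Funder A": 4, "Funder B": 7, "Funder C": 2}
scores_2022 = {"Funder A": 8, "Funder B": 7, "Funder C": 5}

improved = [f for f in scores_2021 if scores_2022[f] > scores_2021[f]]
mean_2021 = sum(scores_2021.values()) / len(scores_2021)
mean_2022 = sum(scores_2022.values()) / len(scores_2022)

print(f"{len(improved)}/{len(scores_2021)} funders strengthened their policies")
print(f"mean adoption rose from {mean_2021:.1f}/11 to {mean_2022:.1f}/11")
```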

Fig. 2 shows which of the 11 policy items had been most widely adopted by the 25 funders in our cohort.

Figure 2. Number of funders adopting specific policy items (maximum = 25).

Open-access publication was the most widely adopted policy; 19/25 funders (76%) now require this. In contrast, less than a third of funders (8/25 funders, 32%) require their grantees to make their trial protocols publicly available on registries.

In total, 21/25 funders (84%) mandate prospective trial registration, meaning that four funders still do not, even though prospective registration is a global ethics requirement. Nearly half of funders (12/25, 48%) require trial results to be made public on registries within a year of trial completion, a key mechanism for speeding up the disclosure of research outcomes.

A majority of funders (14/25 funders, 56%) monitor whether grantees register trials and make results public. However, when deciding whether to award new grants, just over a third of funders (9/25 funders, 36%) take into account whether applicants have made trial results public in the past.

Fig. 3 shows which policy items the 19 previously assessed European funders added during 2021–2022.

Figure 3. Policy items added by 19 European funders during 2021–2022.

Funders adopted new policy items across the whole range of 11 WHO best practices. The only exception was open-access publication, for which the baseline was already very high; note that open-access policies tend to be set at the wider institutional level rather than specifically for clinical research.

For other policy items, growth in uptake among funders was uneven. The most frequently added new policy items were inclusion of clinical trial registry ID numbers in publications and requirements to keep registry records up-to-date; each of these items was adopted by six additional funders. Of note, three funders initiated compliance monitoring activities during 2021–2022.

Funders’ efforts to strengthen their policies appear to have sometimes been ad hoc rather than systematic. For example, 11 funders now require grantees to keep registry records up-to-date, a task that requires diligence from trialists throughout the life cycle of a trial. Yet the same number of funders (11) require grantees to include trial ID numbers in publications, a far simpler action that typically needs to be performed only once, after the end of a trial.

Fig. 4 provides a granular overview of the individual policy items adopted by each of the 25 funders. We include this figure to enable funders to identify and address remaining gaps in their policies.

Figure 4. Policy items adopted by each funder and remaining gaps.

The “YES” fields denote mandatory policy items that apply to all clinical trials. The “NO” fields denote the complete absence of a policy item.

In 15 instances, funders encouraged a practice but did not make it compulsory. Of note, nine funders encouraged results to be published in journals, but did not make this compulsory. In one instance, the scope of a policy item was limited to drug trials only. Nonbinding policies, where a funder encourages a practice but does not mandate it, are marked with “NO*” below. Non-comprehensive policies that apply to only some types of trials are marked with “NO#.”

An overview of nonbinding and non-comprehensive policies is provided in the Supplement.

Strengths and Limitations

This is the first global assessment of medical research funders’ clinical trial policies that is fully based on WHO best practice benchmarks. Independent ratings, review, and consolidation by a third researcher, transparent adjudication, and respondent validation strengthened data quality and reliability. The archiving of all project tools and documentation on GitHub enables independent replication, including with other cohorts. A visual aid (Fig. 4) enables funders to easily identify gaps in their policies, and a template policy document (Supplement) supports funders’ efforts to address remaining gaps.

Our study has two limitations. Of the 25 funders, 11 did not respond to our outreach despite repeated efforts and a deadline extension. As a result, relevant policy items for those funders may have been missed, especially if they were not publicly accessible online. For example, the Swiss National Science Foundation noted that although our assessment accurately reflected publicly available information, they had additional requirements that were not openly accessible.

The second limitation is that funder policies do not necessarily translate into improvements in actual practice, especially if funders do not actively monitor grantees’ compliance with their requirements [Reference Rees, Narang, Westbrook and Bourgeois19,20].

Our study provides a useful starting point for other researchers to assess to what degree and under what conditions funder policies influence clinical trial registration and reporting in practice.

Discussion

Our study shows that medical research funders vary widely in their adoption of WHO best practices. Even though 15/25 (60%) funders in our cohort have formally committed to adopting all 11 policy items by signing up to the WHO Joint Statement, only a single funder in the cohort has fully delivered on its promise so far. On the positive side, several funders have significantly strengthened their policies over the past year, and a third of funders have by now put into place nine or more of the 11 WHO policy items. While many funders still need to do more to curb research waste and publication bias, the trend is clearly positive. We plan to reassess all funders in the future to document further improvements.

We also found large variations in the frequency with which individual policy items had been adopted by funders. Two key findings were the adoption of open-access policies by a large majority of funders, and the failure of four funders to require all clinical trials to be registered. The latter is both surprising and deeply disappointing because trial registration is a long-standing global ethics requirement [6], a precondition for publication in a peer-reviewed journal, and a WHO best practice [21].

Our data indicate that existing research waste safeguards could often be strengthened at no cost to the funder itself and at negligible cost to grantees. In some cases, funders require grantees to perform time-intensive tasks without requiring related simple tasks to be concurrently performed. For example, several funders require summary results to be uploaded onto trial registries but do not require grantees to upload study protocols at the same time, which could be done within a few minutes. Other funders mandate journal publication but do not require grantees to copy and paste trial ID numbers into their scientific papers. Fig. 4 can help funders identify such potential easy wins.

Conclusion

The UK’s National Institute for Health Research has fully adopted all WHO best practices in clinical trial transparency. Several other funders have also put strong research waste safeguards in place. While many funders’ policies still fall significantly short of WHO best practices, average funder performance appears to be improving.

Each of the 11 WHO best practices has been adopted by at least eight funders in our cohort, demonstrating feasibility. As the WHO has noted, the resource allocation, public health, and scientific benefits of rapid and comprehensive outcome reporting far outweigh the modest implementation costs [12]. Funders’ experiences to date show that effective research waste safeguards can be put into place without antagonizing grantees or burdening them with excessive red tape [14,22–24].

We urge funders to further strengthen their policies, and to concurrently support and adequately compensate their grantees’ trial registry management and outcome reporting efforts. Recent experience shows that strong funder policies alone are insufficient to prevent research waste [15], so it is essential for funders to monitor grantees’ compliance. We encourage funders to use such monitoring data to identify and highlight both strong and weak performers.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/cts.2023.590.

Funding statement

This study did not receive any funding. The article processing fees were supported by Aston University. The content is solely the responsibility of the authors and does not necessarily represent the official views of Aston University.

Competing interests

The authors have no conflicts of interest to declare.

Ethical standard

An NHS Research Ethics Committee (REC) ethics waiver was obtained on 18 October 2022.

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

References

1. Bonita R, Adams S, Whellan D. Reporting of clinical trials: publication, authorship, and trial registration. Heart Fail Clin. 2011;7(4):561–567.
2. Zheutlin AR, Niforatos J, Stulberg E, Sussman J. Research waste in randomized clinical trials: a cross-sectional analysis. J Gen Intern Med. 2020;35(10):3105–3107.
3. Goldacre B, DeVito NJ, Heneghan C, et al. Compliance with requirement to report results on the EU Clinical Trials Register: cohort study and web resource. BMJ. 2018;362:k3218.
4. Glasziou P, Chalmers I. Is 85% of health research really “wasted”? BMJ. 2016.
5. World Health Organisation. Strengthening clinical trials to provide high-quality evidence on health interventions and to improve research quality and coordination. Draft resolution proposed by Argentina, Peru, United Kingdom of Great Britain and Northern Ireland [Internet]. 2017. https://www.who.int/news/item/18-05-2017-joint-statement-on-registration. Accessed 23 March 2023.
6. World Medical Association. World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects. JAMA. 2013;310(20):2191–2194.
7. Jones R, Younie S, MacAllister A, Thornton J. A comparison of the scientific quality of publicly and privately funded randomized controlled drug trials. J Eval Clin Pract. 2010;16(6):1322–1325.
8. Turner EH, Cipriani A, Furukawa TA, Salanti G, De Vries YA. Selective publication of antidepressant trials and its influence on apparent efficacy: updated comparisons and meta-analyses of newer versus older trials. PLoS Med. 2022;19:1.
9. Rodgers F, Pepperrell T, Keestra S, Pilkington V. Missing clinical trial data: the evidence gap in primary data for potential COVID-19 drugs. Trials. 2021;22:1.
10. Califf RM, Tabak LA. Citizen petition from Universities Allied for Essential Medicines North America to the Food & Drug Administration for increased enforcement of the ClinicalTrials.gov reporting requirements in the Food and Drug Administration Amendments Act of 2007.
11. Borysowski J, Wnukiewicz-Kozlowska A, Gorski A. Legal regulations, ethical guidelines and recent policies to increase transparency of clinical trials. Br J Clin Pharmacol [Internet]. 2020. Available from: https://pubmed.ncbi.nlm.nih.gov/32017178/
12. World Health Organisation. Joint statement on public disclosure of results from clinical trials [Internet]. 2017. https://www.who.int/news/item/18-05-2017-joint-statement-on-registration. Accessed 22 March 2023.
13. Bruckner T, Rodgers F, Styrmisdóttir L, Keestra S. Adoption of World Health Organization best practices in clinical trial transparency among European medical research funder policies. JAMA Netw Open. 2022;5(8):e2222378.
14. Knowles RL, Ha KP, Mueller J, Rawle F, Parker R. Challenges for funders in monitoring compliance with policies on clinical trials registration and reporting: analysis of funding and registry data in the UK. BMJ Open. 2020;10(2):e035283.
15. Field N, Cohen T, Struelens MJ, et al. Strengthening the reporting of molecular epidemiology for infectious diseases (STROME-ID): an extension of the STROBE statement. Lancet Infect Dis. 2014;14(4):341–352.
16. O’Riordan M, Bruckner T. EU funders 2022 update protocol_TB20220824.pdf [Internet]. 2022 Aug 24. osf.io/6w3nq. Accessed 23 March 2023.
17. O’Riordan M. Consilium-Scientific-Project [Internet]. https://github.com/mriordan7/Consilium-Scientific-Project/blob/main/README.md. Accessed 23 March 2023.
18. Gamertsfelder E, Bruckner T. Protocol-US_Final_EG2022_v5_TB20220613.pdf [Internet]. 2022 Jun 13. https://osf.io/ysjzf. Accessed 23 March 2023.
19. Rees CA, Narang C, Westbrook A, Bourgeois FT. Dissemination of the results of pediatric clinical trials funded by the US National Institutes of Health. JAMA. 2023;329(7):590–592.
20. NIH fails to enforce rules for reporting clinical trial results. The Scientist Magazine [Internet]. 2023. https://www.the-scientist.com/news-opinion/nih-fails-to-enforce-rules-for-reporting-clinical-trial-results-70392. Accessed 26 March 2023.
21. De Angelis C, Drazen JM, Frizelle FA, et al. Clinical trial registration: a statement from the International Committee of Medical Journal Editors. CMAJ. 2004;171(6):606.
22. UK Research and Innovation. MRC review of clinical trials – UKRI [Internet]. 2020. https://www.ukri.org/about-us/mrc/our-policies-and-standards/data-report-collections/review-of-clinical-trials/. Accessed 23 March 2023.
23. Bermejo A. Wellcome clinical trial policy monitoring 2018-2022 [Internet]. 2022. https://wellcome.org/grant-funding/guidance/wellcome-clinical-trial-policy-monitoring-2018-2022. Accessed 23 March 2023.
24. National Institute for Health and Care Research. Audit of compliance with NIHR clinical trial registration requirements (2019-20) [Internet]. 2021. https://www.nihr.ac.uk/documents/audit-of-compliance-with-nihr-clinical-trial-registration-requirements-2019-20/26840. Accessed 23 March 2023.