Access to unpublished clinical study reports (CSRs) is currently being discussed as a means to allow unbiased evaluation of clinical research. The Institute for Quality and Efficiency in Health Care (IQWiG) routinely requests CSRs from manufacturers for its drug assessments.
Our objective was to determine the information gain from CSRs compared to publicly available sources (journal publications and registry reports) for patient-relevant outcomes included in IQWiG health technology assessments (HTAs) of drugs.
Methods and Findings
We used a sample of 101 trials with full CSRs received for 16 HTAs of drugs completed by IQWiG between 15 January 2006 and 14 February 2011, and analyzed the CSRs and the publicly available sources of these trials. For each document type we assessed the completeness of information on all patient-relevant outcomes included in the HTAs (benefit outcomes, e.g., mortality, symptoms, and health-related quality of life; harm outcomes, e.g., adverse events). We dichotomized the outcomes as “completely reported” or “incompletely reported.” For each document type, we calculated the proportion of outcomes with complete information per outcome category and overall.
We analyzed 101 trials with CSRs; 86 had at least one publicly available source, 65 at least one journal publication, and 50 a registry report. The trials included 1,080 patient-relevant outcomes. The CSRs provided complete information on a considerably higher proportion of outcomes (86%) than the combined publicly available sources (39%). With the exception of health-related quality of life (57%), CSRs provided complete information on 78% to 100% of the various benefit outcomes (combined publicly available sources: 20% to 53%). CSRs also provided considerably more information on harms. The differences in completeness of information for patient-relevant outcomes between CSRs and journal publications or registry reports (or a combination of both) were statistically significant for all types of outcomes.
The main limitation of our study is that our sample is not representative because only CSRs provided voluntarily by pharmaceutical companies upon request could be assessed. In addition, the sample covered only a limited number of therapeutic areas and was restricted to randomized controlled trials investigating drugs.
People assume that, when they are ill, health care professionals will ensure that they get the best available treatment. In the past, clinicians used their own experience to make decisions about which treatments to offer their patients, but nowadays, they rely on evidence-based medicine—the systematic review and appraisal of clinical trials, studies that investigate the benefits and harms of drugs and other medical interventions in patients. Evidence-based medicine can guide clinicians, however, only if all the results of clinical research are available for evaluation. Unfortunately, the results of trials in which a new drug performs better than existing drugs are more likely to be published than those in which the new drug performs badly or has unwanted side effects (publication bias). Moreover, trial outcomes that support the use of a new treatment are more likely to be published than those that do not support its use (outcome reporting bias). Both types of bias pose a substantial threat to informed medical decision-making.
Why Was This Study Done?
Recent initiatives, such as making registration of clinical trials in a trial registry (for example, ClinicalTrials.gov) a precondition for publication in medical journals, aim to prevent these biases but are imperfect. Another way to facilitate the unbiased evaluation of clinical research might be to increase access to clinical study reports (CSRs)—detailed but generally unpublished accounts of clinical trials. Notably, information from CSRs was recently used to challenge conclusions based on published evidence about the efficacy and safety of the antiviral drug oseltamivir and the antidepressant reboxetine. In this study, the researchers compare the information available in CSRs and in publicly available sources (journal publications and registry reports) for the patient-relevant outcomes included in 16 health technology assessments (HTAs; analyses of the medical implications of the use of specific medical technologies) for drugs; the HTAs were prepared by the Institute for Quality and Efficiency in Health Care (IQWiG), Germany's main HTA agency.
What Did the Researchers Do and Find?
The researchers searched for published journal articles and registry reports for each of 101 trials for which the IQWiG had requested and received full CSRs from drug manufacturers during HTA preparation. They then assessed the completeness of information on the patient-relevant benefit and harm outcomes (for example, symptom relief and adverse effects, respectively) included in each document type. Eighty-six of the included trials had at least one publicly available data source; the results of 15% of the trials were not available in either journals or registry reports. Overall, the CSRs provided complete information on 86% of the patient-related outcomes, whereas the combined publicly available sources provided complete information on only 39% of the outcomes. For individual outcomes, the CSRs provided complete information on 78%–100% of the benefit outcomes, with the exception of health-related quality of life (57%); combined publicly available sources provided complete information on 20%–53% of these outcomes. The CSRs also provided more information on patient-relevant harm outcomes than the publicly available sources.
What Do These Findings Mean?
These findings show that, for the clinical trials considered here, publicly available sources provide much less information on patient-relevant outcomes than CSRs. The generalizability of these findings may be limited, however, because the trials included in this study are not representative of all trials. Specifically, only CSRs that were voluntarily provided by drug companies were assessed, a limited number of therapeutic areas were covered by the trials, and the trials investigated only drugs. Nevertheless, these findings suggest that access to CSRs is important for the unbiased evaluation of clinical trials and for informed decision-making in health care. Notably, in June 2013, the European Medicines Agency released a draft policy calling for the proactive publication of complete clinical trial data (possibly including CSRs). In addition, the European Union and the European Commission are considering legal measures to improve the transparency of clinical trial data. Both these initiatives will probably only apply to drugs that are approved after January 2014, however, and not to drugs already in use. The researchers therefore call for CSRs to be made publicly available for both past and future trials, a recommendation also supported by the AllTrials initiative, which is campaigning for all clinical trials to be registered and fully reported.
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001526.
- Wikipedia has pages on evidence-based medicine, publication bias, and health technology assessment (note: Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
- The ClinicalTrials.gov website is a searchable register of federally and privately supported clinical trials in the US; it provides information about all aspects of clinical trials
- The European Medicines Agency (EMA) provides information about all aspects of the scientific evaluation and approval of new medicines in the European Union, and guidance on the preparation of clinical study reports; its draft policy on the release of data from clinical trials is available
- Information about IQWiG is available (in English and German); Informed Health Online is a website provided by IQWiG that provides objective, independent, and evidence-based information for patients (also in English and German)
Citation: Wieseler B, Wolfram N, McGauran N, Kerekes MF, Vervölgyi V, Kohlepp P, et al. (2013) Completeness of Reporting of Patient-Relevant Clinical Trial Outcomes: Comparison of Unpublished Clinical Study Reports with Publicly Available Data. PLoS Med 10(10): e1001526. https://doi.org/10.1371/journal.pmed.1001526
Academic Editor: Davina Ghersi, National Health & Medical Research Council, Australia
Received: May 10, 2013; Accepted: August 29, 2013; Published: October 8, 2013
Copyright: © 2013 Wieseler et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: This work was supported by the Institute for Quality and Efficiency in Health Care (IQWiG). No external funding was received. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: All authors are employees of the Institute for Quality and Efficiency in Health Care (IQWiG). To produce unbiased HTA reports, the Institute depends on access to all of the relevant data on the topic under investigation. The authors therefore support public access to clinical study reports.
Abbreviations: AE, adverse event; CSR, clinical study report; EMA, European Medicines Agency; HRQoL, health-related quality of life; HTA, health technology assessment; IQWiG, Institute for Quality and Efficiency in Health Care; SAE, serious adverse event
Publication bias and outcome reporting bias pose a substantial threat to the validity of clinical research findings and thus to informed decision-making in health care [1,2]. In recent years, major initiatives to prevent or at least identify these biases have been implemented, such as registration of clinical trials as a precondition for publication in medical journals in 2005, or mandatory trial registration and reporting of methods and results in ClinicalTrials.gov following the Food and Drug Administration Amendments Act of 2007. However, the application of these measures has been insufficient [5–8], and they also contain several loopholes. For instance, the measures do not apply to clinical trials completed before 2005 and 2007, respectively, and provide only summarized information, preventing full evaluation.
Various types of formats exist for reporting clinical trials of drugs: journal publications and reports from trial registries and results databases—hereafter referred to as “registry reports”—make summaries of trials publicly available (e.g., to clinicians and authors of systematic reviews). These publicly available formats currently represent the main information source for clinical and health policy decision-making. Reporting standards for these two formats include the Consolidated Standards of Reporting Trials (CONSORT) for journal publications and the Food and Drug Administration Amendments Act for registry reports on trials of US Food and Drug Administration–regulated drugs and medical devices. In contrast to the first two formats, clinical study reports (CSRs) are detailed accounts of trials generally prepared following the International Conference on Harmonisation's Guideline for Industry: Structure and Content of Clinical Study Reports (ICH E3). The value of additional information from CSRs in drug assessment has been shown in the cases of the antiviral oseltamivir (Tamiflu) and the antidepressant reboxetine, in which conclusions on these drugs based on published evidence alone were challenged and in part even reversed by unpublished information from CSRs [12,13].
To date, CSRs have been used to inform regulatory decision-making, but they are generally not publicly available. The few cases in which CSRs have been used for drug evaluation outside regulatory agencies required major efforts by researchers to gain access to the documents [14–16]. However, the European Medicines Agency (EMA) has launched an initiative to improve transparency in clinical research by providing unpublished clinical trial data [17,18]. This initiative also involves a discussion of the data formats to be made publicly available, and CSRs are being considered, in addition to individual patient data. Furthermore, legal measures to improve transparency have been proposed by the European Commission and the European Parliament [21,22], also addressing the extent of trial data to be published. Thus, the role of CSRs in the evaluation of clinical trials is currently of particular importance, and we would like to further inform the current debate with our experiences.
Health Technology Assessments of Drugs at the Institute for Quality and Efficiency in Health Care
The Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen; IQWiG), established in 2004, is Germany's main health technology assessment (HTA) agency. Its primary responsibility is the production of HTA reports on drugs and non-drug interventions based on the analysis of patient-relevant outcomes, i.e., outcomes describing morbidity, mortality, and health-related quality of life (HRQoL). These reports inform health policy decision-making in the German statutory health care system. IQWiG attempts to obtain the most complete information possible for its HTAs. For this purpose, during the preparation of a drug report, besides systematically searching bibliographic databases and trial (results) registries, we routinely ask the manufacturer to provide an overview of sponsored published and unpublished clinical trials of the drug under assessment. From this list we select the trials deemed relevant to the assessment and ask the manufacturer to submit the full CSRs. However, except for early assessments of new drugs (which are not the subject of this article), the manufacturer is not obliged to provide CSRs.
Previous Study of Clinical Study Reports versus Publicly Available Sources
In a previous study investigating the availability of information on methods and selected outcomes of clinical trials in different types of reporting formats, we used the pool of randomized controlled trials and corresponding documents (CSRs, journal publications, registry reports) included in HTAs of drugs prepared by IQWiG (see below). This previous study showed that journal publications and registry reports had different strengths and weaknesses and that, overall, the CSRs provided considerably more complete information on items relating to methods and selected outcomes than publicly available sources.
Rationale for Current Study
The previous study investigated only a limited range of outcomes, i.e., primary outcomes (irrespective of whether they were patient-relevant or not) and some adverse event (AE) outcomes. However, as stated, our HTAs are generally based on a wide range of patient-relevant outcomes (irrespective of whether they are primary outcomes or not). We hypothesized that the information gain from CSRs versus publicly available sources could be even greater for patient-relevant outcomes (which are often non-primary) than for the subset of outcomes investigated in the previous study. In the current study we therefore investigated the information gain for all patient-relevant outcomes included in our HTAs. We also aimed to characterize the information gain from CSRs for various types of patient-relevant outcomes.
The methods for the current study were largely based on those described previously. In the previous study, we included all HTAs of drugs finalized by IQWiG between 15 January 2006 and 14 February 2011, which—besides a systematic search for journal publications—contained a systematic search for registry reports as part of the information retrieval process. The systematic search generally covered MEDLINE, Embase, and the databases of the Cochrane Library, as well as ClinicalTrials.gov, the International Clinical Trials Registry Platform of the World Health Organization, the Clinical Trials Portal of the International Federation of Pharmaceutical Manufacturers and Associations, the Clinical Study Results Database of the Pharmaceutical Research and Manufacturers of America, and the trial registries and results databases of the manufacturers of the drugs under investigation. In addition, for all HTAs, CSRs were requested from the manufacturers of the drugs under assessment.
In the previous study we included all 286 trials and corresponding documents (101 CSRs, 192 journal publications, and 78 registry reports) considered in the 16 HTAs. For the current study, we used the same pool of HTAs, but included only the 101 trials from the original pool of 286 trials for which the manufacturer had provided a full CSR. “Full” referred to the availability of a core text (including a full description of methods and results) and all tables and figures, as well as appendices (e.g., protocol or statistical analysis plan) if they were referenced in the core text with only insufficient information provided in the text. None of these CSRs were publicly available at the time of preparation of the HTAs.
As stated, the previous study investigated the reporting of only a limited set of trial outcomes in the various reporting formats. In contrast, our current study aimed to characterize reporting in CSRs versus publicly available documents for all patient-relevant outcomes considered in our HTAs. These outcomes had been prespecified in the HTA protocols during the preparation of the 16 HTAs, and had been identified in the three reporting formats by systematically screening all of the available CSRs, registry reports, and journal publications.
We entered all data for the previous and current study into a Microsoft Access database. The database contained information on the characteristics of the HTA, the type of document available, as well as basic trial and document characteristics (see Table 1). For the current study we also entered data on the reporting quality of all patient-relevant outcomes as described below. In addition, we classified these outcomes as mortality, clinical events, symptoms, or HRQoL (benefit outcomes), as well as AEs, serious AEs (SAEs), AEs of special interest, or withdrawals due to AEs (harm outcomes); please see Table 2 and Box 1 for coding definitions.
Box 1. Coding of Outcome Categories
Mortality (benefit outcome): Any event/complication of the disease resulting in death, i.e., overall mortality and event-specific mortality, e.g., fatal myocardial infarction in diseases in which myocardial infarction is a late complication of the disease.
Clinical events (benefit outcome): Any event (other than an AE) based on a clinical diagnosis, e.g., nonfatal stroke or nonfatal myocardial infarction (if complication of investigated disease), asthma exacerbation.
Symptoms (benefit outcome): Any signs of the disease based on the description by the patient, e.g., asthma symptoms, pain, symptoms of depression.
Health-related quality of life (benefit outcome): Trial outcomes based on multidimensional questionnaires describing the impact of the disease and its treatment on physical, psychological, and social functioning and well-being, e.g., outcomes based on the Short Form 36 Questionnaire or the Asthma Quality of Life Questionnaire.
AE categories (harm outcome): Trial outcomes specified as AEs, SAEs, or withdrawals due to AEs based on the definitions used in the CSRs (usually according to definitions for clinical safety data management according to the International Conference on Harmonisation).
Our requirements for complete reporting of patient-relevant outcomes were based on the requirements of authors of systematic reviews (i.e., provision of adequate information for assessment of risk of bias and adequate data for meta-analyses). Completeness of the information provided for the patient-relevant outcomes was recorded as (1) completely reported including numerical data, (2) partly reported including numerical data, (3) verbally reported without numerical data, or (4) not reported. A definition of all categories is provided in Table 2. In the assessment of completeness of information in journal publications, if more than one journal publication was available and an outcome was completely reported in one publication but not in the other(s), reporting of the outcome was still classified as “complete.”
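The four-level coding and its collapse into the binary classification used in the analysis can be sketched as follows (a hypothetical illustration of the scheme described above, not the authors' actual tooling; all names and the example data are assumptions):

```python
# Hypothetical sketch: category 1 counts as "complete";
# categories 2-4 count as "incomplete" (see Table 2 of the study).

COMPLETE = 1      # completely reported including numerical data
PARTLY = 2        # partly reported including numerical data
VERBAL = 3        # verbally reported without numerical data
NOT_REPORTED = 4  # not reported

def dichotomize(category: int) -> str:
    """Collapse the four reporting levels into the binary classification."""
    return "complete" if category == COMPLETE else "incomplete"

def proportion_complete(categories: list[int]) -> float:
    """Proportion of outcomes with complete reporting (category 1)."""
    return sum(c == COMPLETE for c in categories) / len(categories)

# Example: ten outcomes coded for one document type (invented data)
codes = [1, 1, 4, 2, 1, 3, 1, 4, 1, 1]
print(dichotomize(PARTLY))         # incomplete
print(proportion_complete(codes))  # 0.6
```

The same proportion can be computed per outcome category (mortality, symptoms, AEs, and so on) simply by filtering the coded outcomes before calling `proportion_complete`.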
All data for the current study were extracted and coded by one author. All data from registry reports and all classifications of patient-relevant outcomes were independently checked by a second author. In addition, a random sample of 10% of the data and codings for trial outcomes from CSRs and journal publications was also independently checked by a second author (agreement between authors for CSRs: 99%; for journal publications: 97%). Discrepancies were resolved by consensus, if necessary, after discussion with a third author.
To quantify the information gain through CSRs, we calculated the proportion of outcomes with complete reporting (category 1 above) and incomplete reporting (categories 2–4 above) for CSRs and publicly available sources (journal publications, registry reports, and the combination of both). Besides presenting the dichotomous categories “complete reporting” versus “incomplete reporting,” we also presented separately the three categories of incomplete reporting (categories 2–4 above). In addition, we performed direct comparisons of trials for which CSRs as well as journal publications and/or registry reports were available. To investigate completeness of reporting over time, we calculated the proportion of outcomes with complete reporting in the different document types stratified by year of finalization of the CSRs.
The proportion of outcomes with complete reporting was compared between CSRs and journal publications or registry reports (or a combination of both) using the McNemar test to take the potential dependency of samples into account. The data were analyzed using SAS 9.2.
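The paired comparison can be sketched with a McNemar test using only the discordant pairs (outcomes complete in one document type but not the other). This is a minimal stdlib-only illustration with continuity correction; the counts are invented for illustration and are not the study's data, which were analyzed with SAS 9.2:

```python
from math import erfc, sqrt

def mcnemar(b: int, c: int) -> tuple[float, float]:
    """McNemar test with continuity correction for paired binary data.

    b: outcomes completely reported in CSRs but not in public sources
    c: outcomes completely reported in public sources but not in CSRs
    Returns (chi-square statistic, two-sided p-value), df = 1.
    """
    if b + c == 0:
        return 0.0, 1.0
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    # For df = 1, the chi-square survival function is erfc(sqrt(x / 2)).
    p = erfc(sqrt(stat / 2))
    return stat, p

# Invented discordant-pair counts for illustration
stat, p = mcnemar(b=50, c=2)
print(f"chi2 = {stat:.2f}, p = {p:.2e}")
```

The concordant pairs (complete in both, or incomplete in both) do not enter the statistic, which is why McNemar's test is appropriate here: each outcome is assessed twice, once per document type, so the two samples are dependent.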
The manuscripts of the previous and current study show a minor overlap of results data: as AEs were investigated under the research questions of both studies, both manuscripts report the proportion of outcomes with complete information in CSRs for AEs (92%), SAEs (88%), and withdrawals due to AEs (91%) (see Table 3).
Table 1 shows the characteristics of the trials, documents, and patient-relevant outcomes included in our sample. We analyzed 101 trials with CSRs. These CSRs were prepared between 24 September 1989 and 29 January 2010. The pool of clinical trials included nearly 70,000 patients and covered six different therapeutic areas (mainly depression and type 1 and type 2 diabetes; the drugs assessed are listed by therapeutic area in Table 4). Of the 101 trials, 90 were efficacy trials; 86 had at least one publicly available source, 65 had at least one journal publication, and 50 had a registry report. For 15 trials, the CSR was the only source of information available.
The 101 trials included 1,080 outcomes classified by IQWiG as patient-relevant and considered in the pool of HTAs. Among the benefit outcomes, symptoms were investigated most often, whereas HRQoL was investigated least often. Among the harm outcomes, overall rates of AEs, SAEs, and withdrawals due to AEs were available for each trial; the harm outcomes considered most often were AEs of special interest in the given indication.
Overall Completeness of Information in Clinical Study Reports versus Publicly Available Sources
Table 3 shows the completeness of information for trial outcomes by reporting format in the full trial sample. The CSRs provided complete information on a considerably higher proportion of patient-relevant outcomes (86%) than journal publications and registry reports, even if these two sources were combined (39%). With the exception of HRQoL (57%), CSRs provided complete information on 78% to 100% of benefit outcomes. The highest value was achieved for mortality (100%). The corresponding values for combined publicly available sources were considerably lower; completeness of reporting ranged from 20% to 53%. CSRs provided complete information on 84% to 92% of harm outcomes. Again, the corresponding values for the combined publicly available sources were considerably lower (27% to 72%). The comparison of journal publications and registry reports showed that, overall, completeness of information was similar for benefit outcomes (19%) and harm outcomes (25% to 26%). However, when specific outcomes were considered, the two reporting formats showed different levels of completeness (e.g., for clinical events or the overall rate of AEs).
The differences in completeness of information for patient-relevant outcomes between CSRs and journal publications or registry reports (or a combination of both) were statistically significant for all types of outcomes (see Table 5).
Publication Bias and Outcome Reporting Bias
In addition to analyzing the proportion of outcomes for which complete information was available, we also aimed to further describe the reporting of outcomes with incomplete information. Table 6 presents the pattern of reporting of all patient-relevant outcomes in journal publications and/or registry reports. The data show that most outcomes that were not reported completely were not reported at all, except for outcomes on symptoms and HRQoL, which were reported partly with data or only verbally without data in 35% to 40% of cases. However, even for these two outcome categories, a large proportion of outcomes was not available at all from publicly available sources. Non-availability of outcomes was due either to lack of reporting of these outcomes even though a publication and/or registry report was available (34% of all outcomes, outcome reporting bias) or to lack of reporting of the whole trial (13%, publication bias). Tables S1 and S2 show the same type of analysis for journal publications and registry reports separately. Table 4 provides examples of the numerous patient-relevant outcomes (including outcomes of major clinical relevance, such as overall mortality and potentially life-threatening events) not reported in the publicly available sources, by therapeutic area and outcome category.
Matched Pairs of Clinical Study Reports versus Publications and/or Registry Reports
The results presented so far describe the completeness of information in publicly available reporting formats for a given sample of trials (101 trials with CSRs). Part of the differences described resulted from the fact that journal publications or registry reports were not available for all trials in our sample, as it also included unpublished trials only reported in CSRs. To investigate whether CSRs provided superior information when they were directly compared to the corresponding journal publications and registry reports, we analyzed the completeness of information in CSRs versus the publicly available sources in samples including only trials for which the respective source was available (see Tables 7 and S3–S5). Overall, each of these analyses confirmed that a substantial amount of additional information on patient-relevant outcomes is gained from CSRs compared to journal publications or registry reports (or a combination of both), even for published trials.
Completeness of Reporting over Time
To investigate completeness of reporting over time, we analyzed the availability of trial reports in publicly available sources as well as the proportion of completely reported outcomes in the three document types over time (Table 8). The analysis showed an increasing availability of trials in combined publicly available sources over time (from 71% to 95%). However, the proportion of trials available in journal publications dropped to about 50% for trials with CSRs finalized between 2005 and 2010. We hypothesized that this decrease occurred because the literature searches in the HTAs were conducted too soon after finalization of the CSRs to allow sufficient time for preparation and publication of a manuscript. We therefore performed a sensitivity analysis in which all trials with a CSR finalization date less than 2 y before the search date of the HTA were classified as published in a journal. This “best-case scenario” resulted in an availability rate of trials in journal publications of 81%. We also performed the same type of analysis for registry reports; the corresponding rate was 93%.
However, in our sample, high availability rates of trial reports in publicly available sources did not result in high rates of completely reported patient-relevant outcomes: for instance, even for trials for which the availability rate in combined publicly available sources was more than 90%, less than 50% of patient-relevant outcomes were completely reported. In contrast, after 1995, CSRs consistently provided complete information for more than 90% of patient-relevant outcomes.
Summary of Findings
To our knowledge the current study quantifies for the first time how much information on a wide range of patient-relevant outcomes included in a large pool of clinical trials can be gained from making full CSRs available. Our findings show that a substantial amount of information on patient-relevant outcomes required for unbiased trial evaluation is missing from the public record. This is all the more important as such outcomes are preferably considered in comparative effectiveness research and consequently in health policy and clinical decision-making [25,26]. At the same time, this information can be obtained from CSRs, i.e., from documents routinely prepared by sponsors of clinical trials, but not usually made publicly available. More than twice as much information on patient-relevant outcomes can be gained from CSRs as from publicly available sources (86% versus 39% completely reported outcomes). Moreover, CSRs not only provide patient-relevant information in cases where journal publications and registry reports are missing, they also present additional information in cases where trials have been reported in journals or registries. The differences in information gain from the different reporting formats are due to a general superiority of CSRs over publicly available sources, demonstrated by the higher proportion of completely reported outcomes in a matched sample of CSRs and publicly available documents. Our findings also again confirm the existence of considerable publication and outcome reporting bias in clinical research: 36% of the trials in our pool were not published in journal publications, 15% had no publicly available reports at all, and even for trials with publicly available reports, 34% of patient-relevant outcomes, including outcomes of major clinical relevance, were not reported.
Our analysis of completeness of reporting over time showed that although the rate of trials made available in journal publications and registry reports is increasing, the rate of completeness of information on patient-relevant outcomes in these sources is not. These findings show that new approaches are needed. It is insufficient to aim for a journal publication rate of 100%. What is needed is public availability of CSRs, and thus of documents presenting trial results to a level of detail required for full evaluation of a trial.
Comparison with Previous Research
Because CSRs are generally not publicly available, only a few researchers have investigated their content and their possible role in providing information on clinical trials. Doshi and Jefferson analyzed a sample of 78 CSRs and showed that CSRs had a median length of about 450 pages of text and main tables plus an additional 550 pages of efficacy and safety listings. Vedula et al. compared unpublished internal company documents from the gabapentin litigation case (unpublished protocols, statistical analysis plans, and research reports) with trial publications [28,29]. Besides identifying several inconsistencies in the corresponding trial publications, they found that the unpublished documents provided more extensive documentation of methods planned and used, as well as trial findings. The research already cited analyzing CSRs on oseltamivir (Tamiflu) and reboxetine showed that prior conclusions on a drug's benefits and harms based on published evidence alone could no longer be upheld when information from CSRs became available [12,13]. Our previous study of CSRs showed that considerably more relevant information on trial methods, primary outcomes, and some AE outcomes can be gained from CSRs. In the current study, information gain from CSRs versus publicly available sources was even higher for a full set of patient-relevant outcomes than for the limited set of trial outcomes investigated in our previous study. While the proportion of completely reported primary and AE outcomes in the previous study was 91% for CSRs, 52% for journal publications, and 71% for registry reports, the corresponding values for the full range of (primary and non-primary) patient-relevant outcomes investigated in the current study were 86%, 23%, and 22%, respectively.
Relevance of Full Trial Information for Everyday Patient Care
Our findings suggest that oseltamivir and reboxetine might not be the only cases in which conclusions on benefits and harms might be changed by making full information on all clinical trials available to independent researchers and subsequently to clinicians and patients. Access to CSRs would thus allow informed decision-making and directly influence patient care.
The goal of assessing the full information from CSRs is not only to determine the benefits and harms of a single drug, but also to investigate the position of a drug in the given therapeutic area. For this purpose, comparative effectiveness research is gaining momentum both in the US and in Europe [25,30]. This area of research would specifically benefit from full CSRs being publicly available. As direct comparisons of alternative treatment methods are not available for all comparative effectiveness research questions, indirect comparisons will become more important, and CSRs are essential sources to inform meaningful indirect comparisons. This is because, firstly, indirect comparisons require detailed information on methods (i.e., a full protocol) of the clinical trials of interest, as well as on the trial population, to assess whether indirect comparisons within a given pool of trials are appropriate at all; this type of information is available in CSRs. Secondly, indirect comparisons require full numerical information on all relevant outcomes for network meta-analyses; as our analyses show, such extensive information is provided only in CSRs.
How Can Full Access to Clinical Study Reports Be Achieved?
As stated, the EMA intends to proactively publish complete clinical trial data, possibly including CSRs, from January 2014 onwards, and for this purpose has held extensive consultations with advisory groups of stakeholders and other interested parties and has published a draft policy. An even clearer solution would be a legal requirement to make CSRs publicly available, as is currently being discussed for the planned European legislation on clinical research.
However, both initiatives have a potential major flaw: they would probably apply only to drugs approved from January 2014 onwards, or to trials conducted after the new legislation comes into effect. This would present a problem because most drugs in current use would not be covered by the new measures, yet these drugs will still be widely used in clinical practice for years to come. Thus, although comprehensive information would in future be available for newer drugs, published information on the majority of drugs would remain biased, hampering meaningful comparison of alternative treatment methods. In addition, open questions about drugs in current use may never be answered. This is particularly relevant for drugs with a substantial public health impact, such as oseltamivir. The CSRs in our pool of trials were prepared between 1985 and 2010 and prove the value of CSRs for drugs in current use. The CSRs of such drugs that were submitted to regulatory authorities should therefore be made publicly available in a central repository to complete the evidence base. Pharmaceutical companies and non-industry trial sponsors could also release CSRs on their own initiative, thus underlining their commitment to transparency.
In line with our point of view, a further initiative to promote trial registration and reporting of full methods and results, the AllTrials initiative (http://www.alltrials.net/), also specifically refers to “past and present” clinical trials.
Further Rocks on the Road to Full Data Transparency
It should be noted that the full implementation of the new EMA policy is in jeopardy as the pharmaceutical industry, which has previously expressed its reservations about the policy, is taking legal action: two Freedom of Information requests were made to the EMA under its current data transparency policy to release individual patient data for adalimumab (Humira), a tumor necrosis factor inhibitor approved for rheumatoid arthritis and other indications. However, the company AbbVie has sought an injunction to block the EMA from releasing the data. A second company, InterMune, has also taken legal action against the EMA. The interim decision by the General Court of the European Union is in favor of the companies: the EMA has been ordered not to provide documents until a final ruling is given by the Court. The two court cases seem to represent not just the policy of single companies but a general industry strategy, since both European and US pharmaceutical industry bodies have lodged supportive pleas. This action contradicts repeated assertions by industry that it supports data transparency.
Our study has a number of limitations. First of all, we were not able to investigate a representative or random sample of CSRs, because these documents are usually not available outside pharmaceutical companies and regulatory agencies. Therefore, our sample was based on CSRs provided voluntarily by pharmaceutical companies upon request during our assessment procedures. We did not receive a (full) CSR for 62% (167/268) of the trials included in our HTAs and thus had to exclude these trials from our current study. The excluded and included trials showed differences in the therapeutic areas investigated (see Table S6); for example, the former comprised a higher proportion of trials on depression (57% versus 40%), but a lower proportion of trials on diabetes (21% versus 44%). In addition, a higher proportion of the excluded trials were reported in journal publications (76% versus 65%), whereas a lower proportion of these trials were reported in registry reports (17% versus 50%). It is unclear whether our results would have been different if they had been based on a random sample of CSRs. Furthermore, our sample covered only a limited number of therapeutic areas and was restricted to randomized controlled trials investigating drugs, so we cannot comment on other trial designs or trials of non-drug interventions. In addition, the registry reports included were generated by a limited number of pharmaceutical companies and were not prepared according to the requirements of the Food and Drug Administration Amendments Act; future reports in ClinicalTrials.gov may be of better quality.
The dataset for our study was generated in 2011. We did not perform an update of the dataset as this would have required a major investment of resources. However, we believe that this dataset, which includes a total of 101 trials with 1,080 patient-relevant outcomes, is large enough to produce meaningful results.
We also note that several further issues related to CSRs could be investigated in future research. For example, except for the case of reboxetine, we can make statements only about the completeness of information in CSRs versus publicly available sources; we did not analyze how often the inclusion of data from CSRs changed the interpretation of the overall results of a study. Moreover, we did not investigate whether certain study characteristics (e.g., enrollment size) influenced completeness of reporting, nor did we specifically describe study protocols included in CSRs.
Information on patient-relevant outcomes investigated in clinical trials is insufficient in publicly available sources; considerably more information can be gained from CSRs. CSRs should be made publicly available as they may substantially influence conclusions concerning the actual position of an individual drug in a therapeutic area. Our findings underline the importance of CSRs—both for past and future trials—for unbiased trial evaluation, thus supporting informed decision-making in health care.
Pattern of reporting of trial outcomes in journal publications (sample: all trials with a CSR, n = 101).
Pattern of reporting of trial outcomes in registry reports (sample: all trials with a CSR; n = 101).
Analysis of completeness of information for trial outcomes in CSRs versus journal publications (sample: all trials with both a CSR and a journal publication; n = 65).
Analysis of completeness of information for trial outcomes in CSRs versus registry reports (sample: all trials with both a CSR and a registry report; n = 50).
Analysis of completeness of information for trial outcomes in CSRs versus the combination of journal publications and registry reports (sample: all trials with a CSR, a journal publication, and a registry report; n = 29).
Characteristics of excluded trials and documents.
Conceived and designed the experiments: BW. Analyzed the data: VV UG. Wrote the first draft of the manuscript: NM BW NW. Contributed to the writing of the manuscript: BW NW NM MFK VV PK MK UG. ICMJE criteria for authorship read and met: BW NW NM MFK VV PK MK UG. Agree with manuscript results and conclusions: BW NW NM MFK VV PK MK UG. Extracted and coded the data: MFK NW PK. Checked the data and codings: MFK NW VV PK MK.
- 1. Song F, Parekh S, Hooper L, Loke YK, Ryder J, et al. (2010) Dissemination and publication of research findings: an updated review of related biases. Health Technol Assess 14: 1–220.
- 2. McGauran N, Wieseler B, Kreis J, Schuler YB, Kolsch H, et al. (2010) Reporting bias in medical research—a narrative review. Trials 11: 37.
- 3. De Angelis C, Drazen JM, Frizelle FA, Haug C, Hoey J, et al. (2004) Clinical trial registration: a statement from the International Committee of Medical Journal Editors. N Engl J Med 351: 1250–1251.
- 4. (2007) Food and Drug Administration Amendments Act of 2007. US Public Law 110-85 section 801. Washington (District of Columbia): Food and Drug Administration. Available: http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=110_cong_public_laws&docid=f:publ085.110.pdf. Accessed 23 May 2013.
- 5. Prayle AP, Hurley MN, Smyth AR (2012) Compliance with mandatory reporting of clinical trial results on ClinicalTrials.gov: cross sectional study. BMJ 344: d7373.
- 6. Kunath F, Grobe HR, Keck B, Rucker G, Wullich B, et al. (2011) Do urology journals enforce trial registration? A cross-sectional study of published trials. BMJ Open 1: e000430.
- 7. Mathieu S, Boutron I, Moher D, Altman DG, Ravaud P (2009) Comparison of registered and published primary outcomes in randomized controlled trials. JAMA 302: 977–984.
- 8. Huic M, Marusic M, Marusic A (2011) Completeness and changes in registered data and reporting bias of randomized controlled trials in ICMJE journals after trial registration policy. PLoS ONE 6: e25258
- 9. Turner EH (2008) Closing a loophole in the FDA Amendments Act. Science 322: 44–46.
- 10. Schulz KF, Altman DG, Moher D Group C (2010) CONSORT 2010 statement: updated guidelines for reporting parallel group randomized trials. Ann Intern Med 152: 726–732.
- 11. International Conference on Harmonisation (1996) Guideline for industry: structure and content of clinical study reports. Available: http://www.fda.gov/downloads/regulatoryinformation/guidances/ucm129456.pdf. Accessed 23 May 2013.
- 12. Doshi P, Jefferson T, Del Mar C (2012) The imperative to share clinical study reports: recommendations from the Tamiflu experience. PLoS Med 9: e1001201
- 13. Eyding D, Lelgemann M, Grouven U, Harter M, Kromp M, et al. (2010) Reboxetine for acute treatment of major depression: systematic review and meta-analysis of published and unpublished placebo and selective serotonin reuptake inhibitor controlled trials. BMJ 341: c4737.
- 14. Gotzsche PC, Jorgensen AW (2011) Opening up data at the European Medicines Agency. BMJ 342: d2686.
- 15. Doshi P (2009) Neuraminidase inhibitors—the story behind the Cochrane review. BMJ 339: b5164.
- 16. Wieseler B, McGauran N, Kaiser T (2010) Finding studies on reboxetine: a tale of hide and seek. BMJ 341: c4942.
- 17. Eichler HG, Abadie E, Breckenridge A, Leufkens H, Rasi G (2012) Open clinical trial data for all? A view from regulators. PLoS Med 9: e1001202.
- 18. European Medicines Agency (2012) Release of data from clinical trials. Available: http://www.emea.europa.eu/ema/index.jsp?curl=pages/special_topics/general/general_content_000555.jsp&mid=WC0b01ac0580607bfa. Accessed 23 May 2013.
- 19. European Medicines Agency (2013 Apr 30) European Medicines Agency publishes final advice from clinical-trial advisory groups. Available: http://www.ema.europa.eu/ema/index.jsp?curl=pages/news_and_events/news/2013/04/news_detail_001778.jsp&mid=WC0b01ac058004d5c1. Accessed 23 May 2013.
- 20. European Medicines Agency (2013) Draft policy 70: publication and access to clinical-trial data. Available: http://www.emea.europa.eu/ema/index.jsp?curl=pages/includes/document/document_detail.jsp?webContentId=WC500144730&mid=WC0b01ac058009a3dc. Accessed 18 July 2013.
- 21. Watson R (2012) European commission proposes new laws to halt decline in number of clinical trials. BMJ 345: e4901.
- 22. European Parliament (2013) ***I report on the proposal for a regulation of the European Parliament and of the Council on clinical trials on medicinal products for human use, and repealing Directive 2001/20/EC (COM(2012)0369 – C7-0194/2012 – 2012/0192(COD)). Available: http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2f%2fEP%2f%2fNONSGML%2bREPORT%2bA7-2013-0208%2b0%2bDOC%2bPDF%2bV0%2f%2fEN. Accessed 18 July 2013.
- 23. Wieseler B, Kerekes MF, Vervoelgyi V, McGauran N, Kaiser T (2012) Impact of document type on reporting quality of clinical drug trials: a comparison of registry reports, clinical study reports, and journal publications. BMJ 344: d8141.
- 24. Higgins JPT, Green S (2011) Cochrane handbook for systematic reviews of interventions, version 5.1.0. New York: Wiley.
- 25. European Network for Health Technology Assessment (2011) EUnetHTA JA WP5: relative effectiveness assessment (REA) of pharmaceuticals. Available: http://www.eunethta.eu/sites/5026.fedimbo.belgium.be/files/Final%20version%20of%20Background%20Review%20on%20Relative%20Effectiveness%20Assessmentappendix.pdf. Accessed 23 May 2013.
- 26. Sox HC, Helfand M, Grimshaw J, Dickersin K, the PLoS Medicine Editors, et al. (2010) Comparative effectiveness research: challenges for medical journals. PLoS Med 7: e1000269.
- 27. Doshi P, Jefferson T (2013) Clinical study reports of randomised controlled trials: an exploratory review of previously confidential industry reports. BMJ Open 3: e002496.
- 28. Vedula SS, Bero L, Scherer RW, Dickersin K (2009) Outcome reporting in industry-sponsored trials of gabapentin for off-label use. N Engl J Med 361: 1963–1971.
- 29. Vedula SS, Li T, Dickersin K (2013) Differences in reporting of analyses in internal company documents versus published trial reports: comparisons in industry-sponsored trials in off-label uses of gabapentin. PLoS Med 10: e1001378
- 30. Sox HC (2010) Comparative effectiveness research: a progress report. Ann Intern Med 153: 469–472.
- 31. Barbour V, Clark J, Connell L, Ross A, Simpson P, et al. (2013) Getting more generous with the truth: clinical trial reporting in 2013 and beyond. PLoS Med 10: e1001379
- 32. Jack A (2013 Mar 10) Pharma group sues European regulator over data. Financial Times
- 33. European Medicines Agency (2013 Apr 30) European Medicines Agency receives interim decisions of the General Court of the EU on access to clinical and non-clinical information. Available: http://www.ema.europa.eu/ema/index.jsp?curl=pages/news_and_events/news/2013/04/news_detail_001779.jsp&mid=WC0b01ac058004d5c1. Accessed 23 May 2013.