COMPare – The CEBM Outcome Monitoring Project

Posted on February 4, 2016


Students tracking outcome switching in clinical trials

Henry Drysdale and Aaron Dale

Outcome switching is a major problem in clinical trial reporting that seriously distorts the evidence on which we make clinical decisions. In short, it occurs when trialists fail to report pre-specified outcomes, or add novel outcomes, without declaring the change in the trial report. By leaving outcome switching undisclosed, journals allow, whether by error or design, interventions to appear better than they actually are. This misinforms doctors and risks considerable harm to patients.

We know that outcome switching remains highly prevalent in the medical literature. A recent systematic review found that a median of 31% (IQR 17-45%) of trials had discrepancies between the registered and published primary outcomes [1]. However, prevalence studies alone have clearly been unable to eradicate outcome switching, which led us to ask: what will? We believe that detailed assessments of specific trials, open sharing of results, and individual accountability for the trialists and journals involved will help to fix this flaw in evidence-based medicine.

COMPare (the CEBM Outcome Monitoring Project) is a team of 5 medical students, 3 senior researchers in evidence-based medicine, and an exceptionally talented programmer. For 6 weeks, we monitored all trials published in the top 5 general medical journals for outcome switching. We compared the pre-specified outcomes in the trial protocol or registry entry with those reported in the journal publication. We then recorded the number of missing pre-specified outcomes and the number of new outcomes silently added, and wrote a letter to the journal for each trial with misreported outcomes. We also published all our results, raw data and letters to journals on our website – COMPare-trials.org.
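At its core, the comparison described above is a set comparison between the pre-specified outcomes and the reported ones. A minimal sketch of that idea in Python (the function, the example outcomes, and the exact-match normalisation are all hypothetical illustrations; the real COMPare assessments were made by hand, judging wording, timing, and clinical equivalence rather than matching strings):

```python
def compare_outcomes(prespecified, reported):
    """Toy comparison of registered vs published trial outcomes.

    Returns (missing, added): pre-specified outcomes absent from the
    report, and reported outcomes that were never pre-specified.
    Uses naive case-insensitive exact matching; real assessment
    requires human judgement of wording and measurement timing.
    """
    pre = {o.strip().lower() for o in prespecified}
    rep = {o.strip().lower() for o in reported}
    missing = sorted(pre - rep)  # pre-specified but not reported
    added = sorted(rep - pre)    # reported but not pre-specified
    return missing, added

# Hypothetical example trial
missing, added = compare_outcomes(
    ["Pain score at 6 weeks", "Adverse events"],
    ["Pain score at 6 weeks", "Quality of life at 12 weeks"],
)
# missing -> ["adverse events"]; added -> ["quality of life at 12 weeks"]
```

A trial would count as correctly reported only when both lists come back empty; anything else is a candidate switch that the team then checked by hand.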

We have been amazed at our results so far. To date, we have assessed 67 trials, and only 9 correctly reported their outcomes. The remaining 58 all contained outcome switching, with a total of 301 unreported pre-specified outcomes and 357 novel outcomes silently added. Interestingly, these numbers have risen together and remained roughly equal, presenting a picture of true outcome “switching”: the problem of missing outcomes is as prevalent as the problem of reporting undeclared novel outcomes. These switches are not declared, legitimate adjustments to protocols or trivial changes of wording, but major undeclared additions or omissions that degrade the quality of the evidence presented, without the reader ever knowing.

The responses we’ve had from journals have been fascinating. They have varied from open discussion and transparent reporting of discrepancies, to rejection of our letters and a refusal to engage with the problem. Annals of Internal Medicine has published one of our letters in print [2], alongside a factually incorrect critique of our methods [3], with no attempt to correct the record [4]; not exactly good science, especially coming from the 4th highest impact factor journal in medicine. On the other hand, the BMJ, whilst not always publishing perfect trials, has been posting our letters, and recently published a correction to the REEACT trial based on our analysis [5]. This is exactly what we need and expect from major medical journals: transparent reporting, and open sharing of errors to correct the public record.

What has been particularly inspiring for us is the impact we have been able to make as medical students simply by auditing and reporting this problem in real time. We have learned that solutions to problems in evidence-based medicine can come from anyone in the medical profession, regardless of prior experience or seniority. The key for us has been clear recognition of a problem, a systematic approach to its analysis, and open communication of our findings.

In December we came to the end of 6 weeks of painstaking analysis: phase 1 of the COMPare project. For phase 2, we are blogging on individual trials, responding to journals and authors, and openly sharing the stories. We are also writing a paper with a full analysis of our results, including the prevalence of outcome switching in each individual journal, the publication status of our letters, and details of direct responses from journals. So please share our results, check them yourselves, get in touch with feedback, and discuss COMPare with anyone who will listen. We will be continually updating our website with news and results, blogging relentlessly, and shouting our findings from the rooftops.

References

[1] Jones CW, Keil LG, Holland WC, Caughey MC, Platts-Mills TF. Comparison of registered and published outcomes in randomized controlled trials: a systematic review. BMC Med. 2015;13:282. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4650202/

[2] Slade E, Drysdale H, Goldacre B. Discrepancies between prespecified and reported outcomes. Ann Intern Med. Published online 22 December 2015. Available from: http://www.bmj.com/content/351/bmj.h5627/rr-12

[3] Annals Editors. Discrepancies between prespecified and reported outcomes. Ann Intern Med. Published online 22 December 2015. Available from: http://annals.org/article.aspx?articleid=2478526

[4] COMPare blog post: http://compare-trials.org/blog/where-does-annals-of-internal-medicine-stand-on-outcome-switching-a-detailed-response/, last accessed 20/01/2016

[5] Correction on the REEACT trial, published 12/01/2016: http://www.bmj.com/content/352/bmj.i195, last accessed 20/01/2016

Henry Drysdale and Aaron Dale

Henry Drysdale is a graduate-entry medical student at St Anne’s College, Oxford. He graduated with a first class BSc in Physics from Imperial College London, where he specialised in medical imaging and computational physics. He is also a Physics and Maths tutor for A-Level and undergraduate students in Oxford. Henry has a special interest in anaesthetics, in which he plans to pursue a clinical and academic career. He is also passionate about improving the quality of evidence on which clinical decisions are made.

Aaron Dale is a graduate-entry medical student at Green Templeton College, University of Oxford. He holds a BA in Natural Sciences and an MSci in Biochemistry from Churchill College, University of Cambridge. Aaron completed an MRC Capacity-Building PhD Studentship in drug discovery at The School of Pharmacy, University College London, for work on the symmetric bis-benzimidazole series of compounds as potential anti-microbial agents. He has personally volunteered as a patient in several phase 1 clinical trials and has a keen interest in improving the standards of clinical trial reporting to build a stronger evidence base for medical treatments. He has contributed to several articles in the Student BMJ on how to improve evidence-based medicine.


Creative Commons License
COMPare – The CEBM Outcome Monitoring Project by Henry Drysdale and Aaron Dale is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Unless otherwise stated, all images used within the blog are not available for reuse or republication as they are purchased for Students 4 Best Evidence from shutterstock.com.
