Reflections on David Sackett's time at the Centre for Evidence-Based Medicine
1 June 2015
Research reviews & expert opinions
David Sackett, the founder of the Centre for Evidence-Based Medicine (CEBM), died Wednesday, May 13, at age 80. He spent 5 years in Oxford but had a tremendous impact that continues to this day. In 1994, as is well documented, Dave Sackett moved to Oxford to take up the post of Director of the Centre for Evidence-Based Medicine and Professor of Evidence-Based Medicine at the University of Oxford.
The post came about largely as a result of Sir Muir Gray’s intervention; at the time he was head of Research and Development for the Oxford East Anglian Region of England. Dave arrived in Oxford in July 1994, at a time that saw the initiation of the Cochrane Collaboration under the leadership of Sir Iain Chalmers, the Centre for Statistics in Medicine led by Doug Altman, and the Clinical Trial Service Unit led by Richard Peto and Rory Collins.
The NHS R&D Centre for Evidence-Based Medicine, as it was called at the time, was opened in March 1995. The offices were on the 5th floor of the John Radcliffe Hospital, next to the toilets, which were later converted into more office space for visitors to the centre.
The initial remit of the centre was twofold:
1. To support the teaching and practice of evidence based health care (‘EBHC’) throughout the UK and Europe.
2. To effect the creation of formal graduate education in the conduct of randomised controlled trials and systematic reviews at the University of Oxford.
The initial prospectus, a copy of which can be downloaded (1996 CEBM prospectus), also included two further aims:
3. To conduct applied, patient-based and methodological research in order to generate the new knowledge required for the practice of evidence-based health care
4. To collaborate with other scientists in the creation of a Graduate Programme to train researchers to perform randomised trials and systematic reviews.
As part of the promotion of EBM Dave did a substantial number of talks – about one a week. He gave so many that his first PowerPoint slide had the date and place left blank, to be filled in as he moved around Europe, and his introduction would go a little like this:
Done: I’ve gotten out of date and retrained in Internal Medicine twice
Do: I run an in-patient General Medicine service (all comers) at a UK District General Hospital:
» 208 admissions last month
» strive to use evidence at the bedside
Don’t: I’ve cancelled my journal subscriptions
In 1996 the BMJ published a lead editorial by Dave, ‘Evidence based medicine: what it is and what it isn’t’. At the time of writing this, Google Scholar lists 10,805 citations for this article, which delineates the practice of EBM as the integration of individual clinical expertise with the best available external clinical evidence from systematic research, together with patients’ values and expectations. At the time, and I don’t think much has changed, we needed to use evidence about five times for every in-patient and twice for every three out-patients. Despite that need, we found less than a third of it. Dave was well aware that to use evidence at the bedside we needed to access it within seconds. As a result, the evidence cart was born.
Strange how simple ideas capture the imagination, but folk would travel from abroad to see what was happening on the ward round. At times it got so hectic they would have to travel in separate lifts as the team moved between floors. The contents of the cart included: an infra-red simultaneous stethoscope with 12 remote receivers to allow junior doctors to share the experience of listening to, for example, an unusual heart sound; a textbook on physical diagnosis and reprints of the evidence-based JAMA Rational Clinical Examination series; a laptop to search PubMed; a notebook computer, projector and pop-out screen to display the results of the evidence searches; and a rapid printer so that the junior doctors, the medical students and, at times, the patients could take information away.
In the month before the cart arrived an audit of 72 clinical cases that needed a search for evidence of their optimal management revealed only 19 (26%) of searches were actually carried out. After its arrival, the evidence cart was used 98 times over a month to search for and use evidence – alongside clinical judgement – to try and improve patient care. An audit of these searches revealed that 81% were for evidence that could affect diagnostic and/or treatment decisions and, more impressively, 90% of these searches were successful in finding useful evidence that was used, in a timely manner, to make decisions with the patient about their care. Further details of the audit revealed that of these searches 52% confirmed diagnostic and/or management decisions, 23% led to changes in existing decisions and 25% led to additional decisions. When the cart was removed from the clinical wards an audit revealed that of 41 cases where a search for evidence should have been carried out, only 5 (12%) were.
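The percentages in the audit above follow directly from the counts quoted. A minimal sketch recomputing them (the counts come from the text; the helper name is my own):

```python
def pct(part, whole):
    """Return part/whole as a percentage, rounded to the nearest whole number."""
    return round(100 * part / whole)

# Before the cart: 19 of 72 needed searches were carried out.
before_cart = pct(19, 72)   # 26%
# After the cart was removed: 5 of 41 needed searches were carried out.
after_removal = pct(5, 41)  # 12%
```
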
A 1995 Lancet paper analysed the treatments given to all 109 patients admitted as emergencies during a one-month period. At the time the consultant teams were split into three firms: A, B and C. As part of firm A, his team were on take for ten 12-hour shifts per month, seeing on average 25 to 30 patients per take. Of the 109 treatments, 82% were classified as evidence-based. Dave and his team noted that the majority of patients were offered (and accepted) evidence-based interventions, and that over half of those patients were receiving treatments supported by randomised controlled trial evidence.
The Evidence Cart
Not everything was plain sailing. I’d say support was split about 50/50 amongst the clinical staff, but amongst the juniors there was more than a fair share of support to make up for the dissent. A 1995 Lancet editorial, with no assigned author, attempted to put ‘Evidence-Based Medicine in its place’, to which Dave, of course, responded.
EBM Journal, 1st edition
In 1995 the EBM journal was launched and quickly became a huge success. It was based on searching the top journals for the articles most likely to change practice and then summarising their content.
Along with the ACP Journal Club it was right for its time, as it tried to overcome the barrier of the ever-increasing number of RCTs published each year. Each article was rated by reviewers (in those days by snail mail) for its relevance to clinical practice and its newsworthiness.
Teaching and mentoring
Despite using his time to use evidence on the wards, develop EBM journals, give talks and a bunch of other stuff (when he left the centre he had to send over 100 emails resigning from various committees and activities), one of Dave’s biggest passions was his commitment to teaching and mentoring. Part of Dave’s teaching philosophy was based around fostering enquiry and learner engagement. As an example, educational prescriptions were given out on post-take ward rounds to all members of the team, including medical students, to be answered later in the week. In fact, those students who weren’t on Dave’s team often sought out members of his team for the latest questions and answers. Doug Badenoch went on to develop the CATmaker software, revolutionary at the time, to facilitate critically appraised answers, which could be added to the cart’s redbook. A 1998 editorial by Dave in the CMAJ pointed out that in terms of teaching critical appraisal there are no quick fixes.
Within about a month of his arrival I first met Dave at one of his teaching sessions – see the YouTube clip: a Pre-Clinical Course in Biostatistics teaching session for medical students.
I have also found Dave’s articles on mentoring incredibly useful – compulsory reading for clinical epidemiologists – and I refer to them many times: becoming a successful clinician investigator.
A couple of noteworthy examples from this article:
“To become a professor of medicine or surgery now you have to be young, impossibly specialised to the point of non-functionality in any clinical reality zone, and skilled either in the treatment of rats and cats or in plagiarising other people’s research through meta-analysis.”
“I don’t believe that academics ever outgrow their need for mentoring. As you become an established investigator, you’ll require gentle confrontation about whether you are becoming a recognised “expert” and taking on the bad habits that inevitably accompany that state.”
Part of this piece forms the material in the excellent clinical trialist rounds series. The last of which, well I think it’s the last, was accepted online in Jan 2015: Clinician-trialist rounds: 27. Sabbaticals. I’m taking a sabbatical! How should I prepare for it? The mentoring part of this series was published in 2011/2012 and covers four articles. [1-4]
Dave also took a pretty dim view of seniors who took juniors to task. In his 2014/15 documented interviews he stated: ‘As was my practice when my mentees were attacked by senior professors, I hauled the author into my office, told him what I thought of his behavior, and threatened to throw him down the stairs if he ever repeated it. He didn’t.’ This goes with his philosophy of promoting evidence over eminence, possibly best exemplified by his Christmas BMJ article with Andy Oxman on HARLOT plc, an amalgamation of the world’s two oldest professions: ‘Tired of being good but poor, the authors have amalgamated the world’s two oldest professions in a new niche company, HARLOT plc, specialising in How to Achieve positive Results without actually Lying to Overcome the Truth.’
In a 2003 editorial in the CMAJ: The arrogance of preventive medicine Dave further aired his views on experts who “refuse to learn from history until they make it themselves, and the price for their arrogance is paid by the innocent.” Further to this, he wrote a personal view on The sins of expertness and a proposal for redemption, underpinning his belief that after 15 years in one area of research you should move on as you start to stifle innovation: “at other times, the expert bias against new ideas is unconscious. The result is the same: new ideas and new investigators are thwarted by experts, and progress toward the truth is slowed.”
One of the highlights of the year was the Teaching EBM workshop; now in its 20th year, it is still one of the high points. To date approximately 1,500 teachers of EBM (my estimate is from about 50 countries) have attended this workshop alone, and at times there have been around 100 attendees on a single course. These original workshops set the tone for the ensuing programme of courses: most of the workshop was conducted in small groups in which participants developed and tested (using role-playing and other pedagogic strategies) their own EBM packages. Plenary sessions presented and discussed general issues in planning, executing and evaluating EBM, and often demonstrated large-group strategies for teaching EBM. Searching and study time was provided each day, and informal social events encouraged free discussion and the establishment of ongoing linkages and working groups.
There have been so many visitors to the centre that it is hard to know where to start. But in those days a few names are worth mentioning, and I am sorry if I have missed anyone out. Sharon Straus came on a three-year fellowship and participated fully in ward rounds, research and writing many articles on EBM. Scott Richardson arrived, duly broke his arm, and went on to write ‘The well-built clinical question: a key to evidence-based decisions’, published in the ACP Journal Club. Together with Brian Haynes and William Rosenberg they published the 1st edition of Evidence-based Medicine: How to Practice and Teach EBM. Rod Jackson, whom I met at the 2nd EBM workshop, lived on Holywell St. for a year whilst on sabbatical.
Original CEBM website 1996
So how will I best remember Dave? The first time I met him in his office we discussed Numbers Needed to Treat for aspirin and preventing death from myocardial infarction. Within two weeks I was invited to my first ever workshop on Teaching EBM, and was duly tasked with populating the centre’s new website. The last time, a few months before his death was an email conversation about biases and propensity scores with a number of other colleagues.
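The Numbers Needed to Treat we discussed is simply the reciprocal of the absolute risk reduction. A minimal sketch of that calculation, using illustrative event rates (not the actual aspirin trial figures):

```python
def number_needed_to_treat(control_event_rate, treated_event_rate):
    """NNT = 1 / ARR, where ARR is the absolute risk reduction."""
    arr = control_event_rate - treated_event_rate
    if arr <= 0:
        raise ValueError("no absolute risk reduction; NNT is undefined")
    return 1 / arr

# Illustrative rates only: 12% mortality in controls vs 10% with treatment
# gives ARR = 0.02, so about 50 patients must be treated to prevent one death.
nnt = number_needed_to_treat(0.12, 0.10)
```
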
I write requesting a brief comment/education about propensity scoring vis a vis the Coronary Drug Project.
When I tell folks the 4 reasons why I’ve been doing RCTs for the past 46 years, one of them is the suspicion I developed as a Medical Resident that highly compliant patients who followed all my advice and took all their pills were destined for better prognoses before they ever fell into my clutches.
This suspicion was greatly strengthened when Paul Canner [attached] looked at mortality within the placebo group in the Coronary Drug Project (ironically, I’d entered patients into this trial when I was a Buffalo House officer).
For purposes of this note, still more striking to me was his finding that this relationship twixt compliance and mortality was maintained
even after he ‘adjusted’ for 40 baseline characteristics.
This latter has made it tough for me to accept propensity scoring as a means of overcoming this sort of bias.
Could you drop me a brief note telling me whether my skepticism is warranted, or why/how I have simply missed the boat somewhere along the way?
Thanks very much for your time.
Cheers Dave Sackett
And the 20 years in between? A lot on biases and anything and everything to do with EBM, and then some more on biases, with a heap of fun added for good measure.
There is much I have missed out – CARE COAD diagnostic project, levels of evidence, OCCAMS for medical students and much more – and there are many folk I have overlooked, for that I apologize. But if you would like to leave a comment, I look forward to your thoughts, and I’ll periodically update this article with any comments and suggestions.
Obituaries that we like and that tell more of Dave’s achievements
- Drummond Rennie in the Guardian
- Richard Smith in the BMJ
- The Telegraph
- Andre Picard in the Globe and Mail
- McMaster tribute to David Sackett
- Clinician-trialist rounds: 10. Mentoring – part 4: attributes of an effective mentor.
- Clinician-trialist rounds: 9. Mentoring – part 3: the structure and function of effective mentoring: advice and protection.
- Clinician-trialist rounds: 8. Mentoring – part 2: the structure and function of effective mentoring linkage, resources, and academic opportunities.
- Clinician-trialist rounds: 7. Mentoring: why every clinician-trialist needs to get mentored.