
Director of the new MSc in EBHC Teaching and Education, David Nunan, reflects on evidence-based medicine (EBM) in relation to the COVID-19 pandemic and proposes what the evolution of EBM might look like.


A mission statement for the next chapter in the evolution of EBM

When Guyatt and Sackett coined the term "Evidence-Based Medicine" in 1991, it was against a background of a piecemeal approach to the use of evidentiary rules to support decision-making in medical practice. It wasn't that research evidence was not being used in clinical decisions - it was that it was being used selectively and with bias, often after the fact.

"EBM" was built on three fundamental epistemological pillars: first, that not all evidence is created equal - the practice of medicine should be based on the best available evidence for a given problem; second, that only by evaluating the totality of the evidence, rather than selecting evidence that favours a particular claim, can we arrive at the "truth"; third, that evidence is necessary but not sufficient for effective decision-making - human values, principles and priorities within a given environment and context always interplay with the evidence.

A fundamental aim of introducing the term EBM was to enact a paradigm shift in the way medicine was being taught, and to focus on the next generation. So what of the next generation now? The COVID-19 pandemic has undoubtedly been the paradigm shift to end all others. It has produced the best and the worst in "evidence-based" decision-making. Examples of 'the best' include the RECOVERY, SOLIDARITY and PRINCIPLE trials - rapid, high-quality evidence acquisition overcoming the usual red tape that comes with clinical trials. Acting on evidence not fit for purpose, and failing to acknowledge uncertainty, are candidates for 'the worst' - Hydroxychloroquine, Ivermectin and other drug cocktails exemplify the former; the roll-out of non-drug interventions without robust evaluation, the latter.

Alongside "You're on mute", "Follow the science" is the phrase I most associate with the pandemic. What exactly does it mean? To some, it means that science is essential to making good and rational decisions. I'd agree, but would go a step further. Following the science also means impartially, fairly and transparently judging its merits and weaknesses. The methodology of any given scientific study is where those merits and weaknesses are found.

A fundamental error during the pandemic has been a lack of transparency in the decision-making process. I defy anyone to find examples of good practice: transparent records of the processes by which evidence was gathered; what was sought (and what was not), and why; if experts were used, who, how and why; clear justifications for interpretations of the evidence from whoever provides them; how conflicting views and interpretations of evidence were handled; and processes by which the impact of decisions would be assessed and lessons learnt.

As an educator of EBM, I believe we occupy a rare position during the pandemic. Like many, we are subjected to interventions and decisions enforced upon our practice, each informed to varying degrees by the available evidence ("follow the science"). Unlike many others, our practice happens to be teaching the skills for better understanding the very evidence underpinning (or not) these interventions and decisions.

The pandemic has revealed the need for a re-boot of EBM and how we teach it. We need an EBM 2.0. Here’s what I think this looks like.

  • Health care systems are not what they once were. Activities previously performed in silos of clinical specialists are now carried out in an interdisciplinary model and across a growing spectrum of healthcare professionals. The first thing EBM 2.0 needs is a change of name to encompass this broader remit. Evidence-Based Health Care (EBHC) does that nicely.

  • "Medicine" isn't the only name change needed. Recall the third pillar of EBM (sorry, EBHC) - evidence alone is not enough. Good decision-making means being informed by the best available evidence, not slavish obedience to it. It means evidence forms part of the discussion, but decisions can go against the evidence. As others have argued, though too few have listened, we are in the era of "evidence-informed" health care (EIHC).

  • The evidence ecosystem has changed. The days of the evidence cart are long behind us. Preprints have led the way in informing (and misinforming) during the pandemic. "Peer-reviewed" has also come under scrutiny, revisiting age-old arguments both for and against. This is also the era of Open Science, with transparency of methods and sharing of data taking centre stage. Registered reports offer a solution to some of the problems in academic publishing. EIHC should lead the way in embracing and supporting developments in this space.

  • The pandemic tested the evidence ecosystem like never before. How we produce evidence was pushed into overdrive. Systematic reviews and practice guidelines went from dead or dying documents to "living" ones, and reviews had to become rapid by default (some arguing this always should have been the case). The procedural burdens of randomising patients to truly test intervention effectiveness using new trial designs were overcome in many, but not all, health care systems. The combination of open science and big data brought us award-winning projects answering key COVID-19 uncertainties. New methods mean new skills in how to conduct, appraise and report such studies, and a better understanding of their role and application in EIHC.

  • On the flip side, publishing of flawed research peaked with Ivermectin (read part of the story here), and raised questions about methods for conducting systematic reviews and meta-analyses. We don’t currently run data reliability checks as part of the systematic review process. EBM 2.0 needs to consider if we should, and if so, how?

  • The pandemic gave rise to the world's greatest infodemic. Misinformation is rife. But policing and censorship are not the answer. The boundary between debatable unknowns and uncertainty on the one hand, and misinformation on the other, can become very blurred. What is considered misinformation today becomes tomorrow's "evidence-based". The answer is EIHC - confronting, discussing, debating and refuting ideas based on its core principles: transparent judgements of evidence in context (though social media organisations should be regulated to prevent them from measuring and capturing attention).

  • Hand in hand with tackling the infodemic is the communication of evidence. This again showed the best and the worst. The worst, for me, was the failure to communicate uncertainty. Here I am not talking about merely acknowledging uncertainty. The need to "express uncertainty" was stressed by many, but far too few gave examples of exactly how. The best came in the form of efforts to empirically demonstrate that communicating uncertainty doesn't undermine trustworthiness. The EIHC community must build on this and do better at demonstrating how to express uncertainty informatively and accessibly.

  • Qualitative research answers questions about “Why?”. Health care is full of such questions - but EBM has arguably focused too much on the “What?”. EBM 2.0 needs to consider the greater role of qualitative research to answer health care questions, better understand its diverse methods (we don’t just have “quantitative” research) and how to better incorporate these data into EIHC.

  • EBM 2.0 should partner with other disciplines (such as epidemiology and economics) to better understand the role of causal inference methods outside of randomization and cognitive and decision sciences to generate a more coherent theory of healthcare decision making. However, the job of these fields is to make themselves, and their science, accessible to EIHC - patients and the public don’t afford us time to get lost in epistemological rabbit-holes.

  • The pandemic has highlighted the importance of health science and research that is efficiently produced, evaluated and acted upon. Hand-in-hand with this comes the ability to ask, find, appraise, and apply evidence to support effective decisions. These abilities are essential skills – skills that need to be taught. EBM 1.0 focused on the first three steps (ask, find, appraise). EBM 2.0 must turn the focus to steps four (apply) and five (assess), as well as step zero (recognise our uncertainties).

About the author:

David Nunan - Director of the PGCert in Teaching Evidence-Based Health Care and the new MSc in EBHC Teaching and Education.