Rethinking evidence


We are now embroiled in a research methods class. As good grad students, we’re tasked with reflecting on our experiences with research (or, actually, “reflecting on ourselves reflecting”…which is all a bit meta for me). It was a useful exercise before we start trying on some theories and approaches for size over the next year. My instinctive “what does research mean to you” landing spot was evidence-based medicine/practice. As a radiation therapist (and allied health professional), I find EBM infused in our work. We all know the ubiquitous evidence pyramid crowned by the king of evidence, the meta-analysis: the ultimate distillation of all that lovely hard data from those meticulously carried out randomised controlled trials (RCTs). EBM – coupled with the patient’s preferences and the physician’s clinical judgement – will deliver beautiful, neutral, quantitative, value-free answers to all our tricky questions.

Except, of course, it really doesn’t. EBM gives us a set of rules that work sometimes, for a certain group of people. And what do we mean by evidence? What about other, less quantifiable factors? What about qualitative research? How, exactly, do we incorporate patients’ and providers’ viewpoints and experiences? Even as I wrote about the valorisation of EBM, I felt a creeping sense of unease. There was no way I was going to get away with this, in a university full of post-positivists and in a classroom full of educators.

And even a cursory look around reveals some interesting counter-arguments. Firstly, there’s the feminist lens, which challenges the positivistic concept of a truly detached and neutral observer of events. Indeed, what we mean by “observer” is arguably limited to a small group of educated and privileged white men. Empirical epistemology (foundational to EBM) reduces gender and other differences to “bias”, which excludes and perhaps even harms women. Research on women also over-focuses on reproduction-related issues and fails to investigate the gender dimensions of other illnesses (such as HIV/AIDS, heart disease, depression and TB). There is a well-documented male enrollment bias in clinical trials. How much faith would you put in that evidence-based cardiac disease guideline for a 50-something female now?

My old friend phenomenology also has something to say. The argument rests on the way we divide people into mind and body, with science mostly concerned with fixing the body. The so-called subjective features of illness are usually deemed unimportant by clinicians but are vital to the person’s experience of their illness. Treating the patient, not the disease, would actually be “patient-centred care”, and the person’s experiences, stories and ways of seeing the world would all become important parts of the diagnosis. This type of “evidence”, however, has little place in EBM, despite the development of “shared” decision making and other attempts to incorporate patients’ values. The evidence that is given precedence (RCTs and meta-analyses) might not be as reliable as we think; there are a number of biases inherent in both, including (ironically) a lack of evidence of efficacy, and a limited usefulness for individual (not aggregated) patients. Additionally, methods that might pick up phenomenological factors (qualitative ones, for example) have no place in the EBM hierarchy. I love the recent paper by Eakin, where she states:

“it is indeed transgressive to practice qualitative research within the medical and health sciences – a land in which the randomised controlled trial (RCT) is considered the apex of the methodological food chain, and where evidence based practice (EBP), a creed anchored in quantitative measurement and epidemiological reasoning, has been widely embraced across the clinical professions” (p. 107).

The readings and discussions we’ve had in class have solidified my feeling that EBM is a useful tool, but also that it “does not increase objectivity, but rather obscures the subjective elements that inescapably enter all forms of human inquiry” (Goldenberg, 2006, p. 2631). As health care professionals, when we claim that we need to make decisions based on “evidence”, let’s be careful about what evidence we mean and why. Evidence-based and best practice are not always the same thing.


Informed by these readings (if you want one, pick Goldenberg!)

Cohen, A. M., Stavri, P. Z., & Hersh, W. R. (2004). A categorization and analysis of the criticisms of Evidence-Based Medicine. International Journal of Medical Informatics, 73, 35–43. http://doi.org/10.1016/j.ijmedinf.2003.11.002

Eakin, J. M. (2016). Educating critical qualitative health researchers in the land of the randomised clinical trial. Qualitative Inquiry, 22(2), 107–118. http://doi.org/10.1177/1077800415617207

Goldenberg, M. J. (2006). On evidence and evidence-based medicine: Lessons from the philosophy of science. Social Science & Medicine, 62, 2621–2632. http://doi.org/10.1016/j.socscimed.2005.11.031

Greenhalgh, T. (1999). Narrative based medicine: narrative based medicine in an evidence based world. BMJ (Clinical Research Ed.), 318(7179), 323–325. http://doi.org/10.1136/bmj.318.7179.323

Rogers, W. (2004). Evidence-based medicine and women: do the principles and practice of EBM further women’s health? Bioethics, 18(1), 50–71. http://doi.org/10.1111/j.1467-8519.2004.00378.x
