Natural Health News — Two recent comments in the scientific press have highlighted the unreliability of so-called ‘evidence-based medicine’.
In one article in Perspectives on Psychological Science, a journal published by the Association for Psychological Science, the authors say that 21st century demands for quantity over quality have radically changed the way researchers ‘do’ science – for the worse.
Focusing on trends in psychological research, the authors, Marco Bertamini of the University of Liverpool and Marcus Munafò of the University of Bristol, have called the trend “bite-size science” – papers based on one or a few studies with small sample sizes and little or no context or reference to previous work.
They say the demand for quick turnover often means smaller sample sizes. And, in their opinion, the smaller the experimental sample, the greater the statistical deviation – that is, the less reliable the findings.
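The point about sample size and statistical deviation can be illustrated with a small simulation. The sketch below (a hypothetical illustration, not drawn from the paper; the population mean of 100 and standard deviation of 15 are assumed values) repeatedly runs a “study” at several sample sizes and measures how much the observed study means scatter around the truth:

```python
import random
import statistics

random.seed(42)

def sample_mean_spread(sample_size, trials=2000):
    """Simulate `trials` studies, each drawing `sample_size` values from a
    population with mean 100 and standard deviation 15, and return the
    spread (standard deviation) of the observed study means."""
    means = []
    for _ in range(trials):
        sample = [random.gauss(100, 15) for _ in range(sample_size)]
        means.append(statistics.mean(sample))
    return statistics.stdev(means)

for n in (10, 40, 160):
    print(f"sample size {n:>3}: spread of study means ≈ {sample_mean_spread(n):.2f}")
```

The spread shrinks roughly in proportion to one over the square root of the sample size, so quadrupling the sample only halves the error – which is why the many small studies that “bite-size science” encourages are individually so much noisier than one large one.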
Playing on ignorance
Likewise, strict word limits, increasingly imposed by publications, mean cutting the details about previous research. Because of this, the published results can sound not only surprising but novel. The authors note, ironically: “A bit of ignorance helps in discovering ‘new’ things.”
These surprising, “novel” results are exactly what editors find exciting and newsworthy and what even the best journals seek to publish, they say. The mainstream media pick up the “hot” stories, and their journalists rarely do the background checks to confirm whether a result is new or not. This is how misconceptions and wrong results about medicine and health proliferate.
“We’re not against concision,” says Bertamini. “But there are real risks in this trend toward shorter papers. The main risk is the increased rates of false alarms that are likely to be associated with papers based on less data.”
In the age of Twitter, Facebook and dwindling attention spans, proponents argue that shorter science papers are easier to read.
Perhaps, say the authors, but more articles mean more to keep up with, more reviewing and editing – but not less work.
Proponents also say authors gain increased recognition and influence from publishing more papers. Perhaps, say Bertamini and Munafò, again. But two short papers may not have twice the scientific value of one longer paper. Indeed, because of inattention to detail, they might add up to less.
“Scientists are skeptics by training,” says Bertamini. But the trend toward bite-size science leaves no time or space for that crucial caution. And that, argue the authors, is antithetical to good science.
Missing data
Elsewhere experts in the British Medical Journal have called for an end to “incomplete data disclosure” and more robust regulation of information in clinical trials. The British Medical Journal website has released several papers investigating the issues of unpublished evidence – that is the data that gets left out of papers published in medical journals – to support their argument.
The editorial, written by Dr Richard Lehman of Oxford University and the BMJ’s clinical epidemiology editor, Dr Elizabeth Loder, notes:
“Clinical medicine involves making decisions under uncertainty. Clinical research aims to reduce this uncertainty, usually by performing experiments on groups of people who consent to run the risks of such trials in the belief that the resulting knowledge will benefit others.”
However, what Lehman and Loder call a “culture of haphazard publication” – where inconvenient or unresolvable details are left out – means that decisions are not made using the best evidence.
They add: “Most clinicians assume that the complex regulatory systems that govern human research ensure that this knowledge is relevant, reliable and properly disseminated. It generally comes as a shock to clinicians, and certainly to the public, to learn that this is far from the case.”
Unlikely to be beneficial?
Shocking, yes – unless you have been paying attention all along. For instance, the BMJ’s Clinical Evidence group, which reviews the clinical effectiveness of medical procedures, finds that 66% of all treatments fall into the categories ‘Trade off between benefits and harms’, ‘Unlikely to be beneficial’, ‘Likely to be ineffective or harmful’ or ‘Unknown effectiveness’.
What is interesting about these critiques of so-called evidence-based medicine, from a natural health perspective, is the hypocrisy they expose. Many opponents of complementary and alternative medicine criticise the research evidence for these therapies as relying on too small a study population, or too short an observation period, or for leaving out crucial data.
The notion of evidence-based medicine is increasingly coming under fire – not because medicine shouldn’t be based on evidence, but because of the nature of the evidence we rely on. The gold standard of the randomised double-blind placebo-controlled trial (RCT) misleads in many ways, not least because its design, which takes large groups of people and looks for what they have in common, furthers the belief that the body is a machine and health is simply a matter of finding a magic bullet that works for all. Very often it is the differences – the very inconvenient data that gets left out of trials – that hold the key to understanding health and illness.
And the question has to be asked: if the RCT is such an outstanding medium for understanding health and directing our healthcare choices, how is it that so many of the treatments it validates are either ineffective or downright harmful?
The truth is that inaccurate data lets us all down – those who opt for conventional medicine and those who opt for natural medicine alike. It is particularly galling, however, when well-funded but shoddy conventional research into pharmaceutical and surgical ‘solutions’ to health is held up as an inalienable truth, at the expense of a broader understanding of health that recognises the individual needs of patients and focuses on prevention instead of ineffective ‘cures’.