What we know (and what we don't) about the report claiming that 1,200 people die in Spain every year from pseudotherapies

There are times when the need to measure the negative impact of something coexists with the difficulty of measuring it rigorously. This is, without a doubt, the case with pseudosciences. The best example is the "First report on deaths due to pseudotherapies in Spain," presented by the Association to Protect the Patient from Pseudoscientific Therapies (APETP). Hence the figures of between 1,210 and 1,460 deaths from pseudotherapies that we are seeing in the media.

In a world as dark, misleading, and murky as that of pseudosciences, the report provides a highly accessible summary of the main health problems caused by pseudotherapies. It also combs the general press and the scientific literature to compile a litany of cases making it clear that the problem of pseudosciences goes beyond the anecdotal.

However, the report also has serious problems, above all in its estimate of deaths. And it is not that the authors are unaware of this. Quite the contrary: they are fully aware of the limitations of the work (and say so in writing, repeatedly). It seems that the search for a number to put in the headlines has led them to commit to figures that may be plausible, but whose accuracy we simply do not know.

A necessary report ...

Reading the report, the authors make it clear that "this report is intended to be a first step towards a more in-depth study" and that it simply tries to fill a gap the public authorities do not want to fill: the real impact pseudotherapies are having in the country.

The idea behind the report is that we need to quantify that impact before we can talk seriously about fighting pseudosciences. In this sense, the report develops an interesting taxonomy of adverse effects: three closely related to health (abandonment of treatment, delay of treatment, and direct harm from the effects of the pseudotherapy itself) and two not directly related to health (economic deception and the creation of false hopes).

The report analyzes the first three causes of adverse effects and states that "it can be concluded that the current number of deaths from these practices in Spain, very possibly, is higher than one thousand." Specifically, between 550 and 800 deaths due to abandonment or delay of therapies, and another 660 due to direct harm: between 1,210 and 1,460 deaths in total.

We must acknowledge the authors' work and, above all, their intellectual honesty in repeatedly pointing out in the report that the data rest on highly problematic estimates. It must also be recognized that (since the estimates focus on just two health problems) the figures are reasonable, even very conservative. Indeed, the report is the best that can be done with the information available; but, in all fairness, we must treat the range with caution and skepticism.

... but one we should treat with skepticism

In my view, it is a mistake to read the report as an estimate of deaths due to pseudotherapies in the strict sense. It is rather a wake-up call about the lack of information on the matter. However, the chosen format and the way it has been communicated may give rise to misunderstandings: I do not think anything can be "concluded" from it. The lack of prior research greatly limits the scope of the report, despite the authors' efforts.

The figure of 550 cancer deaths comes from an extrapolation of a 2003 Norwegian study with a sample of just over 500 patients. That is not in itself a bad decision, because, if anything, improvements in treatments over the past 15 years should have widened the differences the study found. Even so, transplanting those figures into the current Spanish context is not straightforward.

Something similar happens with mortality from direct harm. The report uses a Canadian study with a similar sample to estimate 660 stroke deaths. And despite acknowledging that causality is not proven, that Canada is one of the countries where this type of practice is most widespread, and that the Spanish figures should reasonably be lower than those used in the calculation, the authors take the number as an estimate for the overall count.

In general, this problem recurs throughout the estimates: the figures are taken at face value without any solid reason to do so and without correcting for the errors those decisions introduce. The frustrating part is that, as I say, these are meaningful estimates (estimates that probably fall short), but without solid arguments to support them we cannot use them as a reference.

I do not want to give the impression that the report lacks value: the theoretical, methodological, and organizational difficulties of producing a work of this scale are enormous, and it is commendable that civil society leads projects of this type when neither universities nor public administrations decide to do so.

These data are the best we can have at the moment, that much is true. And it is good to have them at hand and to gain a global perspective on the matter (however imprecise it may be). Still, there is much to improve if the report is to become a useful tool that goes beyond a media campaign.
