Lies, Damn Lies and Doctors who are bad at maths

A wonderful article about the misunderstanding of statistics not just by patients but by huge numbers of practicing doctors.

I was particularly interested in the explanation of survival rates as opposed to mortality rates, which shows that, despite common belief to the contrary, about the same percentage of men die of prostate cancer in the UK as in the US.
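A toy sketch of how that can happen (lead-time bias): the ages and outcome below are invented purely for illustration, not taken from the article. Screening moves the date of diagnosis earlier without moving the date of death, so the five-year survival statistic improves while mortality stays exactly the same.

```python
# Toy illustration of lead-time bias: the same man dies at the same age
# in both countries; only the date of diagnosis moves. Mortality is
# unchanged, but the five-year survival statistic looks far better
# where screening catches the cancer earlier.

def survives_five_years(diagnosis_age, death_age):
    """Does the patient live at least 5 years after diagnosis?"""
    return death_age - diagnosis_age >= 5

death_age = 70
print(survives_five_years(60, death_age))  # screened, diagnosed at 60: True
print(survives_five_years(67, death_age))  # symptomatic, diagnosed at 67: False
# Same death at age 70 either way, so the mortality rate is identical;
# only the survival-rate statistic differs between the two countries.
```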


11 Replies

  • Statistics must be the most misunderstood branch of mathematics in daily use by most people, but I think that may be because it has been subjected to the worst political, financial and everyday abuse by the media.

    If any article gives simple percentages to back up a point of view without going into details about how they were arrived at, then you can dismiss it as unproven or even made up. I have used this as a guide over the last decade or so and it works fine. The problem is that it takes a long time to dig into the trail of information leading to a scientific or mathematical paper, and sometimes it costs a fortune to get access to that paper. Nothing is easy, even in the age of the Internet.

  • We've a guide to making sense of statistics. It provides the questions to ask ("If a statistic is the answer, what was the question?") and identifies the pitfalls to avoid to help us get behind news stories that use statistics:

  • Misuse of statistics is only half of the problem. Framing the research question to deliver desirable results is another source of distortion: selecting a sample population that excludes certain groups, having short follow-up times, 'cleaning the data' of outliers...

    Doctors don't have time to examine actual trial data; even if it is all disclosed, they rarely have time to read beyond the manufacturer's marketing piece, which positions the good data to the fore.

  • This 'wonderful' article is entertaining, provocative and 'good copy' but misleading. If someone purports to have the appropriate data to find the prior probability (1%), the sensitivity (90%) and the specificity (91%, and thus a false-alarm rate of 9%) of a test result, then they should simply count the number of patients with the diagnosis among those with the test result in that population (the 'predictiveness'). There is no need to apply Bayes' theorem to 'calculate' that it is 10%. The trouble is that sensitivity and specificity can look impressive (like relative risk) and, as the prevalence depends on the population of patients you choose to measure it in, it is easy to use Bayes' theorem to calculate any level of 'predictiveness' to surprise or impress your audience.
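    For anyone wanting to check the arithmetic behind the figures quoted in that reply (1% prevalence, 90% sensitivity, 9% false-alarm rate), here is a minimal sketch of the Bayes calculation; the variable names are my own.

```python
# Bayes' theorem with the figures quoted above:
# prevalence 1%, sensitivity 90%, specificity 91% (false-alarm rate 9%).
prevalence = 0.01
sensitivity = 0.90
false_alarm = 0.09  # 1 - specificity

# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
true_positives = sensitivity * prevalence          # 0.009 of the population
false_positives = false_alarm * (1 - prevalence)   # 0.0891 of the population
ppv = true_positives / (true_positives + false_positives)

print(round(ppv, 3))  # 0.092, i.e. roughly the '10%' quoted in the article
```

    In natural frequencies: of 10,000 people tested, about 90 true positives and about 891 false positives, so only about 90 of the 981 positive results actually have the disease.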

  • @Drcommonsense I would say most patients find the concept of probability easier to understand than "predictiveness" when having results explained. Nothing misleading about saying "your test result means there is a 10% chance you have the disease". I thought it was a nice article!

  • I wrote an article in the Guardian a few years ago, on a similar theme. The Government were trying to justify their healthcare reforms on some very dubious statistical claims.

  • Good example of a spelling mistake spreading like a virus. If you search for 'prostrate cancer' there are currently 88 links. If you look up 'prostate cancer' the results list completely different articles. People could miss valuable information here. Depressing.

  • Or in my case, I assume it was the autocorrect on my tablet doing its thing, as normal.

    Thankfully, if you search on Google, it corrects you anyway.

    However, I just noticed that my post was edited by someone called Joel - who is that? Maybe it wasn't my autocorrect after all.

  • Hi Joss

    The BBC link stopped working so the HealthUnlocked guys fixed it. Thanks! Emily

  • Thanks Emily - Can you bung them a message asking them to put a note at the bottom of the post, like "Edit: fixed broken link"?

    It's always useful for people to know why something was changed. :)

  • Will do, thanks Joss.
