
Healthy Evidence


How can we educate journalists to consider positive predictive values when reporting new diagnostic tests? A recent case in point...

4 Replies

bbc.co.uk/news/health-28198510

nhs.uk/news/2014/07July/Pag...

dcscience.net/?p=6473

Written by robdavies
LPhipps

Hello

I know that the research team and the King's press office went to great lengths to try to avoid this study being misreported as a potential population screening tool. They arranged a Science Media Centre background briefing for journalists to try to explain more about the context of the study and the caveats - mainly that it isn't being designed to screen the population but to aid the development of clinical trials in particular groups of patients.

We at ARUK did link to an illustrated version of DC's blog on this topic in our blog for supporters about the story. I think the point was picked up by journalists following the SMC briefing more than it has been on similar stories in the past, and I did a number of radio interviews where the presenter said, 'but the risk of misdiagnosis from this test is still high, right?' That suggests the point wasn't lost on everyone. Chris Smyth at the Times carried a quote: 'He said that an early warning test would be very important in the future, but added: "If you do a test on 1,000 people of whom 100 are truly going to [have Alzheimer's] you'll end up with 90 correct diagnoses and 90 incorrect diagnoses. So we do have to be very, very careful in the way that we think about using this test."'
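[Editor's note: the arithmetic in that quote is the standard positive-predictive-value calculation. A minimal Python sketch, assuming the figures implied by the quote (90% sensitivity, 90% specificity, 100 true future cases in 1,000 people tested) rather than anything published by the study itself:]

```python
# Sketch of the arithmetic in the quote above: a hypothetical test with
# 90% sensitivity and 90% specificity, applied to 1,000 people of whom
# 100 will truly develop the condition.
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: P(truly ill | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

population, cases = 1000, 100
sens = spec = 0.90

true_positives = round(sens * cases)                        # 90 correct diagnoses
false_positives = round((1 - spec) * (population - cases))  # 90 incorrect diagnoses
print(true_positives, false_positives)
print(ppv(sens, spec, cases / population))  # ~0.5: a positive result is a coin flip
```

So of the 180 people who test positive, only half actually go on to develop the condition, exactly as the quote says.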

robdavies in reply to LPhipps

Thanks LPhipps. That's very interesting to hear; I wasn't aware such efforts were underway, and from your description they may already be changing mindsets in certain pockets of the media. Great stuff.

Emily_Jesper (Partner, Sense About Science)

Screening test figures are explained on page 11 of our guide, Making Sense of Screening: senseaboutscience.org/data/...

Gez_Blair

When I first started writing about EBM, it took me quite a while to grasp that a test with a 90% accuracy rate could be worse than useless, depending on the prevalence of the condition being tested for.

It's just such a counter-intuitive concept. You see the figure 90% and your brain automatically thinks, "90%! That must be good."

So I understand why journos, the public and, in some cases, clinicians struggle with the concept.
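[Editor's note: Gez's point can be made concrete with the same Bayes arithmetic. A sketch, assuming a hypothetical test fixed at 90% sensitivity and 90% specificity while the prevalence of the condition falls:]

```python
# Holding the test fixed at 90% sensitivity and 90% specificity, the chance
# that a positive result is a true positive collapses as the condition
# becomes rarer in the tested population.
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: P(truly ill | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

for prevalence in (0.5, 0.1, 0.01, 0.001):
    print(f"prevalence {prevalence:7.1%} -> PPV {ppv(0.9, 0.9, prevalence):5.1%}")
```

At 50% prevalence a positive result is right 90% of the time; at 0.1% prevalence it is right less than 1% of the time. Same test, useless as a population screen.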
