Trying to make decisions about your health based on what you've seen in the media can be a very tricky business.
Contradictory claims often appear in the same paper on different days of the week.
If you want some quick tips on how to better understand health news, read Dr Alicia White's 8-point guide to spotting a red herring among the health headlines: nhs.uk/news/Pages/Howtoread...
Let us know if there are any more points that should be included.
Also have a read of David Carroll's blog, 'Health in the media: the good, the bad and the ugly', where he discusses the real evidence behind the headlines: students4bestevidence.net/c...
That's a great resource - I love the "The Good, The Bad & The Ugly" take on it, as it's actually celebrating the genuinely good science/health journalism. I hope this becomes required reading!
I like the comment: "a quick check for a questionable article would be 'Am I reading the Daily Mail'..." - fair point. Although, when I was scouring the papers' health articles, the DM came out on top for awfulness, they do have a reasonable number of good stories too.
It was The Times that I found to be the most consistently sceptical and accurate in its reporting.
Their worst stories, I found, were usually of the "celebrity tries new thing - reporter tries it, gives potentially dangerous product/company a plug without discussing risks" variety.
"Often contradictory claims can appear in the same paper on different days of the week."
But more often than not, the same nonsense is simply regurgitated by other papers (maybe the smaller ones looking for copy to fill their pages?). This may simply reinforce the public's perception that it's valid and settled - after all, they've read the same thing in several papers...
Yes "Me too" stories abound in the papers unfortunately. Presumably at the request of editors saying "Why didn't we get this story?", or as you say, simply to fill space quickly and easily. Time and again the "biggest" (ie. most widespread) news stories we find each day are largely cut and paste from a press release. This may partly explain why so many of the errors are repeated across the media.
I think Sense About Science believe that celebrities (or coverage involving celebs) are getting better - see their 2013 round-up of celebs and science: senseaboutscience.org/resou...
My sense is that celebs - having been built up and knocked down - are yesterday's news. Now it appears the trend is for more "real stories" of either ordinary folk's experiences, or journalists themselves trying treatments or recounting experiences (this coverage by The Mirror of "vampire facials" is a case in point: mirror.co.uk/lifestyle/vamp...).
Broadly speaking, I think there are two types of health news. Alicia's guide focuses mainly on the sort that comes from genuine research, which may be promising and of high quality but can be over-egged - sometimes by the press release itself, often by newspapers wishing to make it seem as straightforward and eye-catching as possible, particularly in their headlines.
Then there are the lifestyle health/beauty-type pieces, originating perhaps from a quacky local business or from a company selling or producing some kind of health/beauty product, diet regime or whatever. The original press releases often contain misleading claims and celebrity name-drops, and may then be regurgitated unquestioningly, giving them a free and misleading ad outside the remit of the ASA. These kinds of stories tend to appear not in the news section but in lifestyle sections, where I think reporters and editors may be less vigilant against nonsense, or even sometimes encouraging of it, since the really daft stuff is more likely to get noticed and commented upon. For example, anti-ageing snail slime face gel in the Daily Mail.
I think the ones Noodlemaz mentions ("celebrity tries new thing - reporter tries it, gives potentially dangerous product/company a plug without discussing risks") fall into the latter category. In such cases, where you might not get far by complaining to the paper, and where linking to it may only encourage them, there is something you can do - find the website of the company being pimped and check it for misleading claims, which you could then report to the ASA.
What annoys me is the whole "it's official" meme. Just because one person with a white coat and some letters after their name offers a hypothesis doesn't make it official.
Researchers' opinions are often taken as proven fact in a way that would never occur in, say, social policy or economics.
From what I can tell, the only thing in EBM that deserves "it's official" status is that smoking is bad for you.
It could be that journalists - who often have a crappy humanities degree and no training in science - are intimidated by scientists and give them more respect than they deserve. (Disclaimer - I am a journalist with a crappy humanities degree.)
We agree that evidence-based medicine (EBM) isn't one scientist's opinion. But EBM has a system for assessing the quality of research before it's published: peer review. Peer review means that other scientific experts in the field check research papers for validity, significance and originality. Editors of scientific journals draw on a large pool of suitable experts to scrutinise papers before deciding whether to publish them.
Many of the research claims you read in newspapers are not published in a peer-reviewed journal. Unpublished research is no help to anyone. Scientists can't repeat or use it, and as a society we can't base decisions about public safety - or our family's health, for example - on work that has a high chance of being flawed.
So we'd encourage everyone to ask, 'Is it peer reviewed?' If not, why not? We know that peer review can break down, but just as a washing machine has a quality kite-mark, so peer review is a kind of quality mark for science. It tells you that the research has been conducted and presented to a standard that other scientists accept.
Absolutely, and within EBM it is important to state how good the evidence is - which is something the Cochrane Collaboration tries to do with all its systematic reviews of health evidence.
There is a genuine problem for most people with most articles, in that it is often difficult or impossible to check any of the points on your How To list. Even with some of the slightly less sensationalist reporting on the BBC website, there are not always links to the original research (and much of that needs a subscription to read anyway), there is never a reference to the original press release or where it came from, and, for brevity's sake, any quotes are taken from source material rather than being the result of fresh reporting.
This last point is often the most important. Press releases on health matters are often not the best-worded or best-constructed bits of information you will encounter. PR people can be equally or more sensationalist than newspapers, and scientists and researchers are not always aware that many people (including journalists) read the bulk of an article while clinging to the apparent claims of the headline (assuming they get past the headline at all).
So, if you write "New Therapy To Cure All Cancers" as a headline and then later in the press release add "So, in 20 years time we will see the beginning of a revolution on cancer treatment thanks to this exciting research..." do not be surprised if any resulting press articles will revolve around your headline and leave out that all important rider further down the page.
Even if the press do add the rider, the public may not read that far.
It is vital that the scientific establishment takes control of the release of this information in the same way a perfume company would. When you read a claim in the Daily Mail about a new miracle make-up, it is often word for word what the company wanted to get across - how many times does that happen with medical research?
Keeping control of the information means that you can answer direct questions, avoid ambiguities and get the reporter on side. In addition, readers (where this thread started) get a feel that there is good science behind the story, even if the detail of that science and the statistics are beyond them, and they are in a better position to make a judgement.
You cannot stop the publication of bad science, but you can drown it out by making the good science more palatable, more exciting and more prevalent, while ensuring that the "potential" benefits are not allowed to overshadow the more likely reality.
Well, we've done a rather tongue-in-cheek version of that for our Christmas special (nhs.uk/news/2013/12December...), but it's a good point and we should do a sensible version. Who do you think we should write it for? Researchers? PRs?
Heh! I think it could work on several levels - if you really wanted to go for it.
Starting at the beginning, maybe a guide to the difference between a report for The Lancet and something for the average person in the street. Science stories have a starting point, and that is the researchers. If they get their initial communication wrong, then everything from that point on can go astray. This can even apply to the official paper - I can't count the number of times I have heard one learned fellow say of another's paper, "I think what they are concluding is...". That is not helpful!
Small pointers like the pitfalls of "assumed knowledge" and hugely long sentences where the subject and object get confused and, most importantly, starting the very first sentence with the conclusion, not the ambition. Oh, and not being overly optimistic about the future of your wonderful discovery. Be pragmatic.
Next level would be the PR people. Some of the same rules apply here, but PR people rely on the short phrase and the catchy headline to get their story across. They will also often favour the more exciting possibility over the most likely one. It is the weather forecaster's mistake - "40% chance of a hurricane!" sounds a lot better than "60% chance of a mild day".
So, what should go into the press release could be addressed, as well as guidance on how to get some sort of rational sense out of the researcher in the first place.
Lastly would be a guide for the media - how to get the best, most exciting story (with the potential of follow-ups) and yet still be accurate.
Basically there are three main areas where the story can go wrong - intentionally or unintentionally - even before you get to the point where the public completely miss the point!
Hi Emily - that is a useful resource that should be sent out to all health and science journalists monthly (just in case they forget).
However, never let it be said that a journalist or editor will let sensible guidelines get in the way of telling a story their way.
But just as the public can be manipulated (sorry, "unintentionally misled") by how a story is written up, so a journalist can be, umm, influenced by how the source material is presented. So getting the dissemination of the event done right in the first place increases the chance of your very good guidelines being followed.
Emily, just reading that PDF, where it talks about health risks: this is an area that not only confuses the public, it confuses journalists too - it is amazing how poorly percentages are understood. Many journalists do not understand absolute risk as opposed to relative risk, and so on. And where you see journalists trying to say the right thing, they often make it even more confusing.
Maybe these guidelines should suggest a nice, non-mathematical but clear way of stating these figures. "Cupcakes Increase Likelihood of Cancer by ONE PERCENT" is the headline. The lead line should be "People who eat cupcakes regularly increase their likelihood of cancer from one percent to two percent." And then make sure you never say "double" anywhere in the press release.
(If that word is in the press release or the research paper summary, you can guarantee it will be in the news item.)
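To make that concrete, here is a minimal worked sketch of the arithmetic (amsmath LaTeX; the 1% and 2% figures are just the hypothetical cupcake numbers from above, and the symbols p_0 and p_1 are labels introduced purely for illustration):

```latex
% Hypothetical cupcake figures from the example above (not from any real study)
\begin{align*}
  p_0       &= 1\%                          && \text{baseline (absolute) risk} \\
  p_1       &= 2\%                          && \text{risk for regular cupcake eaters} \\
  p_1 - p_0 &= 1\ \text{percentage point}   && \text{absolute risk increase} \\
  p_1 / p_0 &= 2                            && \text{relative risk -- the ``doubled'' headline}
\end{align*}
```

Both framings describe the same data; only the absolute version tells readers how much their own risk actually changes.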
It's almost hopeless: if journalists don't understand percentage points, how are they ever going to understand statistics at the level required to make sense of the results, or even the conclusions? Or the science.
That's probably not fair and tars all journos with the same brush - we do know there are good science journos out there who do know their stuff - it's the ones who don't that need to be educated or dissuaded from writing about things outside their area of expertise (whatever that might be). They're the ones whose misinformation and damage we all spend far too much time trying to correct and undo.
This was a food writer and a chef who mentioned in passing:
"Watercress also contains significant levels of glucosinolate compounds and many studies now suggest that these have anti-cancer effects. Eating these compounds appears to help inhibit breast, lung, colon, and prostate cancers."
How can journos and newspapers be educated so that they know to avoid dangerous nonsense like this? Or is that a forlorn hope?
To be honest, it is not so much about the journalists or whether they are versed in the maths of probable outcomes. The point is that information needs to be accessible to any who would benefit from it and it is pointless putting out something that is potentially misleading and then saying "well, you should have listened better in school."
This is why advertisers always win - they start from the premise that they need to sell to any and all who may be interested, and so need to work at all levels without being judgemental.
I often think there is a fine line between "dumbing down" and "getting off a high horse".
As for simple misinformation, well, you will never stop it. But if you can make the good stuff louder, easier and more interesting, then the bad stuff stands a better chance of getting drowned out.
"The point is that information needs to be accessible to any who would benefit from it and it is pointless putting out something that is potentially misleading and then saying "well, you should have listened better in school.""
I'm not sure I understand: information should certainly be accessible, but if journos can't interpret studies properly, then what they write is unlikely to be correct.
But I agree entirely with what you say about making the good stuff louder - but how can that be done?
Some of the best journalists (those that connect with their readers best) are those that are least technical - probably why they have a wide appeal. But they may not be the best people to interpret studies; especially the ones that are presented in a very technical, dry (and sometimes boring) way.
But these are the very people you want on board. So I think there need to be two reports written - one you get published in your favourite peer-reviewed journal, and the second aimed directly between the eyes of those more public-facing publications.
If you want to sell something (idea as well as product) to a busy Frenchman, it would be wise to sell it to them in French, not hope they happen to speak your language.
And this applies to making the good stuff louder. However nice and wonderful many science correspondents are, they don't often have the largest following. So, if you have an important health message, go to the journalist who does.
"So I think there needs to be two reports written - one you get published in your favourite peer reviewed journal, and the second is aimed directly between the eyes of those more public facing publications. "
Currently, isn't that partly done by the PR departments of universities, etc., in their press releases? There are numerous examples of them completely misunderstanding or misrepresenting the research - possibly for PR purposes.
But aren't there two issues here? One is the misrepresentation of the latest hot-off-the-press research paper; the other is how that fits in with previous research and current knowledge. We frequently get told about amazing new results, but ones that go against established knowledge. What usually matters, though, is what the synthesis of all the evidence says - and that may not make such 'attractive' headlines!
Talking of finding good, accessible info on health research - have you heard of the Elf blogs? They are really neat blogs that share new research. There's the Mental Health Elf: thementalelf.net/.
And several others, including The Dental Elf, The Diabetes Elf, The Education Elf, The Learning Disabilities Elf, The Lifestyle Elf and The Musculoskeletal Elf: thementalelf.net/about/#sth...
Hi Emily, Thank you for those kind words about the elves!
This is a great thread and it's marvellous to see so many people from varied backgrounds getting involved in the discussion. I'm new to this Healthy Evidence malarkey, but really like what I see so far!
We started the Mental Elf in 2011 to help people keep up to date with the latest reliable mental health research. There's no shortage of mental health websites out there, but there are actually very few that provide *reliable* information in a format that is *usable* and *engaging*. For me, that's the key combination that we need to be aiming for.
People have very little time and often don't want to read the full research papers, even if they can access them without paying! So we provide short, concise summaries of new evidence, in a format that is easy to understand.
Our 8 current sites are all aimed at health and social care professionals, but they're open access and free to use, so we find they get read loads by the general public too.
We are an open network and always on the lookout for new people to blog on the National Elf Service, so have a read of this blog and drop me a line if you fancy donning an elf hat from time to time: thementalelf.net/mental-hea...
I think it's difficult when it's so pervasive. Try reading a cookery magazine or watching cookery programmes on daytime TV and see how you get on. Although the example you give is at the extreme end, I get the impression that misleading health claims for foods are absolutely rife. Typically, there's that old misunderstanding much beloved of supplement sellers that seems to assume we're all deficient in all sorts of trace elements and vitamins. A particular food containing a particular trace element is presented as being "good for" whatever aspect of health it may play a role in - as though the more you have, the better. This situation is exacerbated by the fact that nutritionists are still often treated as though they have some expertise, for example writing regular features in magazines and lifestyle supplements, and appearing on daytime shows. Far from being expert, many nutritionists have been misled as part of their training, labour under misunderstandings and make misleading claims. And of course even dead cats can become nutritionists.
You sometimes also have to ask - how current is the information? For example I still see pieces on caesarean sections which quote the "WHO advice" that no country should have a C/S rate above 15% (The UK rate is about 24%). The WHO withdrew this recommendation in 2010 as there was no evidence to back it up.
Also, not a news story, but I know of a very popular book of advice for parents which makes a lot of claims about evidence and about how those who give different advice don't have any. Yet the book's reference section contains only one peer-reviewed article, and that is from the 1920s. I seriously question its relevance today.
Is there any kind of resource that ranks journals for impact and reliability? I'm familiar with those in my field, but if a news story references a paper in, say, a psychology journal, I have no idea how much trust to put in it. I've seen some such "references" which seem to actually just be other magazine articles, albeit written by people with PhDs.
This is a whole minefield in itself. The ideology of the impact factor has skewed what we consider real and impactful science. Many would favour a move away from impact factors and towards altmetrics or article-level metrics. This would at least give more of an idea of whether an individual paper is useful to the field, even if it's not published in a highly ranked journal.
Thanks for the replies - "impact" was probably the wrong word to pick. What I'm after is something very simple and easily accessible to those without a science background.
For example, several times I've seen articles/websites etc. claim there is "scientific evidence" for something and give a link to prove it. The link goes to an article in Psychology Today. It's written by someone with a PhD, has references, and the publication sounds "sciency", so it's understandable that non-scientists could confuse it with something in a proper peer-reviewed journal. What's the simplest, quickest way to show people the difference?
I do have a science background and always try to dig out the papers involved for anything I write about/act on, but if it's something well out of my professional field I like some vague idea of how important the journal is considered. Do you think that is a bad idea? Opinions very welcome!
There isn't anything that will inform the reader about how reliable the source is, but there are community projects such as sciencegist.com/ which are trying to bring science papers to a wider audience. I suppose the more people who interact with this sort of platform, the more informative it could be regarding reliability.