I cannot explain taking things out of context better than John Oliver, who, on his show on May 8, 2016, gave examples of how and why media outlets so often report untrue or incomplete study results as scientific fact. Sometimes the press releases issued by researchers contribute to this; other times it is just bad reporting. The video has had 3,122,256 views (wow!) since being posted. He might be a humorist, but he hit the nail right on the head. Here are extracts:

  • Not all studies are equal. Some may be biased by the researchers to produce original or unusual results in order to secure funding.
  • Study results can be altered by changing how long a study lasts, or by shrinking the random sample until it is too small to be reliable.
  • The replication study or studies might not have been done. Too many exploratory studies are taken as fact without the results having been replicated in other studies. (“No Nobel Prize for Fact-Checking.”)
  • The study might have used “p-hacking”, also known as data dredging, data fishing, data snooping, or equation fitting: the use of data mining to uncover patterns that can be presented as statistically significant, without first devising a specific hypothesis about the underlying causality (see the sketch after this list).
  • The researcher might present the results wrongly or simply be unreliable, or the organization funding the study might have a vested interest in the results one way or another (such as Coca-Cola funding a study that supports the benefits of hydration – there is clear potential for a conflict of interest).
  • Exploratory studies are taken as fact without a replication study, without peer review, or without taking into account all the other studies done in the field.
  • Study results are incorrectly generalized to a different, wider population (for instance, results of tests on mice attributed to women). Trials on non-human subjects have to be replicated in humans in order to be valid for humans.
  • When studies are reported in the media, findings are often watered down and simplified, and in the process misinterpreted, with essential details (such as who funded the study) being lost.
  • Science is imperfect, but hugely important. So both scientists and reporters need to avoid these mistakes.
  • Do not go with “the study best suited to you” – do not cherry-pick from the results; science is not à la carte.
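
To make the p-hacking point above concrete, here is a minimal sketch (not from the video) of why dredging through many variables produces spurious “findings”. It assumes Python with NumPy and SciPy, and the sample size and number of variables are hypothetical: test enough random, unrelated variables against an outcome and, by chance alone, some will look “statistically significant” at p < 0.05.

```python
# Minimal p-hacking / data-dredging illustration (hypothetical numbers).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_subjects = 50      # hypothetical sample size
n_variables = 100    # hypothetical number of unrelated variables dredged through

outcome = rng.normal(size=n_subjects)          # purely random "outcome"
false_positives = 0

for _ in range(n_variables):
    predictor = rng.normal(size=n_subjects)    # purely random, unrelated "predictor"
    r, p_value = stats.pearsonr(predictor, outcome)
    if p_value < 0.05:                         # "significant" by chance alone
        false_positives += 1

# With 100 independent tests at the 0.05 level, roughly 5 spurious "findings" are expected.
print(f"Spurious 'significant' correlations: {false_positives} out of {n_variables}")
```

Run enough uncorrected tests and a handful will clear the 0.05 bar purely by chance; pick one of those, write the hypothesis afterwards, and you have a headline-ready “result” with no real effect behind it.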

Back to the main article.
