OK, I confess. I’ve been reading the nutrition news again. I know I shouldn’t. But sometimes the headlines are just too much for me. Like last week: Soy protein isn’t actually good for your heart! Blueberries improve children’s reaction time by 9 percent!! And a peer-reviewed study finds that GMOs are responsible for a shocking array of ailments!!!
Who could resist?
But what happens when you check out the actual studies the headlines are referring to? You learn a little something about nutrition, but a lot more about the state of the news business. We’re not talking fake news. Mostly we’re talking dumb news.
Take the soy protein story, which is the one based on the best science and was reported fairly well by at least some outlets. Here’s the deal: Since 1999, the Food and Drug Administration (FDA) has allowed manufacturers of food products containing at least 25 milligrams per serving of soy protein to make health claims about preventing coronary heart disease (CHD). Earlier this month, the agency announced its intention to rescind the claim. (The notice in the Federal Register is here. It’s both readable and fascinating.)
Why? Well, it seems that the agency had been reviewing the scientific literature on soy protein and heart health: a total of 709 articles, government documents, published letters, book chapters, what have you. When you do a big review like this, you spend a fair amount of time throwing out publications that won’t be useful. For instance, to qualify, a study had to look exclusively at soy protein, and not the other things found in soybeans (such as soy isoflavones, which are used as dietary supplements and are thought to control cholesterol). It had to use one of FDA’s official “surrogate endpoints” for heart health: total cholesterol, LDL (“bad”) cholesterol, or blood pressure, and not other markers like triglycerides, endothelial function, and oxidized low-density lipoprotein.
(A word about surrogate endpoints: What scientists in these studies want to figure out is the impact of soy protein on getting heart disease. That means that what you’d like to do is sit back and wait until people die, then figure out which ones had CHD. It’s not practical. So researchers turn to measures like cholesterol and blood pressure, which are statistically related to heart disease. It speeds things up, but you get funny anomalies. We’ve long known, for instance, that statins—cholesterol-lowering drugs—have benefits that aren’t explained just by cholesterol. That means it might be possible to create a drug far more effective than statins at preventing cardiac disease that nevertheless fails on the surrogate measures. But how would you test it in a world where the patent clock is always ticking?)
By the time they got done, FDA’s reviewers were down to 212 studies, most of which had serious flaws. The final count: 58 controlled experiments on soy protein and an official, FDA-acknowledged surrogate endpoint. And while a few showed a strong association between heart health and soy protein, most showed no significant relationship.
Which doesn’t mean, as a number of news stories pointed out, that soy isn’t good for you. It just means that no one has produced enough evidence yet.
But here’s the interesting part: Remember how we said the studies could only look at soy protein? That meant in part that FDA excluded any study in which soy protein was used to replace something else. Soy milk replacing cow milk? Out. Soy protein replacing meat? Out. The studies were useless to the question at hand because they made it impossible to tell whether the results came from adding soy or replacing, for instance, saturated fat.
Which is, of course, exactly the way most people seem to use the stuff. If you’re like most of us, the soy health claim never told you anything you particularly wanted to know, and if FDA eventually takes it away, it might anger the manufacturers, but it means pretty much nothing to you. Enjoy your Tofurkey. It’s just as good (or not) for you as it ever was.
The soy story points out something that comes up over and over again in looking at scientific and regulatory stories: The people doing the science and making the rules are not necessarily thinking about what we eaters want to know. That point was made abundantly clear in the second story of the week, the amazing brain-enhancing blueberry.
I’ve seen this story reprinted in hundreds of venues (like this), all of them gaga over the astonishing mental improvements brought about by a single dose of blueberries—a 9 percent improvement in reaction time in young children.
Again, the reality is less dramatic. The scientists weren’t trying to make a recommendation about your child’s diet. They were trying to fill in some gaps in our understanding of flavonoids, which are found not just in blueberries but in a host of foods—tree fruits, nuts, and beans, for example. Flavonoids seem to have all sorts of interesting functions in the body, including mental effects, but the research was unclear about whether they operated the same way in young children, exactly which mental functions they affected, and whether their effect was greater when cognitive demands increased. A team led by Claire Williams of the University of Reading in England set out to find out with a pretty simple experiment.
They got together 21 seven- to ten-year-olds, put them on a low-flavonoid diet for 24 hours, and split them into two groups. One group was fed an orange drink, the other, a drink loaded with frozen blueberry powder. Then they made them play a little video game thought to be linked to the so-called executive function in the brain. They varied the difficulty of the game, threw in some variables like noisy environment versus quiet, and ran the results through several statistical analyses. They concluded that yes, children reacted to flavonoids in about the way they had predicted.
This is all interesting and useful in setting the research agenda, I suppose. But unless you have a flavonoid-starved child around the house, it’s not something you need to pay much attention to, because one question it didn’t answer was the one you probably were curious about: Could you improve the reaction time of a child with normal flavonoid levels by feeding them a bunch of blueberries? Maybe you don’t care about that, but the kid does. Super Mario Odyssey is a bear.
We all know that peer-reviewed scientific studies are the gold standard. But when you get down to it, if you’ve got leanings toward bad science, there’s every chance that your true peers lean the same way.
When I received a press release this week announcing, “Peer Reviewed Survey Shows Avoiding GMO’s Improves Health,” I had to take a look. The results included some big numbers: More than 85 percent of people surveyed who had stopped eating foods containing genetically modified organisms said they had seen an improvement in digestive problems, 55 percent said their obesity improved, more than half reported relief from “brain fog.” Twenty-eight conditions made the list. And it was peer-reviewed.
OK, I believe that. But read the study, and what do you find:
First, it’s not a study where actual scientists look at actual physical conditions. It’s a survey—and a really bad one. They started with a database of people who had joined an anti-GMO organization, and sent them a questionnaire asking more or less “How much better are you since quitting GMOs?” (There might be a more biased way to ask that question, but I can’t think of one right now.) So you start with a biased universe (the anti-GMO group) and get the ones who agree with you to self-select by the way you ask the question. Then, ignoring everything we know about placebo effects, confirmation bias, and general wishful thinking, you believe what these people had to say about themselves.
Let’s assume that these people actually got healthier. Are we sure that the change was caused by dropping GMOs? Well, no. To begin with, there’s the question of how they knew they were avoiding GMOs, which are generally unlabeled. Then there are alternative explanations: Three-quarters of respondents said they’d switched to an organic diet. Two-thirds said they reduced processed foods. Almost half gave up sugar-sweetened drinks. The author, to his credit, acknowledges this fact, but doesn’t seem to think it’s a problem. It is.
In case you were wondering, the journal where the piece was published, the International Journal of Human Nutrition and Functional Medicine, is not the New England Journal of Medicine by a long shot. Other recent articles include (presumably peer-reviewed) pieces about chemtrails and why microcephaly in Latin America is caused by Monsanto rather than Zika. The organization behind it, the College of Human Nutrition and Functional Medicine, seems to exist mostly to put on conferences and sell continuing medical education courses. That’s not necessarily an unethical business, but the group’s website doesn’t show any of the hallmarks of a serious journal or professional association.
In short, no responsible publication should pass this story on to its readers as if it’s real science. Not many have to date (it’s only been out of embargo for a couple of days). Most of the pickups so far have been true-believer sites, but I’ve already spotted it on the Shape website, written up by a “digital editorial assistant.” She may be too inexperienced to know better, but she just gave credibility to something that’s basically horseshit. We’ll see if she lures in other overworked, under-edited writers at mainstream sites.
Let’s forget about science and scientists, who mostly know how to assess what they read. The rest of us depend on what we read in newspapers and magazines and online. And it’s often not quite right. This week’s batch of near misses included one case of bad science, the GMO survey, which isn’t likely to make it to the New York Times, but could help confirm some irrational GMO opponents in their beliefs. The soy story was probably the best reported and based on the best science, but it was easy to read the coverage and not realize that the health claim never meant quite what most of us thought it meant, and that FDA’s new analysis answered a question we weren’t asking. The blueberry story was actually the one that bothered me the most. It inspired coverage filled with the wide-eyed (and totally inappropriate) wonder that gets in the way of any kind of reasonable discussion of nutrition. The scientists may not have been hunting for a superfood, but if you only read about their work in the media pickups, you might well come away persuaded that they found one. That doesn’t help.
So what do we do? The best thing, I suppose, would be to dig in a bit. If you see a bit of food news that moves you, hunt down the study and read it. You may not be able to work through all the numbers without help, but at least you can figure out what the researchers were trying to prove. And you can look for signs of sleaze in the site where the research is housed. On a good day, as with FDA’s notice on soy, you can learn a lot about how decisions get made, and what the evidence really says.
Let’s be realistic: At way too many news sites, nutrition stories are garbage stories, assigned to the least experienced writers, written off the press release, basically unedited. If you’re reading this, you’re obviously already better qualified to assess the original research than most of them are. (It’s easy to tell when you hit one who rises above the crowd.) So look at the evidence for yourself.
Or do what I try to do. Don’t read the nutrition news. You’ll save tons of time, and you can spend it cooking something nice. I bet it will even be good for you.