The 2015 Dietary Guidelines Advisory Committee Report: A Summary

Last week, the Dietary Guidelines Advisory Committee released the report containing its recommendations for the 2015 Dietary Guidelines for Americans.   The report is 572 pages long, more than 100 pages longer than the last report, released 5 years ago.  Longer than one of my blog posts, even. Despite its length and the tortured governmentalese in which it is written, its message is pretty clear and simple. So for those of you who would like to know what the report says, but don’t want to read the whole damn thing, I present, below, its essence:

Dear America,

You are sick–and fat.  And it’s all your fault. 

Face it.  You screwed up.  Somewhere in the past few decades, you started eating too much food. Too much BAD food.  We don’t know why.  We think it is because you are stupid.

We don’t know why you are stupid.

You used to be smart–at least about food–but somewhere in the late 1970s or early 1980s, you got stupid. Before then, we didn’t have to tell you what to eat.  Somehow, you just knew. You ate food, and you didn’t get fat and sick.

But NOW, every five years we have to get together and rack our brains to try and figure out a way to tell you how to eat–AGAIN.  Because no matter what we tell you, it doesn’t work. 

The more we tell you how to eat, the worse your eating habits get. And the worse your eating habits get, the fatter and sicker you are.  And the fatter and sicker you are, the more we have to tell you how to eat. 

[Chart: DGA Length & Obesity, 1980-2010]

Look. You know we have no real way to measure your eating habits.  Mostly because fat people lie about what they eat and most of you are now, technically speaking, fat.  But we still know that your eating habits have gotten worse. How?  Because you’re fat.  And, y’know, sick.  And the only real reason people get fat and sick is because they have poor eating habits.  That much we do know for sure.

And because, for decades now,  we have been telling you exactly what to eat so you don’t get fat and sick, we also know the only real reason people have poor eating habits is because they are stupid.  So you must be stupid.

Let’s make this as clear as possible for you:

[Image: sick fat stupid people]

And though it makes our hearts heavy to say this, unfortunately, and through no fault of their own, people who don’t have much money are particularly stupid.  We know this because they are sicker than people who have money.  Of course, money has nothing to do with whether or not you are sick.  It’s the food, stupid.

We’ll admit that some of the responsibility for this rests on our shoulders.  When we started out telling you how to eat, we didn’t realize how stupid you were.  That was our fault.

In 1977, a bunch of us got together to figure out how to make sure you would not get fat and sick.  You weren’t fat and sick at the time, so we knew you needed our help.

We told you to eat more carbohydrates–a.k.a., sugars and starches–and less sugar.  How simple is that?  But could you follow this advice?  Nooooooo.  You’re too stupid.

We told you to eat food with less fat. We meant for you to buy a copy of the Moosewood Cookbook and eat kale and lentils and quinoa.  But no, you were too stupid for that too.  Instead, you started eating PRODUCTS  that said “low-fat” and “fat-free.”  What were you thinking?

We told you to eat less animal fat. Obviously, we meant JUST DON’T EAT ANIMALS.  But you didn’t get it.  Instead, you quit eating cows and started eating chickens.  Hellooooo?  Chickens are ANIMALS.

After more than three decades of us telling you how to eat, it is obvious you are too stupid to figure out how to eat.  So we are here to make it perfectly clear, once and for all.

FIRST:  Don’t eat food with salt in it.

Even though food with salt in it doesn’t make you fat, it does raise your blood pressure.  Maybe.  Sometimes.  And, yes, we know that your blood pressure has been going down for a few decades now, but it isn’t because you are eating less salt because you’re not.  And it’s true that we really have no idea whether or not reducing your intake of salt prevents disease. But all of that is beside the point.

Here’s the deal:  Salt makes food taste good.  And when food tastes good, you eat it.  We’re opposed to that.  But since you are too stupid to actually stop eating food, we are going to insist that food manufacturers stop putting salt in their products.  That way, their products will grow weird microorganisms and spoil rapidly–and will taste like poop.

This will force everyone to stop eating food products and get kale from the farmers’ market (NO SALT ADDED) and lentils and quinoa in bulk from the food co-op (NO SALT ADDED).  Got it?

Also, we are working on ways to make salt shakers illegal. 

[Image: Ban Salt Shakers]

 

NEXT:  Don’t eat animals. At all.  EVER.

We told you not to eat animals because meat has lots of fat, and fat makes you fat.  Then you just started eating skinny animals. So we’re scrapping the whole fat thing.  Eat all the fat you want.  Just don’t eat fat from animals, because that is the same thing as eating animals, stupid.

We told you not to eat animals because meat has lots of cholesterol, and dietary cholesterol makes your blood cholesterol go up.  Now our cardiologist friends who work for pharmaceutical companies and our buds over at the American Heart Association have told us that avoiding dietary cholesterol won’t actually make your blood cholesterol go down.  They say:  If you want your blood cholesterol to go down, take a statin.  Statins, in case you are wondering, are not made from animals so you can have all you want.  

Eggs? you ask.  We’ve ditched the cholesterol limits, so now you think you can eat eggs?  Helloooo?  Eggs are just baby chickens and baby chickens are animals and you are NOT ALLOWED TO EAT ANIMALS.  Geez.

Yes, we are still hanging onto that “don’t eat animals because of saturated fat” thing, but we know it can’t last forever since we can’t actually prove that saturated fat is the evil dietary villain we’ve been saying it is.  So …

Here’s the deal:  Eating animals doesn’t just kill animals.  It kills the planet.  If you keep killing animals and eating them WE ARE ALL GOING TO DIE.  And it’s going to be your fault, stupid.

And especially don’t eat red meat.  C’mon.  Do we have to spell this out for you?  RED meat? 

RED meat = COMMUNIST meat.  Does Vladimir Putin look like a vegan?  We thought not. 

 

 If you really must eat dead rotting flesh, we think it is okay to eat dead rotting fish flesh, as long as it is from salmon raised on ecologically sustainable fish farms by friendly people with college educations. 

FINALLY:  Stop eating–and drinking–sugar.

Okay, we know we told you to eat more carbohydrate food.  And, yes, we know sugar is a carbohydrate. But did you really think we were telling you to eat more sugar?  Look, if you must have sugar, eat some starchy grains and cereals. The only difference between sugar and starch is about 15 minutes in your digestive tract.  But …

Here’s the deal:  Sugar makes food taste good.  And when food tastes good, you eat it.  Like we said, we’re opposed to that.  But since you are too stupid to actually stop eating food, we are going to insist that food manufacturers stop putting sugar in their products.  That way, their products will grow weird microorganisms and spoil rapidly–and will taste like poop.

This will force everyone to stop eating food products and get kale from the farmers’ market (NO SUGAR ADDED) and lentils and quinoa in bulk from the food co-op (NO SUGAR ADDED).  Got it?

[Image: Ban cupcakes]

 

Hey, we know what you’re thinking.  You’re thinking “Oh, I’ll just use artificial sweeteners instead of sugar.”  Oh NOOOO you don’t.  No sugar-filled soda.  No diet soda.  Water only. Capiche?

 So, to spell it all out for you once and for all:

DO NOT EAT food that has salt or sugar in it, i.e. food that tastes good.  Also, don’t eat animals.

DO EAT kale from your local farmers’ market, lentils and quinoa from your local food co-op,  plus salmon. Drink water.  That’s it. 

And, since we graciously recognize the diversity of this great nation, we must remind you that you can adapt the above dietary pattern to meet your own health needs, dietary preferences, and cultural traditions. Just as long as you don’t add salt, sugar, or dead animals.

Because we have absolutely zero faith you are smart enough to follow even this simple advice, we are asking for additional research to be done on your child-raising habits (Do you let your children eat food that tastes good?  BAAAAD parent!) and your sleep habits (Do you dream about cheeseburgers?  We KNOW you do and that must stop!  No DEAD IMAGINARY ANIMALS!)

And–because we recognize your deeply ingrained stupidity when it comes to all things food, and because we know that food is the only thing that really matters when it comes to health, we are proposing  America create a national “culture of health” where healthy lifestyles are easier to achieve and normative.

“Normative” is a big fancy word that means if you eat what we tell you to eat, you are a good person and if you eat food that tastes good, you are a bad person. We will know you are a bad person because you will be sick. Or fat. Because that’s what happens to bad people who eat bad food.

We will kick off this “culture of health” by creating an Office of Dietary Wisdom that will make the healthy choice–kale, lentils, quinoa, salmon, and water–the easy choice for all you stupid Americans.  We will establish a Food Czar to run the Office of Dietary Wisdom because nothing says “America, home of freedom and democracy” like the title of a 19th-century Russian monarch.*

The primary goal of the “culture of health” will be to enforce your right to eat what we’ve determined is good for you. 

This approach will combine the draconian government overreach we all love with the lack of improvements we expect, resulting in a continued demand for our services as the only people smart enough to tell the stupid people how to eat.**

 Look.  We know we’ve been a little unclear in the past.  And we know we’ve reversed our position on a number of things. Hey, our bad.  And when, five years from now, you stupid Americans are as sick and fat as ever, we may have to change up our advice again based, y’know, on whatever evidence we can find that supports the conclusions we’ve already reached.

But rest assured America.

No matter what the evidence says, we are never ever going to tell you it’s okay to eat salt, sugar, or animals.  And, no matter what the evidence says, we are never ever going to tell you that it’s not okay to eat grains, cereals, or vegetable oils.  And you can take that to the bank.  We did.

Love and kisses,

Committee for Government Approved Information on Nutrition (Code name: G.A.I.N.)

***********************************************************************************

*Thank you, Steve Wiley.

**Thank you, Jon Stewart, for at least part of this line.

 

As the Calories Churn (Episode 3): The Blame Game

In the previous episode of As the Calories Churn, we explored the differences in food supply/consumption between America in 1970 and America in 2010.

We learned that there were some significant changes in those 40 years. We saw dramatic increases in vegetable oils, grain products, and poultry—the things that the 1977 Dietary Goals and the 1980 Dietary Guidelines told us to increase. We saw decreases in red meat, eggs, butter, and full-fat milk—things that our national dietary recommendations told us to decrease. Mysteriously, what didn’t seem to increase much—or at all—were SoFAS (meaning “Solid Fats and Added Sugars”) which, as far as the 2010 Dietary Guidelines for Americans are concerned, are the primary culprits behind our current health crisis. (“Solid Fats” are a linguistic sleight-of-hand that lumps saturated fat from natural animal sources in with processed partially-hydrogenated vegetable oils and margarines that contain trans fats; SoFAS takes the trick a step further, by being not only a dreadful acronym in terms of implying that poor health is caused by sitting on our “sofas,” but by creating an umbrella term for foods that have little in common in terms of structure, biological function or nutrition.)

Around the late 70s or early 80s, there were sudden and rapid changes in America’s food supply and food choices and similar sudden and rapid changes in our health. How these two phenomena are related remains a matter of debate. It doesn’t matter if you’re Marion Nestle and you think the problem is calories or if you’re Gary Taubes and you think the problem is carbohydrate—both of those things increased in our food supply. (Whether or not the problem is fat is an open debate; food availability data points to an increase in added fats and oils, the majority of which are, ironically enough, the “healthy” monounsaturated kind; consumption data points to a leveling off of overall fat intake and a decrease in saturated fat—not a discrepancy I can solve here.) What seems to continue to mystify people is why this change occurred so rapidly at this specific point in our food and health history.

Personally responsible or helplessly victimized?

At one time, it was commonly thought that obesity was a matter of personal responsibility and that our collective sense of willpower took a nosedive in the 80s, but nobody could ever explain quite why. (Perhaps a giant funk swept over the nation after The Muppet Show got cancelled, and we all collectively decided to console ourselves with Little Debbie Snack Cakes and Nickelodeon?) But because this approach is essentially industry-friendly (Hey, says Big Food, we just make the stuff!) and because no one has any explanation for why nearly three-quarters of our population decided to become fat lazy gluttons all at once (my Muppet Show theory notwithstanding) or for the increase of obesity among preschool children (clearly not affected by the Muppet Show’s cancellation), public health pundits and media-appointed experts have decided that obesity is no longer a matter of personal responsibility. Instead the problem is our “obesogenic environment,” created by the Big Bad Fast Processed Fatty Salty Sugary Food Industry.

Even though it is usually understood that a balance between supply and demand creates what happens in the marketplace, Michael Pollan has argued that it is the food industry’s creation of cheap, highly-processed, nutritionally-bogus food that has caused the rapid rise in obesity. If you are a fan of Pollanomics, it seems obvious that the food industry—on a whim?—made a bunch of cheap tasty food, laden with fatsugarsalt, hoping that Americans would come along and eat it. And whaddaya know? They did! Sort of like a Field of Dreams, only with Taco-flavored Doritos.

As a result, obesity has become a major public health problem.

Just like it was in 1952.

Helen Lee, in a thought-provoking article, The Making of the Obesity Epidemic (it is even longer than one of my blog posts, but well worth the time), describes how our obesity problem looked then:

“It is clear that weight control is a major public health problem,” Dr. Lester Breslow, a leading researcher, warned at the annual meeting of the western branch of the American Public Health Association (APHA). At the national meeting of the APHA later that year, experts called obesity “America’s No. 1 health problem.”

The year was 1952. There was exactly one McDonald’s in all of America, an entire six-pack of Coca-Cola contained fewer ounces of soda than a single Super Big Gulp today, and less than 10 percent of the population was obese.

In the three decades that followed, the number of McDonald’s restaurants would rise to nearly 8,000 in 32 countries around the world, sales of soda pop and junk food would explode — and yet, against the fears and predictions of public health experts, obesity in the United States hardly budged. The adult obesity rate was 13.4 percent in 1960. In 1980, it was 15 percent. If fast food was making us fatter, it wasn’t by very much.

Then, somewhat inexplicably, obesity took off.”

It is this “somewhat inexplicably” that has me awake at night gnashing my teeth.

And what is Government going to do about it?

I wonder how “inexplicable” it would be to Ms. Lee had she put these two things together:

(In case certain peoples have trouble with this concept, I’ll type this very slowly and loudly: I’m not implying that the Dietary Guidelines “caused” the rise in obesity; I am merely illustrating a temporal relationship of interest to me, and perhaps to a few billion other folks. I am also not implying that a particular change in diet “caused” the rise in obesity. My focus is on the widespread and encompassing effects that may have resulted from creating one official definition of “healthy food choices to prevent chronic disease” for the entire population.)

Right now we are hearing calls from every corner for the government to create or reform policies that will rein in industry and “slim down the nation.” Because we’d never tried that before, right?

When smoking was seen as a threat to the health of Americans, the government issued a definitive report outlining the science that found a connection between smoking and risk of chronic disease. Although there are still conspiracy theorists that believe that this has all been a Big Plot to foil the poor widdle tobacco companies, in general, the science was fairly straightforward. Cigarette smoking—amount and duration—is relatively easy to measure, and the associations between smoking and both disease and increased mortality were compelling and large enough that it was difficult to attribute them to methodological flaws.

Notice that Americans didn’t wait around for the tobacco industry to get slapped upside the head by the FDA’s David Kessler in the 1990s. Tobacco use plateaued in the 1950s as scientists began to publicize reports linking smoking and cancer. The decline in smoking in America began in earnest with the release of Smoking and Health: Report of the Advisory Committee to the Surgeon General in 1964. A public health campaign followed that shifted social norms away from considering smoking as an acceptable behavior, and smoking saw its biggest declines before litigation and sanctions against Big Tobacco  happened in the 1990s.

Been there, done that, failed miserably.

In a similar fashion, the 1977 Dietary Goals were the culmination of concerns about obesity that had begun decades before, joined by concerns about heart disease voiced by a vocal minority of scientists led by Ancel Keys. Declines in red meat, butter, whole milk and egg consumption had already begun in response to fears about cholesterol and saturated fat that originated with Keys and the American Heart Association—which used fear of fat and the heart attacks it supposedly caused as a fundraising tactic, especially among businessmen and health professionals, whom it portrayed as especially susceptible to this disease of “successful civilization and high living.”  The escalation of these fears—and declines in intake of animal foods portrayed as especially dangerous—picked up momentum when Senator George McGovern and his Select Senate Committee created the 1977 Dietary Goals for Americans. It was thought that, just as we had “tackled” smoking, we could create a document advising Americans on healthy food choices and compliance would follow. But this issue was a lot less straightforward.

To begin with, when smoking was at its peak, only around 40% of the population smoked. On the other hand, we expect that approximately 100% of the population eats.

In addition, the anti-smoking campaigns of the 1960s and 1970s built on a long tradition of public health messages—originating with the Temperance movement—that associated smoking with dirty habits, loose living, and moral decay. It was going to be much harder to fully convince Americans that traditional foods typically associated with robust good health, foods that the US government thought were so nutritionally important that in the recent past they had been “saved” for the troops, were now suspect and to be avoided.

Where the American public had once been told to save “wheat, meat, and fats” for the soldiers, they now had to be convinced to separate the “wheat” from the “meat and fats” and believe that one was okay and the others were not.

To do this, public health leaders and policy makers turned to science, hoping to use it just as it had been used in anti-smoking arguments. Frankly, however, nutrition science just wasn’t up to the task. Linking nutrition to chronic disease was a field of study that would be in its infancy after it grew up a bit; in 1977, it was barely embryonic. There was little definitive data to support the notion that saturated fat from whole animal foods was actually a health risk; even experts who thought the theory linking saturated fat to heart disease had merit didn’t think there was enough evidence to call for dramatic changes in Americans’ eating habits.

The scientists who were intent on waving the “fear of fat” flag had to rely on observational studies of populations (considered then and now to be the weakest form of evidence) in an attempt to prove that heart disease was related to intake of saturated fat (upon closer examination, these studies did not even do that).

Nutrition epidemiology is a soft science, so soft that it is not difficult to shape it into whatever conclusions the Consistent Public Health Message requires. In large-scale observational studies, dietary habits are difficult to measure and the results of Food Frequency Questionnaires are often more a product of wishful thinking than of reality. Furthermore, the size of associations in nutrition epidemiological studies is typically small—an order of magnitude smaller than those found for smoking and risk of chronic disease.
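To put rough numbers on that “order of magnitude” (these are ballpark figures from the epidemiology literature generally, not from any particular study discussed here): the relative risks linking heavy smoking to lung cancer are commonly reported somewhere in the range of 10 to 25, while the relative risks nutrition epidemiology turns up, say, for processed meat and colorectal cancer, typically sit somewhere around 1.1 to 1.5.

$$ RR_{\text{smoking} \to \text{lung cancer}} \approx 10\ \text{to}\ 25 \qquad \text{versus} \qquad RR_{\text{diet} \to \text{chronic disease}} \approx 1.1\ \text{to}\ 1.5 $$

Associations that small are exactly the kind that confounding, measurement error, and wishful-thinking questionnaires can produce all on their own.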

But nutrition epidemiology had proved its utility in convincing the public of the benefits of dietary change in the 70s and since then has become the primary tool—and the biggest funding stream (this is hardly coincidental)—for cementing in place the Consistent Public Health Message to reduce saturated fat and increase grains and cereals.

There is no doubt that the dramatic dietary change that the federal government was recommending was going to require some changes from the food industry, and they appear to have responded to the increased demands for low-fat, whole-grain products with enthusiasm. Public health recommendations and the food fears they engendered are (as my friend James Woodward puts it) “a mechanism for encouraging consumers to make healthy eating decisions, with the ultimate goal of improving health outcomes.” Experts like Kelly Brownell and Marion Nestle decry the tactics used by the food industry of taking food components thought to be “bad” out of products while adding in components thought to be “good,” but it was federal dietary recommendations focusing above all else on avoiding saturated fat, cholesterol, and salt that led the way for such products to be marketed as “healthy” and to become acceptable to a confused, busy, and anxious public. The result was a decrease in demand for red meat, butter, whole milk and eggs, and an increase in demand for low-saturated fat, low-cholesterol, and “whole” grain products. Minimally-processed animal-based products were replaced by cheaply-made, highly-processed plant-based products, which food manufacturers could market as healthy because, according to our USDA/HHS Dietary Guidelines, they were healthy.

The problem lies in the fact that—although these products contained less of the “unhealthy” stuff Americans were supposed to avoid—they also contained less of our most important nutrients, especially protein and fat-soluble vitamins. We were less likely to feel full and satisfied eating these products, and we were more likely to snack or binge—behaviors that were also fully endorsed by the food industry.

Between food industry marketing and the steady drumbeat of media messages explaining just how deadly red meat and eggs are (courtesy of population studies from Harvard, see above), Americans got the message. About 36% of the population believe that UFOs are real; only 25% believe that there’s no link between saturated fat and heart disease. We are more willing to believe that we’ve been visited by creatures from outer space than we are to believe that foods that humans have been eating ever since they became human have no harmful effects on health. But while industry has certainly taken advantage of our gullibility, they weren’t the ones who started those rumors, and they should not be shouldering all of the blame for the consequences.

Fixing it until it broke

Back in 1977, we were given a cure that didn’t work for diseases that we didn’t have. Then we spent billions in research dollars trying to get the glass slipper to fit the ugly stepsister’s foot. In the meantime, the food industry has done just what we would expect it to do, provide us with the foods that we think we should eat to be healthy and—when we feel deprived (because we are deprived)—with the foods we are hungry for.

We can blame industry, but as long as food manufacturers can take any mixture of vegetable oils and grain/cereals and tweak it with added fiber, vitamins, minerals, a little soy protein or maybe some chicken parts, some artificial sweeteners and salt substitutes, plus whatever other colors/preservatives/stabilizers/flavorizers they can get away with and still be able to get the right profile on the nutrition facts panel (which people do read), consumers–confused, busy, hungry–are going to be duped into believing what they are purchasing is “healthy” because–in fact–the government has deemed it so. And when these consumers are hungry later—which they are very likely to be—and they exercise their rights as consumers rather than their willpower, who should we blame then?

There is no way around it. Our dietary recommendations are at the heart of the problem they were created to try to reverse. Unlike the public health approach to smoking, we “fixed” obesity until it broke for real.

Processed Meats Declared Too Dangerous For Human Consumption

Processed meats have been declared too dangerous for human consumption by pseudo-experts who are unable to differentiate between observational studies and clinical trials, thus posing tremendous risks to the collective IQ of the interwebz reading public [1].

The World Cancer Research Fund recently completed a detailed review of 7,000 studies covering links between diet and cancer. A grand total of 11 of these were actual clinical trials that tested two different dietary approaches or supplementation on cancer outcomes. Two of these 11 trials tested a dietary intervention, both using a low-fat diet versus a usual diet control. Researchers found that, “The low fat dietary pattern intervention did not reduce the risk of invasive colorectal cancer in any of its subsites” [2]. In other words, avoiding fat in foods like bacon, sausage, pork chops, and pepperoni will not reduce your risk of colon cancer; however, it may reduce your enjoyment of life considerably, and that, in itself, is a pain in the butt.
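A little arithmetic on those numbers puts the evidence base in perspective:

$$ \frac{11\ \text{clinical trials}}{7000\ \text{studies reviewed}} \approx 0.16\%, \qquad \frac{2\ \text{dietary-intervention trials}}{7000} \approx 0.03\% $$

Which leaves the other 99.8% or so, overwhelmingly observational studies of exactly the kind the pseudo-experts in question cannot tell apart from an actual experiment.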

Upon conclusion, it is evident that reading research summaries written by people who don’t know the difference between an observational study and a clinical trial is dangerous for human intellect and the acquisition of accurate information. Consumers should stop reading processed articles full of information pollution and should instead watch re-runs of Gilligan’s Island. 

What are processed meats?
Processed meats include bacon, sausage, hot dogs, sandwich meat, packaged ham, pepperoni, salami and nearly all meat found in prepared frozen meals. Processed meats are usually manufactured with an ingredient known as sodium nitrate, which is often linked to cancer by pseudo-experts who don’t know how to look up stuff in PubMed. Sodium nitrate is primarily used as a colour fixer by meat companies to make the packaged meats look bright red and fresh. Monosodium glutamate is also added on a regular basis to enhance the savoury flavour. An extra letter “u” added to words can also enhance colour and savoury flavour.

Sodium Nitrate has been strongly linked to the formation of cancer-causing nitrasamines [sic] in the human body, leading to a sharp increase in the risk of cancer for those consuming them. This is especially frightening, since as far as actual science goes, there is no such thing as a nitrasamine. Scientists are very concerned, however, about nitrosamines, which do, in fact, actually exist. Their concern reflects a growing body of evidence that people writing about nutrition on the internet actually have no idea about which they are ostensibly talking:

“There has been widespread discussion about health risks related to the amount of nitrate in our diet. When dietary nitrate enters saliva it is rapidly reduced to nitrite in the mouth by mechanisms discussed above. Saliva containing large amounts of nitrite is acidified in the normal stomach to enhance generation of N-nitrosamines, which are powerful carcinogens in the experimental setting. More recently, it has been suggested that nitric oxide in the stomach could also be carcinogenic. A great number of studies have been performed examining the relationship between nitrate intake and gastric cancer in humans and animals. In general it has been found that there is either no relationship or an inverse relationship, such that a high nitrate intake is associated with a lower rate of cancer. Recently, studies have been performed suggesting that not only is nitrate harmless but in fact it may even be beneficial. Indeed, acidified nitrite may be an important part of gastric host defense against swallowed pathogens. The results presented here further support the interpretation that dietary nitrate is gastroprotective. They also suggest that the oral microflora, instead of being potentially harmful, is living in a true symbiotic relationship with its host. The host provides nitrate, which is an important nutrient for many anaerobic bacteria. In return, the bacteria help the host by generating the substrate (nitrite) necessary for generation of nitric oxide in the stomach” [3].

A 2005 Hawaii University study found that reading articles about processed meats written by ninnies who can’t spell “nitrosamine” increased the risk of a 5-point IQ reduction by 67%, whilst another study found that it increased the risk of twerking by 50%. These are scary numbers for those consuming articles about processed meats on a regular basis.

Monosodium glutamate (MSG) is a second dangerous-sounding chemical found in virtually all processed meat products. MSG is thought by people who are unable to navigate PubMed to be a dangerous excitotoxin linked to neurological disorders such as migraine headaches, Alzheimer’s disease, loss of appetite control, obesity and unrestrained blogging. Nutrition bloggers use MSG to add a deceptively scientifical-sounding level of paranoia to their articles about the addictive savory flavor of dead-tasting processed meat products. This will deflect unwary readers’ attention away from inane and poorly-worded concepts such as “addictive savory flavor of dead-tasting processed meat products.” On the other hand, the Joint FAO/WHO Expert Committee on Food Additives, the Scientific Committee for Food of the European Commission, the Federation of American Societies for Experimental Biology, and the Food and Drug Administration all concluded that, although there may be a subpopulation of people sensitive to its effects, no health risks have been found to be associated with MSG [4]. But what do they know?

Food items to check carefully for aliveness before piling them into your cart:

  • Beef jerky
  • Bacon
  • Sausage
  • Pepperoni
  • Hot dogs
  • Sandwich meat
  • Deli slices
  • Ham

…and many more meat products

If it’s so dangerous to consume such stupidity, why are they allowed to write it?

Unfortunately nowadays, access to operational brain cells is not a prerequisite for access to a keyboard and a WordPress account. That and First Amendment concerns have allowed unsuspecting readers curious about the real health effects of some food components to be misled, confused, and frightened by the insidious repetition of poorly-researched half-truths written by bloggers with a frail grasp on reality and an affinity for really big words that they don’t quite know the meaning of, like nitrso , um, nitarsa, um, nirstirammidngieaygyieg.

Unfortunately, these bloggers seem to hold tremendous influence over the blogosphere, and as a result consumers have little protection from dangerous propaganda intentionally added to the internet, even in places that aren’t Reddit.

To avoid the dangers of idiot bloggers writing about processed meats:

  • Always read primary sources for yourself. If there are no primary sources, leave a pleasantly snarky comment to that effect on the blog site and never go there again.
  • Don’t read any articles about sodium nitrate or MSG from bloggers who don’t know how to spell “nitrosamine.”
  • Avoid eating red meats served by restaurants, schools, hospitals, hotels or other institutions without asking for it to be served thick and juicy, just the way you like it. This will give you the courage and moral fortitude to look up stuff yourself on PubMed, without having to rely on bloggers who don’t know how to spell “nitrosamine.”
  • If you are fixated on fresh something, be fixated on Fresh Prince.
  • Avoid processed blog material as much as possible
  • Spread the word and tell others about the dangers of reading idiot blogs about the dangers of sodium nitrate and MSG

Vitamin C naturally found in lime juice that has been gently squeezed into a tumbler of tequila has been shown to help prevent the formation of permanent facepalms after accidentally ingesting an idiot nutrition blog and can help protect you from the devastating IQ-lowering effects of blobbers who cant spll. The best defense of course is to avoid the interwebz altogether and go dancewalking.


Sources:

  1. http://hollyleehealth.com/2013/04/02/processed-meats-declared-too-dangerous-for-human-consumption/
  2. http://www.wcrf.org/PDFs/Colorectal-cancer-CUP-report-2010.pdf
  3. http://www.jci.org/articles/view/19019

The NaCl Debacle Part 1: Salt makes you fat?

Don’t look now, but I think the Institute of Medicine’s new report on sodium just bitch-slapped the USDA/HHS 2010 Dietary Guidelines.

In case you have a life outside of the nutritional recommendation roller derby, the IOM recently released a report that comes to the conclusion that restricting sodium intake to 1500 mg/day may increase rather than reduce health risks. Which is a little weird, since the 2010 Dietary Guidelines did a great job of insisting that any American with high blood pressure, all blacks, and every middle-aged and older adult—plus anyone who has ever eaten bacon or even thought about eating bacon, i.e. nearly everybody—should limit their salt intake to 1500 mg of sodium a day, or less than ¾ of a teaspoon of salt. The American Heart Association was, of course, aghast. The AHA thinks EVERYBODY should be limited to less than ¾ teaspoon of salt a day, including people who wouldn’t even think about thinking about bacon.
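For anyone curious where that ¾-teaspoon figure comes from, here is the back-of-the-envelope arithmetic, using the standard approximations that table salt is roughly 39% sodium by weight and that a teaspoon of salt weighs about 6 grams (roughly 2,300 mg of sodium):

$$ 1500\ \text{mg Na} \times \frac{58.4\ \text{g NaCl}}{23.0\ \text{g Na}} \approx 3.8\ \text{g salt}, \qquad \frac{3.8\ \text{g}}{\sim 6\ \text{g per teaspoon}} \approx 0.65\ \text{teaspoon} $$

So yes: less than ¾ of a teaspoon of salt, per day, from all sources combined.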

Why are the AHA and USDA/HHS so freaked out about salt?  And how did the IOM reach such a vastly different conclusion than that promoted by the AHA and the Dietary Guidelines?  Fasten your seat belts folks, it’s gonna be a bumpy blog.

First, it is helpful to examine why the folks at AHA and USDA/HHS are so down on salt.  The truth: we have no freakin’ idea. Salt has been around since what, the dawn of civilization maybe? It is an essential nutrient, and it plays an important role in preserving food and preventing microbial growth (especially on bacon). But Americans could still be getting too much of a good thing. Everybody at the AHA seems to think that Americans consume “excessive amounts” of sodium. (Of course, just about anything looks excessive compared to less than ¾ of a teaspoon.) But do we really consume too much sodium?

Back in 2010, Dr. Laurence I-Know-More-About-Sodium-Than-Your-Kidneys-Do Appel (or as his friends call him, “Low-Sodium Larry”), one of the leading advocates for a salt-free universe, acknowledged that “The data is quite murky. We just don’t have great data on sodium trends over time. I wish that we did. But I can’t tell you if there’s been an increase or decrease.”

Well, Low-Sodium Larry, I can, and I am about to make your wish come true.

According to recent research done by that wild bunch of scientific renegades at Harvard, in the past 60 years sodium intake levels have . . . drumroll, please . . . not done much of anything.

Hey, that doesn’t sound right! Everyone knows that it is virtually impossible to get an accurate measure of sodium intake from dietary questionnaires; people are probably just “under-reporting” their salt intake like they “under-report” everything else. Low-Sodium Larry has previously insisted that one of the reasons the data is so murky is that few epidemiological studies measure sodium intake accurately and that, “really, you should do 24-hour urinary sodium excretions to do it right.”

The guys at Harvard looked at studies that did it right.  This systematic analysis of 38 studies from the 1950s to the present found that 24-hour urinary sodium excretion (the “gold” standard—omg, I could not resist that—of dietary sodium intake estimation) has neither increased nor decreased, but has remained essentially stable over time. Despite the fact that Americans are apparently hoovering up salt like Kim Kardashian hoovers up French fries—and with much the same results, i.e. puffing up like a Macy’s Thanksgiving Day balloon—for whatever reason we simply aren’t excreting more of it in our urine.

According to that same study however, despite the lack of increase in sodium excretion (which is supposed to accurately reflect intake—but that can’t be right), high blood pressure rates in the population have been increasing. Duh. Everyone knows that eating lots of salt makes your blood pressure go up. But have the rates of high blood pressure in America really been going up?

[Chart: Age-Adjusted Prevalence of Hypertension (2009 NIH Chart Book)]

Well, no.  Not really. The Harvard dudes cite a report that goes back to 1988-1994 data, and yes, rates of high blood pressure have been creeping slowly back up since then. This is because from 1976-1980 to 1988-1994, rates of high blood pressure plummeted for most segments of the American population.

We don’t know why rates of high blood pressure fell during the 70s and early 80s. It may have been that the Dietary Guidelines told people to eat more potassium-filled veggies and people actually tried to follow the Dietary Guidelines, which would have had a positive effect on high blood pressure. On the other hand, it could have been largely due to the sedating influence of the soft rock music of that era blanketing the airwaves with the mellow tones of England Dan and John Ford Coley, Christopher Cross, Ambrosia, and the like (youtube it, you young whippersnappers out there). We also don’t know why rates are going back up. Rising rates of obesity may be part of the problem, but it is also entirely possible that piping the Monsters of Lite Rock through every PA system in the country might save our health care system a lot of time and trouble.

This is what we (think we) know:

  • High-sodium diets might possibly maybe sometimes be a contributor to high blood pressure.
  • Rates of high blood pressure are going (back) up.
  • Obesity rates are definitely going up.

Ergo pro facto summa cum laude, it is clear—using the logic that seems to undergird the vast majority of our public health nutrition recommendations—that salt makes you fat.  The USDA/HHS has been faced with rapidly rising rates of obesity which, until now, they have only been able to pin on the laziness and gluttony of Americans.  But if salt makes us fat, that might explain why the USDA/HHS doesn’t want us to eat it.

After all, the biomechanics of this is pretty straightforward. If you eat too much sodium (which we must be), but you don’t pee it out (which we aren’t), you must be retaining it and this is what makes your blood pressure and your weight both go way up. They didn’t really cover the physics of this in my biochemistry classes so you’ll have to ask Dr. Appel how this works because he knows more about sodium than your kidneys do. But I think it must be true. After all, this is the mechanism that explains the weight loss behind carbohydrate-reduced diets, right? I myself reduced my carb intake and lost 60 pounds of water weight!

And besides, taking the salt out of our food will give food manufacturers the opportunity to make food more expensive and tasteless while adding synthetic ingredients whose long-term effects are unknown—just what the American consumer wants!

For a while there, we thought the whole idea was to reduce sodium in order to reduce blood pressure in order to reduce diseases of the circulatory system, like heart failure, stroke, and coronary heart disease. That didn’t seem to work out so well, because the whole time that sodium intake was staying stable (if we want to believe the urinary sodium excretion data) and high blood pressure rates were going down (although they are starting to go back up), rates of those diseases have gone up:

[Chart: Age-Adjusted Prevalence of Heart Failure (2009 NIH Chart Book)]

[Chart: Age-Adjusted Prevalence of Stroke (2009 NIH Chart Book)]

[Chart: Age-Adjusted Prevalence of Coronary Heart Disease (2007 NIH Chart Book)]

So if reducing blood pressure to reduce cardiovascular disease isn’t the answer, then we must need to reduce blood pressure to reduce obesity! By jove, I think we’ve got it!

The USDA/HHS must have known the “salt makes you fat” notion would be a tough sell, I mean, what with the lack of any shred of supporting science and all that. (But then, the “salt causes high blood pressure which causes cardiovascular disease” argument hasn’t exactly been overburdened by evidence either, and that never seemed to stop anyone.) So the 2010 Dietary Guidelines brought together the American Heart Association’s Superheroes of Sodium Slashing, Low-Sodium Larry and his bodacious salt-subduing sidekick, Linda Van Horn, both of whom had been preaching the gospel of sodium-reduction as a preventive health measure with little conclusive evidence to support their recommendations.  The USDA/HHS knew that with Linda and Larry on the team, it didn’t matter how lame the science, how limited the data, or how ludicrous the recommendation, these two could be counted on to review any and all available evidence and reliably come up with the exact same concrete and well-proven assumptions they’d been coming up with for years.

The Sodium-Slashing Superheroes–Drs. Lawrence Appel and Linda Van Horn– ready to make the world safe for bland, unappetizing food everywhere! (Drawings courtesy of Butcher Billy)

So here’s the cliffhanger:  Will Linda and Larry be able to torture the science on salt into confessing its true role in the obesity crisis?

Tune in tomorrow, when you’ll hear Linda and Larry say: “Science? We don’t need no stinkin’ science.”

N of 1 Nutrition Part 1: Same Old Tools

I’ve been thinking a lot about tools lately.  This actually has nothing to do with the ongoing fascinating-in-a-train-wreck-sort-of-way paleo soap opera, although I have been reading Audre Lorde’s essay “The Master’s Tools Will Never Dismantle the Master’s House” and loving it.  I have all kinds of things to say about feminism and nutrition (yeah, I’m going to go there), but there are all kinds of tools and we’re going to have to talk about all of them eventually.  Today, I’ll start with the scientific kind.

At Ancestral Health Symposium 2012 there was, among other things, a great deal of discussion about what diet works “best:” primal, paleo, neopaleo (my friend Andrea invented that one), safe starch, low-carb, no-carb, etc. The reality is that, in terms of being able to make sweeping generalizations about which dietary pattern will work best for everyone, we as nutrition scientists and clinicians actually sorta suck. Other than describing very general recommendations for essential nutrition—amino acids, fatty acids, vitamins and minerals, and even these have a wide variability in individual requirements—we simply do not have the skills, the tools, or the knowledge to make sweeping dietary recommendations that do not come with the very real possibility of unintended negative consequences for an individual who might follow them.

Choline is a great example of what happens when you mix individual variation with universal recommendations:

Although our body makes some choline, we still require a dietary supply of this important nutrient.* Eggs are a primary source of dietary choline. The past 30 years of Dietary Guidelines have frightened us into reducing egg consumption and/or using egg substitutes that replace the yolk (where the choline is) with soybean oil in order to prevent heart disease, even though dietary cholesterol has little effect on serum cholesterol [1] and our average cholesterol intake is below recommended levels and has been for 40 years [2]. Nevertheless, egg yolks, a recent headline screamed, are as bad for you as cigarettes.

In response to these scare tactics, Americans have dramatically reduced their egg consumption [3]. As a result, average choline consumption does not meet current recommended standards; less than 4% of women even reach adequate intake levels [4, 5].

This is bad enough, but these adequate intake levels were based on a small study done on adult white males; standards for everyone else, including children, were extrapolated from those results [6]. Post-menopausal females, pregnant women, children, and people with certain genetic polymorphisms (which may exist in more than 50% of the population) may actually have increased needs for choline above and beyond the adequate intake level [7].

It’s hard to say exactly how large the gap between intake and actual needs is for these subpopulations, but I can hazard a guess that as long as whole eggs are discouraged as part of our diets, it will only continue to widen. The fact that dietary choline is needed for the development of brain cells seems rather ironic in the face of such goofiness.
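Some rough arithmetic shows how much heavy lifting eggs could be doing here (ballpark figures, assuming the current adequate intake of about 425 mg/day for adult women and 550 mg/day for adult men, and roughly 150 mg of choline in one large egg, nearly all of it in the yolk):

$$ 2\ \text{eggs} \times 150\ \text{mg} \approx 300\ \text{mg} \approx 70\%\ \text{of a 425 mg/day AI}, \qquad \frac{550\ \text{mg}}{150\ \text{mg per egg}} \approx 3.7\ \text{eggs} $$

Take whole eggs off the menu, as we have been urged to do for the past 30-odd years, and that shortfall has to be made up with foods like liver (also on the naughty list) or with much larger servings of meat, fish, or soy.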

[Image: Brain food? Or death by cholesterol?]

When dietary guidance shifted from being about provision of basic nutrition to prevention of chronic disease, we found ourselves using tools that were designed to examine diseases of nutrition deficiency (i.e. diseases with one fairly straightforward cause), to now make recommendations about chronic diseases with long, complex, multi-factorial origins [8]. Everyone deprived of Vitamin C will eventually develop scurvy, but not everyone who avoids cholesterol will also avoid heart disease.  Chronic diseases that result from a complex interplay between the individual and environment are difficult—if not impossible—to examine using our current tools and methods, and assessing an individual’s risk of heart disease and tailoring dietary guidance accordingly is much different from making population-wide recommendations to avoid a food–in this case, eggs–that is a primary source of an essential nutrient.

Our current approach takes the complex reality that is one individual human living his/her life and

  • dials into a discrete mechanism within this complex unit using cell cultures and animal models that can’t even begin to describe the physiological, psychological, and cultural context of a whole complicated individual (nutritional biochemistry), or
  • lumps a complicated individual into a pile with a lot of other complicated individuals and uses a fancy schmantzy computer program or a highly-controlled artificial experimental protocol to paint a simplified, homogenized broad brush stroke of a picture that bears little resemblance to the reality of any of the specific individuals it is supposed to describe (nutrition epidemiology), and then
  • turns these overly-simplified, homogenized descriptions into one-size-fits-all nutrition policy that has never actually been shown to work.

[Figure: From reality to policy – four perspectives on nutrition]

Everyone is subject to the same biochemical rules—and it’s great to learn more about how these rules work on a mechanistic level—but how those rules play out in any given individual is difficult to predict. Is there a way to use the focus of an experimental intervention without losing the environmental influences present in observational studies, and still create something that will eventually translate into meaningful policy?

Maybe. In the next few posts, I take on some of the shortcomings in our current methodology and explore an approach that may help move nutrition science, and thus nutrition policy, into the 21st century.

*Choline acts as a methyl donor in pathways involving gene expression and other metabolic functions; as an important contributor to structural integrity and signaling function in cell membranes, especially those involved in nervous tissue and brain development; as a necessary constituent of lipid metabolism and transport, including VLDL required for the export of fat from the liver; and as the precursor to the neurotransmitter, acetylcholine.

References:

1. Willett, Walter. Nutritional Epidemiology, 2nd edition. 1998.

2. U.S. Department of Agriculture. Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans 2010. Accessed July 15, 2010. http://www.cnpp.usda.gov/DGAs2010-DGACReport.htm

3. U.S. Dept. of Agriculture, Office of Communications. 2001-2002 Agriculture Fact Book. Washington, DC:2003.

4. Jensen H. Choline in the diets of the US population: NHANES, 2003-2004. The FASEB journal: official publication of the Federation of American Societies for Experimental Biology. 2007;21(Meeting Abstract Supplement):lb219.

5. Moshfegh A. Usual Nutrient Intakes of Americans. USDA Whitten Building; 2009.

6. Dietary Reference Intakes for Thiamin, Riboflavin, Niacin, Vitamin B6, Folate, Vitamin B12, Pantothenic Acid, Biotin, and Choline [Internet]. [cited 2012 May 21]. Available from: http://books.nap.edu/openbook.php?record_id=6015

7. Zeisel SH, da Costa K-A. Choline: An Essential Nutrient for Public Health. Nutr Rev. 2009 Nov;67(11):615–23.

8. Harper AE. Killer French Fries. Sciences 1988, 28 (Jan/Feb): 21-27.


The Mobius Strip of Policy Change


I love working with individuals, but it takes policy-level change to really make an impact on public health. Policy, however, is a double-edged sword. Decades-long cascades of unintended consequences can arise from well-intentioned policy. The Dietary Guidelines started out in 1980 as an unmandated humble little 40-page booklet offering nutrition guidance to the public, while freely admitting that “we don’t know enough about nutrition to identify an ‘ideal’ diet for each individual” and that “in those chronic conditions where diet may be important . . . the roles of specific nutrients have not been defined.”

Since then, I’m still not sure how, the Dietary Guidelines have become the center of all information and decision-making surrounding food and nutrition in America—in policy, healthcare, industry, media, and science (where researchers should know better than to use a policy document as the basis for scientific research). And—for better or worse—Americans have actually shifted their eating habits to fall in line with Guidelines recommendations (see: Americans don’t follow the Guidelines—or do they?)


The Guidelines were created to prevent chronic disease.  They have changed very little in 30 years, while rates of obesity, diabetes, and other chronic diseases have rapidly increased (see: Public Health Nutrition’s Epic Fail). Currently, there is no “policy lever” for changing the way the Guidelines are created or administered. The Guidelines have no system of checks and balances, no outcome evaluation process, and no way to counter the influence of entrenched special interests (including both the food and science industries).

Right now, it seems that no amount of public outcry, accumulation of scientific evidence otherwise, or increase in diseases the Guidelines were meant to prevent can shift them from their current staked position that a high-carbohydrate, high-fiber, low-fat, low-cholesterol, low-saturated fat, low-sodium diet is right for all Americans. Under the USDA/HHS “calories in, calories out” paradigm, it’s Americans that need to change (“eat less and move more”), not nutrition policy. Policy changes are urged only to “make the healthy choice the easy choice”  for fat stupid Americans (especially low-income ones) who apparently otherwise don’t care and can’t think.

I would expect such policy reform to have, as Jon Stewart put it, “the draconian government overreach we all love with the probable lack of results we expect.”

So what kind of policy reform should we be working towards? One of the Big Questions I ponder is whether we need to replace the current USDA/HHS Dietary Guidelines with “better” ones, or find a different way to create nutrition policy, or just ditch all government-sanctioned nutritional recommendations altogether. (Other Big Questions: What’s for dinner? and How can I further embarrass my children?)

I don’t fundamentally oppose or support government-funded nutrition programs. If they were administered differently, I might like them a lot more. If we are going to use government funds to feed people, we will need some way of guiding that process. Right now, our federally-funded nutrition programs have a tendency to serve as outlets for cheap industrialized food, and I’m afraid that our nutrition guidance has not only allowed, but encouraged that role. On the other hand, scrapping that guidance altogether may leave government programs that are struggling for funds vulnerable to choosing food from the lowest bidder, which would only serve to reinforce the current situation.

I also have problems with replacing one-size-fits-all Guidelines with different one-size-fits-all Guidelines because that process denies the very real variability in nutritional needs and preferences of individuals and diverse sub-populations. Worse yet, it teaches people that answers about nutrition come from packages and experts rather than the body’s response to food.

As a transition, or middle ground, I currently favor the idea of locally-determined nutritional policies and programs. Sounds good, right? Nutrition programs could be tailored to meet the needs of the community they serve.

But this is where the confluence of things needed to make this type of policy shift happen turns into a Dilbert cartoon. Everything that needs to happen requires something else to happen first until it all loops back on itself like a Mobius strip.


Let’s take school lunches.  

Ideally, the type of school lunches served should be determined by the members of the community eating them, i.e. the kids, parents, teachers, etc.  This allows for appropriate community-level health, ethnic, cultural, regional, seasonal, and economic adjustments and prevents fiascos like the Los Angeles lunchroom garbage cans filled with “healthy” lunches (like “brown rice cutlets”).

Ideally, a trained professional at the local level, for instance an RD, would be able to guide this process, balancing the nutritional needs of that specific community with other social and cultural factors, creating an affordable menu, and modifying the program based on outcomes.  But this would mean that the RD would have to have training across the spectrum of nutrition science, rather than just following USDA/HHS policy statements which are based on research done on white (frequently male) adults circa 1970-1980 and which may not be applicable to other populations.

This in turn would require the nutrition curriculum for health professionals to not be skewed by entrenched interests in academics, politics, and industry (and would probably require almost a complete re-thinking of 30 years of nutrition epidemiology).

This would require the USDA/HHS and other institutions to support–through funding, publication, and use—nutrition research that may possibly undermine or even contradict 30 years of previous nutritional guidance. This research would not only provide a knowledge base for health professionals, but would provide an unbiased source of information for consumers which would help to create informed stakeholders in the nutrition-food system.

At the same time, industry, producers, and growers would have to work with the community to make foods available that meet the demands of the local program at a reasonable cost.  And right now—due to agricultural practices and USDA policies—foods that are widely and cheaply available to federal nutrition programs are the ones that the USDA/HHS Guidelines have determined are “healthy” even though this definition of “healthy” seems to be based, at least in part, on whether or not those foods are widely and cheaply available for federal nutrition programs.

See what I mean?  I have a hard time figuring out where we need to insert the monkey-wrench that will stop the endless cogs from turning out the same policies, practices, and programs that have been radically unsuccessful for the past 30 years.

Which won’t, of course, stop me from trying.

As I’ve been working with Healthy Nation Coalition and tossing ideas around with people who are also working on this issue, I’ve found a few concepts that I believe are fundamental to fixing our food-health system. They originated with people much smarter than me, but I am hoping that in my academic work and in our non-profit work at Healthy Nation Coalition, I will have the opportunity to be a part of developing them further:

1) N of 1 Nutrition – a movement towards more individualized nutrition, although the “1” can also be a family, community, or other subpopulation

2) Nutritional Literacy – a movement to foster an understanding of the cultural forces that shape our nutritional beliefs and our relationships to food and food communities

3) Open Nutrition – a movement to raise awareness regarding the laws, policies, institutions, and other social, economic and cultural forces that may impact access to nutrition information and development of sustainable systems that produce foods that support health

It takes about 30 years for any given scientific paradigm to shift. It is time. But how will we do it differently? I think these concepts are the “next steps” that will steer the next 30 years of nutrition in a direction that may help us avoid another cascade of unintended consequences down the road. More on each soon.


Public Health Nutrition’s Epic Fail

Mostly I just wanted to say “epic fail” because it embarrasses my kids, but then, they are always harshing on my mellow.

The stated goals of the US Dietary Guidelines are to promote health, reduce risk of chronic disease, and reduce the prevalence of overweight and obesity.

How’s that working for us?

First the good news. Cholesterol levels and hypertension have trended downwards since the creation of our first Dietary Guidelines.

It is possible that the changes in these risk factors reflect a trend that was already well underway when the Dietary Guidelines were written . . .

. . . although some folks like to attribute the changes to improvements in our eating habits (Hu et al. 2000; Fung et al. 2008). And btw, yes, our eating habits actually have improved with regards to the dietary recommendations set forth in our Guidelines. Don’t believe me? You’re not alone. Here’s the data.

Soooooo . . . if our diets really have improved, and if those improvements have led to related improvements in some disease risk factors (because cholesterol levels and even blood pressure levels are not diseases in and of themselves, but markers—or risk factors—for other disease outcomes, like heart disease and stroke), let’s see how the Guidelines fared with regards to actual disease.

This trend is a little ironic in that cancer was, at first, one of the primary targets for nutrition reform. It was Senator George McGovern’s ire at the Department of Health, Education, and Welfare’s (now the Department of Health and Human Services) failure to aggressively pursue nutritional links to cancer that was at least part of the motivation behind giving the “lead” in nutrition to the USDA in 1977 (Eskridge 1978; Blackburn, Interview with Mark Hegsted). In fact, the relationship between dietary fat and cancer had so little solid evidence behind it that the 2000 Dietary Guidelines Advisory Committee had this to say: “Because relationships between fat intake and cancer are inconclusive and currently under investigation, they are deleted.”

I guess we can then feel assured that the reason the restrictions against fat and saturated fat are still in the Dietary Guidelines is that their relationship to heart disease isn’t inconclusive or “currently under investigation”? If that’s the case, somebody better tell these folks. So what did happen to heart disease as we lowered our red meat and egg intake while we increased our intake of “heart-healthy” grains and vegetable oils?

Well, you’d think with all of that reduction in fat and saturated fat, plus the decrease in smoking, we’d be doing better here, but at least—well, at least for white people—the overall trend is down; for black folks, the overall trend is up.

Oops. Not so good.

Hmmm.

Oh. Well. This can’t be good. And of course, my favoritest graph of all:

I’m not sure, but it sorta kinda looks like the Dietary Guidelines haven’t really prevented much, if any, disease. Maybe we could get those guys at Harvard to take a closer look? I mean, looking at these trends—and using the language allowed with associations—you might say that the development and implementation of Dietary Guidelines for Americans is associated with a population-wide increase in the development of cancer, heart failure, stroke, diabetes, and overweight/obesity. Anyway, you might say that. I would never say that. I’m an RD.
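For anyone curious just how low the bar for an “association” is, here is a minimal Python sketch. The numbers in it are invented purely for illustration (they are not the CDC, NHANES, or SEER figures cited in the references below): the point is simply that any two quantities that both drifted upward after 1980 will correlate strongly, whether or not one has anything to do with the other.

```python
# Toy illustration only: these numbers are made up for the example and are NOT
# the CDC/NHANES/SEER data cited in the references below.
import statistics  # statistics.correlation requires Python 3.10+

# Two hypothetical post-1980 secular trends, both rising:
pages_of_dietary_guidance = [20, 28, 40, 45, 70, 95, 112]
obesity_prevalence_pct = [15.0, 18.5, 23.0, 28.0, 31.0, 34.0, 36.0]

# Pearson correlation between the two upward trends
r = statistics.correlation(pages_of_dietary_guidance, obesity_prevalence_pct)
print(f"Pearson r = {r:.2f}")  # close to 1.0: a strong "association," causation not included
```

That, in a nutshell, is all the statistical muscle an “associated with” claim needs.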

Are there other explanations for these trends? Maybe. Maybe not.

It’s always a good idea to blame food manufacturers, but we have to remember that they pretty much supply what we demand. And in the past 30 years, what we’ve demanded is more “heart-healthy” grains, less saturated fat, and more Poofas (a.k.a. PUFAs, polyunsaturated fatty acids). Yes, food manufacturers do help shape demand through advertising, but the Dietary Guidelines don’t have anything to do with that.

Oh yeah. That‘s so whack, it’s dope.

References:

Blackburn H. Interview with Mark Hegsted. “Washington—Dietary Guidelines.” Accessed January 24, 2011. http://www.foodpolitics.com/wp-content/uploads/Hegsted.pdf

Centers for Disease Control and Prevention (CDC). National Center for Health Statistics, Division of Health Interview Statistics, data from the National Health Interview Survey. http://www.cdc.gov/diabetes/statistics/prev/national/figpersons.htm. Accessed August 15, 2010.

Centers for Disease Control and Prevention (CDC). National Center for Health Statistics, Division of National Health and Nutrition Examination Surveys. Prevalence of Overweight, Obesity, and Extreme Obesity Among Adults: United States, Trends 1976–1980 Through 2007–2008. http://www.cdc.gov/NCHS/data/hestat/obesity_adult_07_08/obesity_adult_07_08.pdf. Accessed February 1, 2011.

Eskridge NK. McGovern chides NIH: reordering priorities: emphasis on nutrition. BioScience. 1978 Aug;28(8):489-491.

Fast Stats: An interactive tool for access to SEER cancer statistics. Surveillance Research Program, National Cancer Institute. http://seer.cancer.gov/faststats. Accessed November 1, 2011.

Fung TT, Chiuve SE, McCullough ML, Rexrode KM, Logroscino G, Hu FB. Adherence to a DASH-style diet and risk of coronary heart disease and stroke in women. Arch Intern Med. 2008 Apr 14;168(7):713-20. Erratum in: Arch Intern Med. 2008 Jun 23;168(12):1276.

Hu FB, Stampfer MJ, Manson JE, Grodstein F, Colditz GA, Speizer FE, Willett WC. Trends in the incidence of coronary heart disease and changes in diet and lifestyle in women. N Engl J Med. 2000 Aug 24;343(8):530-7.

Morbidity and Mortality: 2009 Chart Book on Cardiovascular, Lung, and Blood Diseases. Bethesda, Md: National Institutes of Health: National Heart, Lung, and Blood Institute; 2009.