As the Calories Churn (Episode 2): Honey, It’s Not the Sugar

In the previous episode of As the Calories Churn, we looked at why it doesn’t really make sense to compare the carbohydrate intake of Americans in 1909 to the carbohydrate intake of Americans in 1997.  [The folks who read my blog, who always seem to be a lot smarter than me, have pointed out that, besides not being able to determine differing levels of waste and major environmental impacts such as a pre- or early-industrial labor force and transportation, there would also be significant differences in: distribution and availability; what was acquired from hunted/home-grown foods; what came through the markets and ended up as animal rather than human feed; what other ingredients these carbohydrates would be packaged and processed with; and many other issues.  So in other words, we are not comparing apples and oranges; we are comparing apples and Apple Jacks (TM).]

America in 1909 was very different from America in 1997, but America in 1970 was not so much, certainly with regard to some of the issues above that readers have raised.  By 1970, we had begun to settle into post-industrial America, with TVs in most homes and cars in most driveways.  We had a wide variety of highly-processed foods that were distributed through a massive transportation infrastructure throughout the country.

Beginning in the mid-1960s, availability of calories in the food supply, specifically from carbohydrates and fats, had begun to creep up.  So did obesity.  It makes sense that this would be cause for concern among public health professionals and policymakers, who saw a looming health crisis ahead if measures weren’t taken–although others contended that our food supply was safer and more nutritious than it had ever been and that public health efforts should be focused on reducing smoking and environmental pollutants.

What emerged from the political and scientific tug-of-war that ensued (a story for another blog post) were the 1977 Dietary Goals for Americans.  These goals told us to eat more grains, cereals and vegetable oils and less fat, especially saturated fat.

Then, around 1977–1980, in other words, around the time the USDA created its recommendations to increase our intake of grains and cereals (both carbohydrate foods) and to decrease our intake of fatty foods, we saw the slope of availability of carbohydrate calories increase dramatically, while the slope of fat calories flattened–at least until the end of the 1990s (another story for another blog post).

[From food availability data, not adjusted for losses.]

The question is:  How did the changes in our food supply relate to the national dietary recommendations we were given in 1977?  Let’s take a closer look at the data that we have to work with on this question.

Dear astute and intelligent readers: From this point on, I am primarily using loss-adjusted food availability data rather than food availability data. Why? Because it is there, and it is a better estimate of actual consumption than unadjusted food availability data. It only goes back to around 1970, so you can’t use it for century-spanning comparisons, but if you are trying to do that, you’ve probably got another agenda besides improving estimation anyway. [If the following information makes you want to go back and make fun of my use of unadjusted food availability data in the previous post, go right ahead. In case you didn’t catch it, I think it is problematic to the point of absurdity to compare food availability data from the early 1900s to our current food system—too many changes and too many unknowns (see above).  On the other hand, while there are some differences, I think there are enough similarities in lifestyle and environment (apart from food) between 1970 and 2010 to make a better case for changes in diet and health being related to things apart from those influences.]

Here are the differences in types of food availability data: 

Food availability data: Food availability data measure the use of basic commodities, such as wheat, beef, and shell eggs, for food products at the farm level or an early stage of processing. They do not measure food use of highly processed foods in their finished form. Highly processed foods–such as bakery products, frozen dinners, and soups–are not measured directly, but the data include their less processed ingredients, such as sugar, flour, fresh vegetables, and fresh meat.

Loss-Adjusted Food Availability: Because food availability data do not account for all spoilage and waste that accumulates in the marketing system and is discarded in the home, the data typically overstate actual consumption. Food availability is adjusted for food loss, including spoilage, inedible components (such as bones in meat and pits in fruit), plate waste, and use as pet food.

The USDA likes to use unadjusted food availability data and call it “consumption” because, well: They CAN and who is going to stop them?

The USDA—and some bloggers too, I think—prefer unadjusted food availability data.  I guess they have decided that if American food manufacturers make it, then Americans MUST be eating it, loss-adjustments be damned. Our gluttony must somehow overcome our laziness, at least temporarily, as we dig the rejects and discards out of the landfills and pet dishes—how else could we get so darn fat?

I do understand the reluctance to use dietary intake data collected by NHANES, as all dietary intake data can be unreliable and problematic (and not just the kind collected from fat people).  But I guess maybe if you’ve decided that Americans are being “highly inaccurate” about what they eat, then you figure it is okay to be “highly inaccurate” right back at Americans about what you’ve decided to tell them about what they eat.  Because using food availability data and calling it “consumption” is, to put it mildly, highly inaccurate, by a current difference of over 1000 calories.

On the other hand, it does sound waaaaaay more dramatic to say that Americans consumed 152 POUNDS (if only I could capitalize numbers!) per person of added sweeteners in 2000 (as it does here) than it does to say that we consumed 88 pounds per person that year (which is the loss-adjusted amount). Especially if you are intent on blaming the obesity crisis on sugar.
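For the number-crunchers: here’s roughly how those per-year poundages translate into calories per person per day. A back-of-the-envelope sketch in Python, so you can check my arithmetic; the ~3.9 kcal per gram figure for caloric sweeteners is my ballpark assumption, not an official USDA factor.

```python
LB_TO_G = 453.6    # grams per pound
KCAL_PER_G = 3.9   # rough average energy density of caloric sweeteners (my assumption)

def sweetener_kcal_per_day(lbs_per_year):
    """Convert annual per-capita pounds of sweetener into calories per day."""
    return lbs_per_year * LB_TO_G * KCAL_PER_G / 365

print(round(sweetener_kcal_per_day(152)))  # unadjusted availability: ~737 kcal/day
print(round(sweetener_kcal_per_day(88)))   # loss-adjusted: ~427 kcal/day
```

Same year, same sugar, and a difference of more than 300 calories a day, depending on which dataset you wave around.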

Blaming sugar is kinda hard to do when you look at the chart below.

Loss-adjusted food availability:

Calories per day              1970    2010    Change
Total                         2076    2534      +458
Added fats and oils            338     562      +224
Flour and cereal products      429     596      +167
Poultry                         75     158       +83
Added sugars and sweeteners    333     367       +34
Fruit                           65      82       +17
Fish                            12      14        +2
Butter                          29      26        -3
Veggies                        131     126        -5
Eggs                            43      34        -9
Dairy                          245     232       -13
Red meat*                      349     267       -82
Plain whole milk               112      24       -88

*Red meat: beef, veal, pork, lamb
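If you’d like to poke at these numbers yourself, here’s a quick sketch in Python with the table transcribed as a dictionary. Nothing new is assumed; it just ranks the changes and computes the grains-to-sweeteners ratio I’ll come back to later.

```python
# Loss-adjusted availability, kcal per person per day (1970, 2010), from the table above
kcal = {
    "Added fats and oils":         (338, 562),
    "Flour and cereal products":   (429, 596),
    "Poultry":                     (75, 158),
    "Added sugars and sweeteners": (333, 367),
    "Fruit":                       (65, 82),
    "Fish":                        (12, 14),
    "Butter":                      (29, 26),
    "Veggies":                     (131, 126),
    "Eggs":                        (43, 34),
    "Dairy":                       (245, 232),
    "Red meat":                    (349, 267),
    "Plain whole milk":            (112, 24),
}

change = {food: after - before for food, (before, after) in kcal.items()}

# Biggest gainers first
for food, delta in sorted(change.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{food:<28}{delta:+d}")

# Grains vs. sweeteners: 167 / 34, or nearly 5x the added calories
print(change["Flour and cereal products"] / change["Added sugars and sweeteners"])
```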

Anybody who thinks we did not change our diet dramatically between 1970 and the present either can’t read a dataset or is living in a special room with very soft bouncy walls. Why we changed our diet is still a matter of debate. Now, it is my working theory that the changes that you see above were precipitated, at least in part, by the advice given in the 1977 Dietary Goals for Americans, which was later institutionalized, despite all kinds of science and arguments to the contrary, as the first Dietary Guidelines for Americans in 1980.

Let’s see if my theory makes sense in light of the loss-adjusted food availability data above (which I will loosely refer to as “consumption”).  The 1977 [2nd Edition] Dietary Goals for Americans say this:

#1 – Did we increase our consumption of grains? Yes. Whole? Maybe not so much, but our consumption of fiber went from 19 g per day in 1970 to 25 g per day in 2006, which is not much less than the 29 grams of fiber per day that we were consuming back in 1909 (this is from food availability data, not adjusted for loss, because it’s the only data that goes back to 1909).

The fruits and veggies question is a little more complicated. Availability data (adjusted for losses) suggest that veggie consumption went up about 12 pounds per person per year (sounds good, but that’s a little more than a whopping half an ounce a day), but that calories from veggies went down. Howzat? Apparently Americans were choosing less caloric veggies, and since reducing calories was part of the basic idea for insisting that we eat more of them, hooray on us. Our fruit intake went up by about an ounce a day; calories from fruit reflect that. So, while we didn’t increase our vegetable and fruit intake much, we did increase it. And just FYI, that minuscule improvement in veggie consumption didn’t come from potatoes. Combining fresh and frozen potato availability (adjusted for losses), our potato consumption declined ever so slightly.

#2 – Did we decrease our consumption of refined sweeteners? No. But we did not increase our consumption as much as some folks would like you to think. Teaspoons of added (caloric) sweeteners per person in our food supply (adjusted for waste) went from 21 in 1970 to 23 in 2010.  It is very possible that some people were consuming more sweeteners than other people, since those numbers are population averages, but the math doesn’t work out so well if we are trying to blame added sweeteners for 2/3 of the population gaining weight.  It doesn’t matter how much you squint at the data to make it go all fuzzy; the numbers pretty much say that the amount of sweeteners in our food supply has not dramatically increased. (See the sketch below for what that two-teaspoon difference amounts to.)
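Here’s that two-teaspoon increase converted to calories, a sketch assuming ~4.2 g of sugar per teaspoon and ~3.87 kcal per gram of sucrose (standard reference approximations, not the USDA’s exact loss-adjustment factors):

```python
G_PER_TSP = 4.2     # grams of sugar in a teaspoon (approximate)
KCAL_PER_G = 3.87   # kcal per gram of sucrose (approximate)

def tsp_to_kcal(tsp_per_day):
    """Convert teaspoons of sugar per day into calories per day."""
    return tsp_per_day * G_PER_TSP * KCAL_PER_G

extra = tsp_to_kcal(23) - tsp_to_kcal(21)
print(round(extra))  # ~33 kcal/day
```

Thirty-odd calories a day, which squares nicely with the +34 for added sugars and sweeteners in the table above, and is a pretty flimsy peg to hang a nationwide obesity epidemic on.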

#3 – Did we decrease our consumption of total fat? Maybe, maybe not—depends on who you want to believe. According to dietary intake data (from our national food monitoring data, NHANES), in aggregate, we increased calories overall, specifically from carbohydrate food, and decreased calories from fat and protein. That’s not what our food supply data indicate above, but there you go.

Change in amount and type of calories consumed from 1971 to 2008, according to dietary intake data

There is general agreement, however, from both food availability data and from intake data, that we decreased our consumption of the saturated fats that naturally occur with red meat, eggs, butter, and full-fat milk (see below), and we increased our consumption of “added fats and oils,” a category that consists almost exclusively of vegetable oils, which are predominantly polyunsaturated and which were added during processing–hence the category title–to foods such as those inexpensive staples, grains and cereals.

#4 – Did we decrease our consumption of animal fat, and choose “meat, poultry, and fish which will reduce saturated fat intake”? Why yes, yes we did. Calories from red meat—the bearer of the dreaded saturated fat and all the curses that accompany it—declined in our food system, while poultry calories went up.

(So, I have just one itty-bitty request: Can we stop blaming the rise in obesity rates on burgers? Chicken nuggets, yes. KFC, yes. The buns the burgers come on, maybe. The fries, quite possibly. But not the burgers, because burgers are “red meat” and there was less red meat—specifically less beef—in our food supply to eat.)

Michael Pollan–ever the investigative journalist–insists that after 1977, “Meat consumption actually climbed” and that “We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.”   In the face of such a concrete and well-proven assumption, why bother even looking at food supply data, which indicate that our protein from meat, poultry, fish, and eggs “climbed” by just half an ounce?

In fact, there’s a fairly convenient balance between the calories from red meat that left the supply chain and the calories of chicken that replaced them. It seems we tried to get our animal protein from the sources that the Dietary Goals said were “healthier” for us.

#5 – Did we reduce our consumption of full-fat milk? Yes. And for those folks who contend this means we just started eating more cheese, well, it seems that’s pretty much what we did. However, overall decreases in milk consumption meant that overall calories from dairy fat went down.

#6 – Did we reduce our consumption of foods high in cholesterol? Yes, we did that too. Egg consumption had been declining since the relative affluence of post-war America made meat more affordable and as cholesterol fears began percolating through the scientific and medical community, but it continued to decline after the 1977 Goals.

#7 – Salt? No, we really haven’t changed our salt consumption much and perhaps that’s a good thing. But the connections between salt, calorie intake, and obesity are speculative at best and I’m not going to get into them here (although I do kinda get into them over here).

[Food supply and Dietary Goals]

What I see when I look at the data is a good faith effort on the part of the American people to try to consume more of the foods they were told were “healthy,” such as grains and cereals, lean meat, and vegetable oils. We also tried to avoid the foods that we were told contained saturated fat—red meat, eggs, butter, full-fat milk—as these foods had been designated as particularly “unhealthy.” No, we didn’t reduce our sweetener consumption, but grains and cereals have added nearly 5 times more calories than sweeteners have to our food supply/intake.

Although the America of 1970 is more like the America of today than the America of 1909, some things have changed. Probably the most dramatic change between the America of the 1970s and the America of today is our food-health system. Women in the workplace, more suburban sprawl, changing demographics, increases in TV and other screen time—those were all changes that had been in the works for a long time before the 1977 Dietary Goals came along. But the idea that meat and eggs were “bad” for you? That was revolutionary.

And the rapid rises in obesity and chronic diseases that accompanied these changes? Those were pretty revolutionary as well.

Among my favorite things to luck upon on a Saturday morning in the 70s—aside from the Bugs Bunny-does-Wagner cartoon, “What’s Opera, Doc?”—were the public service announcements featuring Timer, an amorphous yellow blob with some sing-along information about nutrition:

You are what you eat

From your head down to your feet

Things like meat and eggs and fish you

Need to build up muscle tissue

Hello appetite control?

More protein!

Meat and eggs weren’t bad for you. They didn’t cause heart disease. You needed them to build up muscle tissue and to keep you from being hungry!

But in 1984, when this showed up on the cover of Time magazine (no relation to Timer the amorphous blob), I—along with a lot of other Americans—was forced to reconsider what I’d learned on those Saturday mornings not that long ago:

My all-time favorite Timer PSA was this one:

When my get up and go has got up and went,

I hanker for a hunk of cheese.

When I’m dancing a hoedown

And my boots kinda slow down,

Or any time I’m weak in the knees . . .

I hanker for a hunk of

A slab or slice or chunk of–

A snack that is a winner

And yet won’t spoil my dinner–

I hanker for a hunk of CHEESE!

In the 80s, when I took up my low-fat, vegetarian ways, I would still hanker for a hunk of cheese, but now I would look for low-fat, skim, or fat-free versions—or feel guilty about indulging in the full-fat versions that I still loved.

I’m no apologist for the food industry; such a dramatic change in our notions about “healthy food” clearly required some help from them, and they appear to have provided it in abundance.  And I’m not a fan of sugar-sweetened beverages or added sweeteners in general, but dumping the blame for our current health crisis primarily on caloric sweeteners is not only unsupported by the data at hand, it also frames the conversation in a way that works to the advantage of the food industry and gives our public health officials a “get out of jail free” card for providing 35 years’ worth of lousy dietary guidance.

Next time on As the Calories Churn, we’ll explore some of the interaction between consumers, industry, and public health nutrition recommendations. Stay tuned for the next episode, when you’ll get to hear Adele say: “Pollanomics: An approach to food economics that is sort of like the Field of Dreams—only with taco-flavored Doritos.”

The NaCl Debacle Part 2: We don’t need no stinkin’ science!

Sodium-Slashing Superheroes Low-Sodium Larry and his bodacious sidekick Linda “The Less Salt the Better” Van Horn team up to protect Americans from the evils lurking in a teaspoon of salt!
(Drawings courtesy of Butcher Billy)

Yesterday, we found our Sodium-Slashing Superheroes Larry and Linda determined to make sure that no American endangered his/her health by ingesting more than ¾ of a teaspoon of salt a day. But recently, an Institute of Medicine report determined that recommendations to reduce sodium intake to such low levels provided no health benefits and could be detrimental to the health of some people. [In case you missed it and your job is really boring, you can read Part 1 of the NaCl Debacle here.]

Our story picks up as the 2010 USDA/HHS Dietary Guidelines Advisory Committee, fearlessly led by Linda and Larry, arrives at the foregone conclusion that most, if not all, US adults would (somehow) benefit from reducing their sodium intake to 1500 mg/day.  The American Heart Association, in a report written by—surprise!—Larry and Linda, goes on to state that “The health benefits [of reducing sodium intake to 1500 mg/day] apply to Americans in all groups, and there is no compelling evidence to exempt special populations from this public health recommendation.”

Does that mean there is “compelling evidence” to include special populations, or for that matter ordinary populations, in this 1500 mg/day recommendation? No, but who cares?

Does that mean there is science to prove that “excess” sodium intake (i.e. more than ¾ of a teaspoon of salt a day) leads to high blood pressure and thus cardiovascular disease, or that salt makes you fat, or that sodium consumption will eventually lead to the zombie apocalypse? No, no, and no—but who cares?

Larry and Linda KNOW that salt is BAD. Science? They don’t need no stinkin’ science.

Because the one thing everyone seems to be able to agree on is that the science on salt does indeed stink. The IOM report has had to use many of the same methodologically-flawed studies available to the 2010 Dietary Guidelines Advisory Committee, full of the same confounding, measurement error, reverse causation and lame-ass dietary assessment that we know and love about all nutrition epidemiology studies.  But the 2010 Dietary Guidelines Advisory Committee didn’t actually bother to look at these studies.

Why not?  (And let me remind you that the Dietary Guidelines folks usually ♥ methodologically-flawed study designs, full of confounding, measurement error, reverse causation and lame-ass dietary assessment.)

First, a little lesson in how the USDA/HHS folks create dietary guidance meant to improve the health and well-being of the American people:

  1. Take a clinical marker, whose health implications are unclear, but whose levels we can measure cheaply and easily (like blood pressure, cholesterol, weight).
  2. Suggest that this marker—like Carnac the Magnificent—can somehow predict risk of a chronic disease whose origins are multiple and murky (like obesity, heart disease, cancer).
  3. Use this suggestion to establish some arbitrary clinical cutoffs for when this marker is “good” and “bad.” (Note to public health advocacy organizations: Be sure to frequently move those goalposts in whichever direction requires more pharmaceuticals to be purchased from the companies that sponsor you.)
  4. Find some dietary factor that can easily and profitably be removed from our food supply, but whose intake is difficult to track (like saturated fat, sodium, calories).
  5. Implicate the chosen food factor in the regulation of the arbitrary marker, the details of which we don’t quite understand. (How? Use observational data—see methodological flaws above—but hunches and wild guesses will also work.)
  6. Create policy that insists that the entire population—including people who, by the way, are not (at least at this point) fat, sick or dead—attempt to prevent this chronic disease by avoiding this particular dietary factor. (Note to public health advocacy organizations: Be sure to offer food manufacturers the opportunity to have the food products from which they have removed the offensive component labeled with a special logo from your organization—for a “small administrative fee,” of course.)
  7. Commence collecting weak, inconclusive, and inconsistent data to prove that yes indeedy this dietary factor we can’t accurately measure does in fact have some relationship to this arbitrary clinical marker, whose regulation and health implications we don’t fully understand.
  8. Finally—here’s the kicker—measure the success of your intervention by whether or not people are willing to eat expensive, tasteless, chemical-filled food devoid of the chosen food factor in order to attempt to regulate the arbitrary clinical marker.
  9. Whatever you do, DO NOT EVER measure the success of your intervention by looking at whether or not attempts to follow your intervention have made people fat, sick, or dead in the process.
  10. Ooops. I think I just described the entire history of nutrition epidemiology of chronic disease.

Blood pressure is easy to measure, but we don’t always know what causes it to go up (or down). There is no real physiological difference between having a blood pressure reading of 120/80, which will get you a diagnosis of “pre-hypertension” and a fistful of prescriptions, and a reading of 119/79, which won’t.  Blood pressure is not considered to be a “distinct underlying cause of death,” which means that, technically, no one ever dies of blood pressure (high or low). We certainly don’t know how to disentangle the effects of lowering dietary sodium on blood pressure from other effects (like weight loss) that may be related to dietary changes that are a part of an attempt to lower sodium (and we have an embarrassingly hard time collecting accurate dietary intake information from Food Fantasy Questionnaires anyway). We also know that individual response to sodium varies widely.

So doesn’t it make perfect sense that the folks at the USDA/HHS should ignore science that investigates the relationship between sodium intake and whether or not a person stayed out of the hospital, had a heart attack, or up and died? Well, it doesn’t to me, but nevertheless the USDA/HHS has remained obsessively fixated on one thing and one thing only: what effect reducing sodium has on blood pressure. And they pay not one whit of attention to what effect reducing sodium has on, say, aliveness.

So let’s just get this out there and agree to agree: reducing sodium in most cases will reduce blood pressure.  But then, just to be clear, so will dismemberment, dysentery, and death.  We can’t just assume that lowering sodium will only affect blood pressure or will only positively affect health (I mean, we can’t unless we are Larry or Linda). Recent research, which prompted the IOM review, indicates that reducing sodium will also increase triglyceride levels, insulin resistance, and sympathetic nervous system activity. For the record, clinicians generally don’t consider these to be good things.

This may sound radical but in their review of the evidence, the IOM committee decided to do a few things differently.

First, they gave more weight to studies that determined sodium intake levels through multiple high-quality 24-hour urine collections. Remember, this is Low-Sodium Larry’s favorite way of estimating intake.

Also, they did not approach the data with a predetermined “healthy” range already established in their brains. Because of the extreme variability in intake levels among population groups, they decided to—this is crazy, I know—let the outcomes speak for themselves.

Finally, and most importantly, in the new IOM report, the authors, unlike Larry and Linda, focused on—hold on to your hats, folks!—actual health outcomes, something the Dietary Guidelines Have. Never. Done. Ever.

The IOM committee found, in a nutshell:

“that evidence from studies on direct health outcomes is inconsistent and insufficient to conclude that lowering sodium intakes below 2,300 mg per day either increases or decreases risk of CVD outcomes (including stroke and CVD mortality) or all-cause mortality in the general U.S. population.”

In other words, there is no science to indicate that we all need to be consuming less than ¾ of a teaspoon of salt a day. Furthermore, while there may be some subpopulations that would benefit from sodium reduction, reducing sodium intake to 1500 mg/day may increase risk of adverse health outcomes for people with congestive heart failure, diabetes, chronic kidney disease, or heart disease. (If you’d like to wallow in some of the studies reviewed by the IOM, I’ve provided the Reader’s Digest Condensed Version at the bottom of the page.)

Of course, the American Heart Association, eager to provide the public with the most up-to-date recommendations about heart health as long as they don’t contradict outdated recommendations of which the AHA is fond, responded to the IOM report by saying, “The American Heart Association is not changing its position. The association rejects the Institute of Medicine’s conclusions because the studies on which they were based had methodological flaws.”

Um, hello AHA? Exactly what completely non-existent, massive, highly-controlled and yet highly-generalizable randomized controlled trials about sodium intake and health effects were you planning on using to make your case? I believe it was the AHA that mentioned that “It is well-known, however, that such trials are not feasible because of logistic, financial, and often ethical considerations.” Besides, I don’t know what the AHA is whining about. The quality of the science hardly matters if you are not going to pay any attention to it in the first place.

No, folks, that giant smacking sound you hear is not my head on my keyboard. That was the sound of science crashing into a giant wall of Consistent Public Health Message. Apparently, those public health advocates at the AHA seem to think that changing public health messages—even when they are wrong—confuses widdle ol’ Americans. The AHA—and the USDA/HHS team—doesn’t want us to have to worry our pretty little heads about all that crazy scientifical stuff with big scary words and no funny pictures or halftime shows.

Frankly, I appreciate that. I hate to have my pretty little head worried. But there’s one other problem with this particular Consistent Public Health Message. Not only is there no science to back it up; not only is it likely to be downright detrimental to the health of certain groups of people; not only is it likely to introduce an arsenal of synthetic chemical salt-replacements that will be consumed at unprecedented levels without testing for negative interactions or toxicities (remember how well that worked out when we replaced saturated fat with partially-hydrogenated vegetable oils?)—it is, apparently, incompatible with eating food.

Researchers set out to find what would really happen if Americans were muddle-headed and sheep-like enough to actually try to reduce their sodium intake to 1500 mg/day. They discovered that, “the 2010 Dietary Guidelines for sodium were incompatible with potassium guidelines and with nutritionally adequate diets, even after reducing the sodium content of all US foods by 10%.”  Way to go, Guidelines.

While these researchers suggested that a feasibility study (this is a scientifical term for “reality check”) should precede the issuing of dietary guidelines to the public, I have a different suggestion.

How about we just stop with the whole 30-year-long dietary experiment to prevent chronic disease by telling Americans what not to eat? I hate to be the one to point this out, but it doesn’t seem to be working out all that well.  It’s hard to keep assuming that the AHA and the USDA/HHS mean well when, if you look at it for what it is, they are willing to continue to jeopardize the health of Americans just so they don’t have to admit that they might have been wrong about a few things.  I suppose if a Consistent Public Health Message means anything, it means never having to say you’re sorry for 30 years’ worth of lousy dietary advice.

Marion Nestle has noted that, up until now, “every single committee that has dealt with this question [of sodium-reduction] says, ‘We really need to lower the sodium in the food supply.’ Now either every single committee that has ever dealt with this issue is delusional, which I find hard to believe—I mean they can’t all be making this up—[or] there must be a clinical or rational basis for the unanimity of these decisions.”

Weeeell, I got some bad news for you, Marion. Believe it. They have been delusional. They are making this up. And no, apparently there is no clinical or rational basis for the unanimity of these decisions.

But, thanks to the IOM report, perhaps we can no longer consider these decisions to be unanimous.

Praise the lard and pass the salt.

Read ’em and weep:  The Reader’s Digest Condensed Version of the science from the IOM report.  Studies marked with an asterisk (*) are studies that were available to the 2010 Dietary Guidelines Advisory Committee.

Studies that looked at Cardiovascular Disease, Stroke, and Mortality

*Cohen et al. (2006)

When intakes of sodium less than 2300 mg per day were compared to intakes greater than 2300 mg per day, the “lower sodium intake was statistically significantly associated with increased risk of all-cause mortality.”

*Cohen et al. (2008)

When a fully-adjusted (for confounders) model was used, “there was a statistically significant higher risk of CVD mortality with the lowest vs. the highest quartile of sodium intake.”

Gardener et al. (2012)

Risk of stroke was positively related to sodium intake when comparing the highest levels of intake to the lowest levels of intake. There was no statistically significant increase in risk for those consuming between 1500 and 4000 mg of sodium per day.

*Larsson et al. (2008)

“The analyses found no significant association between dietary sodium intake and risk of any stroke subtype.”

*Nagata et al. (2004)

“Among men, a 2.3-fold increased risk of stroke mortality was associated with the highest tertile of sodium intake.” That sounds bad, but the average sodium intake in the high-risk group was 6613 mg per day. The lowest risk group had an average intake of 4070 mg per day. “Thus, the average sodium intake in the US would be within the lowest tertile of this study.”

Stolarz-Skrzypek et al. (2011)

“Overall, the authors found that lower sodium intake was associated with higher CVD mortality.”

Takachi et al. (2010)

The authors found “a significant positive association between sodium consumption at the highest compared to the lowest quintile and risk of stroke.” As with the Nagata (2004) study, this sounds bad, but the average sodium intake in the high-risk group was 6844 mg per day. The lowest risk group had an average intake of 3084 mg per day. “Thus, the average sodium intake in the US would be close to the lowest quintile of this study.”

*Umesawa et al. (2008)

“The authors found an association between greater dietary sodium intake and greater mortality from total stroke, ischemic stroke, and total CVD.” However, as with the Nagata and the Takachi studies (above), lower quintiles—in this case, quintiles one and two—would be comparable to average US intake.

Yang et al. (2011)

Higher usual sodium intake was found to be associated with all-cause mortality, but not cardiovascular disease mortality or ischemic heart disease mortality. “However, the finding that correction for regression dilution increased the effect on all-cause mortality, but not on CVD mortality, is inconsistent with the theoretical causal pathway.”  In other words, high sodium intake might be bad for health, but not because it raises blood pressure and leads to heart disease.

Studies in Populations 51 Years of Age or Older

*Geleijnse et al. (2007)

“This study found no significant difference between urinary sodium level and risk of CVD mortality or all-cause mortality.” Relative risk was lowest in the medium intake group, with an average estimated intake of 2,415 mg/day.

Other

“Five of the nine reported studies in the general population listed above also analyzed the data on health outcomes by age and found no interaction (Cohen et al., 2006, 2008; Cook et al., 2007; Gardener et al., 2012; Yang et al., 2011).”

Studies in Populations with Chronic Kidney Disease

Dong et al. (2010)

“The authors found that the lowest sodium intake was associated with increased mortality risk.”

Heerspink et al. (2012)

“Results from this study suggest that ARBs were more effective at decreasing CKD progression and CVD when sodium intake was in the lowest tertile” which had an estimated average sodium intake of about 2783 mg/day.

Studies on Populations with Cardiovascular Disease

Costa et al. (2012)

“Dietary sodium intake was estimated from a 62-item validated FFQ. . . . Significant correlations were found between sodium intake and percentage of fat and calories in daily intake. . . . Overall, for the first 30 days and up to 4 years afterward, total mortality was significantly associated with high sodium intake.”

Kono et al. (2011)

“Cumulative risk analysis found that a salt intake greater than the median (4,000 mg of sodium) was associated with higher stroke recurrence rate. Univariate analysis of lifestyle management also found that poor lifestyle, defined by both high salt intake and low physical activity, was significantly associated with stroke recurrence.”

O’Donnell et al. (2011)

“For the composite outcome, multivariate analysis found a U-shaped relationship between 24-hour urine sodium and the composite outcome of CVD death, MI, stroke, and hospitalization for CHF.” In other words, both higher (>7,000 mg per day estimated intake) and lower (<2,990 mg per day estimated intake) intakes of sodium were associated with increased risk of heart disease and mortality.

Studies on Populations with Prehypertension

*Cook et al. (2007)

In a randomized trial comparing a low sodium intervention with usual intake, lower sodium intake did not significantly decrease risk of mortality or heart disease events.

*Cook et al. (2009)

No significant increase in risk of adverse cardiovascular outcomes was associated with increased sodium excretion levels.

Other

“Several other studies discussed in this chapter analyzed data on health outcomes by blood pressure and found no statistical interactions (Cohen et al., 2006, 2008; Gardener et al., 2012; O’Donnell et al., 2011; Yang et al., 2011).”

Studies on Populations with Diabetes

Ekinci et al. (2011)

Higher sodium intakes were associated with decreased risk of all-cause mortality and heart disease mortality.

Tikellis et al. (2013)

“Adjusted multivariate regression analysis found urinary sodium excretion was associated with incident CVD, with increased risk at both the highest [> 4,401 mg/day] and lowest [<2,346 mg/day] urine sodium excretion levels. When analyzed as independent outcomes, no significant associations were found between urinary sodium excretion and new CVD or stroke after adjustment for other risk factors.”

Other

“Two other studies discussed in this chapter analyzed the data on health outcomes by diabetes prevalence and found no interaction (Cohen et al., 2006; O’Donnell et al., 2011).”

Studies in Populations with Congestive Heart Failure

Arcand et al. (2011)

High sodium intake levels (≥2,800 mg per day) were significantly associated with acute decompensated heart failure, all-cause hospitalization, and mortality.

Lennie et al. (2011)

“Results for event-free survival at a urinary sodium of ≥3,000 mg per day varied by the severity of patient symptoms.” In people with less severe symptoms, sodium intake greater than 3,000 mg per day was correlated with a lower disease incidence compared to those with a sodium intake less than 3,000 mg per day. Conversely, people with more severe symptoms who had a sodium intake greater than 3,000 mg per day had a higher disease incidence than those with sodium intakes less than 3,000 mg per day.

Parrinello et al. (2009)

“During the 12 months of follow-up, participants receiving the restricted sodium diet [1840 mg/day] had a greater number of hospital readmissions and higher mortality compared to those on the modestly restricted diet [2760 mg/day].”

*Paterna et al. (2008)

The lower sodium intake group [1840 mg/day] experienced a significantly higher number of hospital readmissions compared to the normal sodium intake group [2760 mg/day].

*Paterna et al. (2009)

A significant association was found between low sodium intake [1,840 mg per day] and hospital readmissions. The group with the normal sodium diet [2760 mg/day] also had fewer deaths compared to all groups receiving a low-sodium diet combined.

The NaCl Debacle Part 1: Salt makes you fat?

Don’t look now, but I think the Institute of Medicine’s new report on sodium just bitch-slapped the USDA/HHS 2010 Dietary Guidelines.

In case you have a life outside of the nutritional recommendation roller derby, the IOM recently released a report that comes to the conclusion that restricting sodium intake to 1500 mg/day may increase rather than reduce health risks. Which is a little weird, since the 2010 Dietary Guidelines did a great job of insisting that any American with high blood pressure, all blacks, and every middle-aged and older adult—plus anyone who has ever eaten bacon or even thought about eating bacon, i.e. nearly everybody—should limit their salt intake to 1500 mg of sodium a day, or less than ¾ of a teaspoon of salt. The American Heart Association was, of course, aghast. The AHA thinks EVERYBODY should be limited to less than ¾ teaspoon of salt a day, including people who wouldn’t even think about thinking about bacon.
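Since “milligrams of sodium” and “teaspoons of salt” get used interchangeably in this debate, here’s the conversion behind that “less than ¾ of a teaspoon” figure. A sketch, assuming table salt is about 39% sodium by weight and a teaspoon of salt weighs roughly 6 grams (standard approximations):

```python
NA_FRACTION = 0.393    # sodium fraction of NaCl by weight (23 / 58.44)
G_SALT_PER_TSP = 6.0   # approximate grams of table salt in one teaspoon

def sodium_mg_to_tsp_salt(mg_sodium):
    """Teaspoons of salt containing the given amount of sodium."""
    return mg_sodium / 1000 / NA_FRACTION / G_SALT_PER_TSP

print(round(sodium_mg_to_tsp_salt(1500), 2))  # ~0.64 tsp: less than 3/4 teaspoon
print(round(sodium_mg_to_tsp_salt(2300), 2))  # ~0.98 tsp: about one teaspoon
```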

Why are the AHA and USDA/HHS so freaked out about salt?  And how did the IOM reach such a vastly different conclusion than that promoted by the AHA and the Dietary Guidelines?  Fasten your seat belts, folks, it’s gonna be a bumpy blog.

First, it is helpful to examine why the folks at AHA and USDA/HHS are so down on salt.  The truth: we have no freakin’ idea. Salt has been around since what, the dawn of civilization maybe? It is an essential nutrient, and it plays an important role in preserving food and preventing microbial growth (especially on bacon). But Americans could still be getting too much of a good thing. Everybody at the AHA seems to think that Americans consume “excessive amounts” of sodium. (Of course, just about anything looks excessive compared to less than ¾ of a teaspoon.) But do we really consume too much sodium?

Back in 2010, Dr. Lawrence I-Know-More-About-Sodium-Than-Your-Kidneys-Do Appel (or as his friends call him, “Low-Sodium Larry”), one of the leading advocates for a salt-free universe, acknowledged that “The data is quite murky. We just don’t have great data on sodium trends over time. I wish that we did. But I can’t tell you if there’s been an increase or decrease.”

Well, Low-Sodium Larry, I can, and I am about to make your wish come true.

According to recent research done by that wild bunch of scientific renegades at Harvard, in the past 60 years sodium intake levels have . . . drumroll, please . . . not done much of anything.

Hey, that doesn’t sound right! Everyone knows that it is virtually impossible to get an accurate measure of sodium intake from dietary questionnaires; people are probably just “under-reporting” their salt intake like they “under-report” everything else. Low-Sodium Larry has previously insisted that one of the reasons the data is so murky is that few epidemiological studies measure sodium intake accurately and that, “really, you should do 24-hour urinary sodium excretions to do it right.”

The guys at Harvard looked at studies that did it right.  This systematic analysis of 38 studies from the 1950s to the present found that 24-hour urinary sodium excretion (the “gold” standard—omg, I could not resist that—of dietary sodium intake estimation) has neither increased nor decreased, but has remained essentially stable over time. Despite the fact that Americans are apparently hoovering up salt like Kim Kardashian hoovers up French fries—and with much the same results, i.e. puffing up like a Macy’s Thanksgiving Day balloon—for whatever reason we simply aren’t excreting more of it in our urine.

According to that same study, however, despite the lack of increase in sodium excretion (which is supposed to accurately reflect intake—but that can’t be right), high blood pressure rates in the population have been increasing. Duh. Everyone knows that eating lots of salt makes your blood pressure go up. But have the rates of high blood pressure in America really been going up?

Age-Adjusted Prevalence of Hypertension (2009 NIH Chart Book)

Well, no.  Not really. The Harvard dudes cite a report that goes back to 1988-1994 data, and yes, rates of high blood pressure have been creeping slowly back up since then. This is because from 1976-1980 to 1988-1994, rates of high blood pressure plummeted for most segments of the American population.

We don’t know why rates of high blood pressure fell during the 70s and early 80s. It may have been that the Dietary Guidelines told people to eat more potassium-filled veggies and people actually tried to follow the Dietary Guidelines, which would have had a positive effect on high blood pressure. On the other hand, it could have been largely due to the sedating influence of the soft rock music of that era blanketing the airwaves with the mellow tones of England Dan and John Ford Coley, Christopher Cross, Ambrosia, and the like (youtube it, you young whippersnappers out there). We also don’t know why rates are going back up. Rising rates of obesity may be part of the problem, but it is also entirely possible that piping the Monsters of Lite Rock through every PA system in the country might save our health care system a lot of time and trouble.

This is what we (think we) know:

  • High-sodium diets might possibly maybe sometimes be a contributor to high blood pressure.
  • Rates of high blood pressure are going (back) up.
  • Obesity rates are definitely going up.

Ergo pro facto summa cum laude, it is clear—using the logic that seems to undergird the vast majority of our public health nutrition recommendations—salt makes you fat.  The USDA/HHS has been faced with rapidly rising rates of obesity which, until now, they have only been able to pin on the laziness and gluttony of Americans.  But if salt makes us fat, that might explain why the USDA/HHS doesn’t want us to eat it.

After all, the biomechanics of this is pretty straightforward. If you eat too much sodium (which we must be), but you don’t pee it out (which we aren’t), you must be retaining it and this is what makes your blood pressure and your weight both go way up. They didn’t really cover the physics of this in my biochemistry classes so you’ll have to ask Dr. Appel how this works because he knows more about sodium than your kidneys do. But I think it must be true. After all, this is the mechanism that explains the weight loss behind carbohydrate-reduced diets, right? I myself reduced my carb intake and lost 60 pounds of water weight!

And besides, taking the salt out of our food will give food manufacturers the opportunity to make food more expensive and tasteless while adding synthetic ingredients whose long-term effects are unknown—just what the American consumer wants!

For a while there, we thought the whole idea was to reduce sodium in order to reduce blood pressure in order to reduce diseases of the circulatory system, like heart failure, stroke, and coronary heart disease. That didn’t seem to work out so well, because the whole time that sodium intake was staying stable (if we want to believe the urinary sodium excretion data) and high blood pressure rates were going down (although they are starting to go back up), rates of those diseases have gone up:

Age-Adjusted Prevalence of Heart Failure (2009 NIH Chart Book)

Age-Adjusted Prevalence of Stroke (2009 NIH Chart Book)

Age-Adjusted Prevalence of Coronary Heart Disease (2007 NIH Chart Book)

So if reducing blood pressure to reduce cardiovascular disease isn’t the answer, then we must need to reduce blood pressure to reduce obesity! By jove, I think we’ve got it!

The USDA/HHS must have known the “salt makes you fat” notion would be a tough sell, I mean, what with the lack of any shred of supporting science and all that. (But then, the “salt causes high blood pressure which causes cardiovascular disease” argument hasn’t exactly been overburdened by evidence either, and that never seemed to stop anyone.) So the 2010 Dietary Guidelines brought together the American Heart Association’s Superheroes of Sodium Slashing, Low-Sodium Larry and his bodacious salt-subduing sidekick, Linda Van Horn, both of whom had been preaching the gospel of sodium-reduction as a preventive health measure with little conclusive evidence to support their recommendations.  The USDA/HHS knew that with Linda and Larry on the team, it didn’t matter how lame the science, how limited the data, or how ludicrous the recommendation, these two could be counted on to review any and all available evidence and reliably come up with the exact same concrete and well-proven assumptions they’d been coming up with for years.

The Sodium-Slashing Superheroes–Drs. Lawrence Appel and Linda Van Horn–ready to make the world safe for bland, unappetizing food everywhere! (Drawings courtesy of Butcher Billy)

So here’s the cliffhanger:  Will Linda and Larry be able to torture the science on salt into confessing its true role in the obesity crisis?

Tune in tomorrow, when you’ll hear Linda and Larry say: “Science? We don’t need no stinkin’ science.”

Not Just Science: How nutrition got stuck in the past

Nostalgia for a misremembered past is no basis for governing a diverse and advancing nation.

David Frum

The truth is that I get most of my political insight from Mad Magazine; they offer the most balanced commentary by far. However, I’ve been very interested in the fallout from the recent election, much more so than I was in the election itself; it’s like watching a Britney Spears meltdown, only with power ties. I kept hearing the phrase “epistemic closure” and finally had to look it up. Now, whether or not the Republican party suffers from it, I don’t care (and won’t bother arguing about), but it undeniably describes the current state of nutrition. “Epistemic closure” refers to a type of close-mindedness that precludes any questioning of the prevailing dogma to the extent that the experts, leaders, and pundits of a particular paradigm:

“become worryingly untethered from reality”

“develop a distorted sense of priorities”

and are “voluntarily putting themselves in the same cocoon”

Forget about the Republicans. Does this not perfectly describe the public health leaders that are still clinging blindly to the past 35 years of nutritional policy?  The folks at USDA/HHS live in their own little bubble, listening only to their own experts, pretending that the world they live in now can be returned to an imaginary 1970s America, where children frolicked outside after downing a hearty breakfast of sugarless oat cereal and grown-ups walked to their physically-demanding jobs toting homemade lunches of hearty rye bread and shiny red apples.

Remember when all the families in America got their exercise playing outside together—including mom, dad, and the maid? Yeah, me neither.

So let me rephrase David Frum’s quote above for my own purposes: Nostalgia for a misremembered past is no basis for feeding a diverse and advancing nation.

If you listen to USDA/HHS, our current dietary recommendations are a culmination of science built over the past 35 years on the solid foundation of scientific certainty translated into public health policy. But this misremembered scientific certainty wasn’t there then and it isn’t here now; the early supporters of the Guidelines were very aware that they had not convinced the scientific community that they had a preponderance of evidence behind them [1]. Enter the first bit of mommy-state* government overreach. When George McGovern’s (D) Senate Select Committee came up with the 1977 Dietary Goals for Americans, it was a well-meaning approach to not only reduce chronic disease, a clear public health concern, but to return us all to a more “natural” way of eating. This last bit of ideology reflected a secular trend manifested in the form of the Dean Ornish-friendly Diet for a Small Planet, a vegetarian cookbook that smushed the humanitarian and environmental concerns of meat-eating in with some flimsy nutritional considerations, promising that a plant-based diet was the best way to feed the hungry, save the planet, safeguard your health, and usher in the Age of Aquarius.  This was a pop culture warm-fuzzy with which the “traditional emphasis on the biochemistry of disease” could not compete [2].

If you listen to some folks, the goofy low-fat, high-carb, calories in-calories out approach can be blamed entirely on this attempt of the Democrats to institutionalize food morality. But, let’s not forget that the stage for the Dietary Guidelines fiasco was set earlier by Secretary of Agriculture Earl Butz, an economist with many ties to large agricultural corporations who was appointed by a Republican president. He initiated the “fencerow to fencerow” policies that would start the shift of farm animals from pastureland to feed lots, increasing the efficiency of food production because what corn didn’t go into cows could go into humans, including the oils that were a by-product of turning crops into animal feed. [Update: Actually, not so much Butz’s fault, as I’ve come to learn, because so many of these policies were already in place before he came along. Excellent article on this here.]

When Giant Agribusiness—they’re not stupid, y’know—figured out that industrialized agriculture had just gotten fairydusted with tree-hugging liberalism in the form of the USDA Guidelines, they must have been wetting their collective panties. The oil-refining process became an end in itself for the food industry, supported by the notion that polyunsaturated fats from plants were better for you than saturated fats from animals, even though evidence for this began to appear only after the Guidelines were already created and only through the status quo-confirming channels of nutrition epidemiology, a field anchored solidly in the crimson halls of Harvard by Walter Willett himself.

Between Earl Butz and McGovern’s “barefoot boys of nutrition,” somehow corn oil from refineries like this became more “natural” than the fat that comes, well, naturally, from animals.

And here we are, 35 years later, trying to untie a Gordian knot of weak science and powerful industry cemented together by the mutual embarrassment of both political orientations. The entrenched liberal ivory-tower interests don’t want to look stupid by having to admit that the 3 decades of public health policy they created and have tried to enforce have failed miserably. The entrenched big-business-supporting conservative interests don’t want to look stupid by having to admit that Giant Agribusiness, whose welfare they protect, is now driving up government spending on healthcare by acting like the cigarette industry did in the past and for much the same reasons.

These overlapping/competing agendas have created the schizophrenic, conjoined twins of a food industry-vegetarian coalition, draped together in the authority of government policy. Here the vegans (who generally seem to be politically liberal rather than conservative, although I’m sure there are exceptions) play the part of a vocal minority of food fundamentalists whose ideology brooks no compromise. (I will defend eternally the right for a vegan–or any fundamentalist–to choose his/her own way of life; I draw the line at having it imposed on anyone else–and I squirm a great deal if someone asks me if that includes children.)  The extent to which vegan ideology and USDA/HHS ideology overlap has got to be a strange-bedfellows moment for each, but there’s no doubt that the USDA/HHS’s endorsement of vegan diets is a coup for both. USDA/HHS earns a politically-correct gold star for their true constituents in the academic-scientific-industrial complex, and vegans get the nutritional stamp of approval for a way of eating that, until recently, was considered by nutritionists to be inadequate, especially for children.

Like this chicken, the USDA/HHS loves vegans—at least enough to endorse vegan diets as a “healthy eating pattern.”

But if the current alternative nutrition movement is allegedly representing the disenfranchised eaters all over America who have been left out of this bizarre coalition, let us remember that, in many ways, the “alternative” is really just more of the same. If the McGovern hippies gave us “eat more grains and cereals, less meat and fat,” now the Republican/Libertarian-leaning low-carb/primaleo folks have the same idea only the other way around—and with the same justification.  “Eat more meat and fat, fewer grains and cereals;” it’s a more “natural” way to eat.

As counterparts to the fundamentalist vegans, we have the Occupy Wall Street folks of the alternative nutrition community—raw meaters who sleep on the floor of their caves and squat over their compost toilets after chi running in their Vibrams. They’re adorably sincere, if a little grubby, and they have no clue how badly all the notions they cherish would get beaten in a fight with the reality of middle-Americans trying to make it to a PTA meeting.

How paleo might look from the outside.

To paraphrase David Frum again, the way forward in food-health reform is collaborative work, and although we all have our own dietary beliefs, food preferences, and lifestyle idiosyncrasies, the immediate need is for a plan with just this one goal: we must emancipate ourselves from prior mistakes and adapt to contemporary realities.

Because the world in which we live is not the Brady Bunch world that many of us in nutrition seem to think it is.

Frum makes the point that in 1980, when the Dietary Guidelines were first officially issued from the USDA, this was still an overwhelmingly white country. “Today, a majority of the population under age 18 traces its origins to Latin America, Africa, or Asia. Back then, America remained a relatively young country, with a median age of exactly 30 years. Today, over-80 is the fastest-growing age cohort, and the median age has surpassed 37.” Yet our nutrition recommendations have not changed from those originally created on a weak science base of studies done on middle-aged white people. To this day, we continue to make nutrition policy decisions on outcomes found in databases that are 97% white. The food-health needs of our country are far more diverse now, culturally and biologically. And another top-down, one-size-fits-all approach from the alternative nutrition community won’t address that issue any more adequately than the current USDA/HHS one.

For those who think the answer is to “just eat real food,” here’s another reality check: “In 1980, young women had only just recently entered the workforce in large numbers. Today, our leading labor-market worry is the number of young men who are exiting.” That means that unless these guys are exiting the workforce to go home and cook dinner, the idea that the solution to our obesity crisis lies in someone in each American household willingly taking up the mind-numbingly repetitive and eternally thankless chore of putting “real food” on the table for the folks at home 1 or more times a day for years on end—well, it’s as much a fantasy as Karl Rove’s Ohio outcome.

David Frum points out that “In 1980, our top environmental concerns involved risks to the health of individual human beings. Today, after 30 years of progress toward cleaner air and water, we must now worry about the health of the whole planetary climate system.” Today, our people and our environment are both sicker than ever. We can point our fingers at meat-eaters, but saying we now grow industrialized crops in order to feed them to livestock is like saying we drill for oil to make Vaseline. The fact that we can use the byproducts of oil extraction to make other things—like Vaseline or livestock feed—is a happy value-added efficiency in the system, no longer its raison d’être. Concentrated vertical integration has undermined the once-proud tradition of land stewardship in farming. Giving this power back to farmers means taking some power away from Giant Agribusiness, and neither party has the political will to do that, especially when together they can demonize livestock-eating while promoting corn oil refineries.

If we all just stopped eating meat, then we wouldn’t have to plant so much corn, right? Right?

And it’s not just our food system that has changed: “In 1980, 79 percent of Americans under age 65 were covered by employer-provided health-insurance plans, a level that had held constant since the mid-1960s. Back then, health-care costs accounted for only about one 10th of the federal budget. Since 1980, private health coverage has shriveled, leaving some 45 million people uninsured. Health care now consumes one quarter of all federal dollars, rapidly rising toward one third—and that’s without considering the costs of Obamacare.”  That the plant-based diet that was institutionalized by liberal forces and industrialized by conservative ones is a primary part of this enormous rise in healthcare costs is something no one on either side of the table wants to examine. Diabetes—the symptoms of which are fairly easily reversed by a diet that excludes most industrialized food products and focuses on meat, eggs, and veggies—is the nightmare in the closet of both political ideologies.

David Frum quotes a warning from the British conservative the Marquess of Salisbury: “The commonest error in politics is sticking to the carcass of dead policies.”

Right now, it is in the best interest of both parties to stick to our dead nutrition policies and dump the ultimate blame on the individuals (we gave you sidewalks and vegetable stands–and you’re still fat! cry the Democrats; we let the food industry have free rein so you could make your own food choices–and you’re still fat! cry the Republicans). It’s a powerful coalition, resistant to change no matter who is in control of the White House or Congress.

What can be done about it, if anything? To paraphrase Frum once again, a 21st century food-health system must be economically inclusive, environmentally responsible, culturally modern, and intellectually credible.

We can start the process by stopping with the finger-pointing and blame game, shedding our collective delusions about the past and the present, and recognizing the multiplicity of concerns that must be addressed in our current reality. Let’s begin by acknowledging that—for the most part—the people in the spotlight on either side of the nutrition debate don’t represent the folks most affected by federal food-health policies. It is our job as leaders, in any party and for any nutritional paradigm, to represent those folks first, before our own interests, funding streams, pet theories, or personal ideologies. If we don’t, each group—from the vegetarians to the folks at Harvard to the primaleos—runs the risk of suffering from its own embarrassing form of epistemic closure.

Let’s quit bickering and get to work.

**********************************************************

*This was too brilliant to leave buried in the comments section:

“Don’t you remember the phrase “wait til your father gets home”? You want to know what the state is? It’s Big Daddy. Doesn’t give a damn about the day to day scut, just swoops in to rescue when things get out of hand and then takes all the credit when the kids turn out well, whether it’s deserved or not. Equates spending money with parenting, too.”–from Dana

Henceforth, all my “mommy-state” notions are hereby replaced with “Big Daddy,” a more accurate and appropriate metaphor.  And I never metaphor I didn’t like.

References:

1. See Select Committee on Nutrition and Human Needs of the United States Senate. Dietary Goals for the United States. 2nd ed. Washington, DC: US Government Printing Office; 1977. Dr. Mark Hegsted, Professor of Nutrition at Harvard School of Public Health and an early supporter of the 1977 Goals, acknowledged their lack of scientific support at the press conference announcing their release: “There will undoubtedly be many people who will say we have not proven our point; we have not demonstrated that the dietary modifications we recommend will yield the dividends expected . . . ”

2. Broad, WJ. Jump in Funding Feeds Research on Nutrition. Science, New Series, Vol. 204, No. 4397 (June 8, 1979), pp. 1060-1061, 1063-1064. In a series of articles in Science in 1979, William Broad details the political drama that allowed the “barefoot boys of nutrition” from McGovern’s committee to put nutrition in the hands of the USDA.

Why Race Doesn’t Matter in Nutrition Policy

This is the first of a series looking at what does and doesn’t matter when it comes to nutrition policy. When I started out on this adventure, I thought that science would give me the answers to the questions I had about why public health and clinical recommendations for nutrition were so limited. Silly me. The science part is easy. But policy, politics, economics, industry, media framing, the scientific bureaucracy, cultural bias—now that stuff is crazy complicated. It’s like an onion: when you start peeling back the layers, you just want to cry. I am also honored to say that this post is part of the Diversity in Science Carnival on Latino / Hispanic Health: Science and Advocacy.

When we began investigating relationships between diet and chronic disease, we didn’t pay much attention to race. The longest-running study of the relationship between dietary factors and chronic disease is the Framingham Heart Study, a study made up entirely of white, middle-class participants. Since 1951, the Framingham study has generated over 2,000 journal articles and retains a central place in the creation of public health nutrition policy recommendations for all Americans.

More recent datasets—especially the large ones—are nearly as demographically skewed.

The Nurses’ Health Study is 97% Caucasian and consists of 122,000 married registered nurses who were between the ages of 30 and 55 when the study began in 1976. An additional 116,686 nurses ages 25–42 were added in 1989, but the racial demographics remained unchanged.

The Health Professionals’ Follow-up Study began in 1986, as a complementary dataset to the Nurses’ Health Study. It is 97% Caucasian and consists, as the name suggests, of 51,529 men who were health professionals, aged 40-75, when the study began.

The Physicians’ Health Study began in 1982, with 29,071 men between the ages of 40 and 84. The second phase started in 1997, adding men who were then over 50. Of participants whose race is indicated, 91% are Caucasian, 4.5% are Asian/Pacific Islander, 2% are Hispanic, and less than 1% are African-American or American Indian. I have detailed information about the racial subgroups of this dataset because I had to write the folks at Harvard and ask for them. Race was of such little interest that the racial composition of the participants is never mentioned in the articles generated from this dataset.

Over the years, these three mostly-white datasets have generated more journal articles than five of the more diverse datasets all put together.* These three datasets, all administered by Harvard, have been used to generate some of the more sensationalist nutrition headlines of the past few years–red meat kills, for instance–with virtually no discussion about the fact that the findings apply to a population–mostly white, middle to upper middle class, well-educated, health professionals, most of whom were born before the atomic bomb–to which most of us do not belong.

Shift in demographics in past 50 years;
predicted shift in next 50 years

Although we did begin to realize that race and other characteristics might actually matter with regard to health (hence the existence of datasets with more diversity), we can’t really fault those early researchers for creating such lopsided datasets. At that point, not only was the US more white than it is now, but the scientific advances that would reveal more about how our genetic background might affect health had not yet been made. We had not yet mapped the human genome; epigenetics (the study of the interaction between environmental inputs and the expression of genetic traits) was in its infancy; and biochemical individuality was little more than a glimmer in Roger Williams’ eye.

Socially, culturally, and I think, scientifically, we were all inclined to want to think that everyone was created equal, and this “equality” extended to how our health would be affected by food. Stephen Jay Gould’s 1981 book, The Mismeasure of Man, critiqued the notion that “the social and economic differences between human groups—primarily races, classes, and sexes—arise from inherited, inborn distinctions and that society, in this sense, is an accurate reflection of biology.” In the aftermath of the civil rights movement, with its embarrassingly racist behavior on the part of some representatives of the majority race and the heartbreaking violence over differences in something as superficial as skin color, it was patently unhip to suggest that racial differences were anything more than just skin deep.

But does that position still serve us now?

In the past 35 years, our population has become more diverse and nutrition science has become more nuanced—but our national nutrition recommendations have stayed exactly the same. The first government-endorsed dietary recommendations to prevent chronic disease were given to the US public in 1977. These Dietary Goals for Americans told us to reduce our intake of dietary saturated fat and cholesterol and increase our intake of dietary carbohydrates, especially grains and cereals, in order to prevent obesity, diabetes, heart disease, cancer, and stroke.

Since 1980, the decreases in hypertension and serum cholesterol—health biomarkers—have been linked to Guidelines-directed dietary changes in the US population [1, 2, 3, 4].

“Age-adjusted mean Heart Disease Prevention Eating Index scores increased in both sexes during the past 2 decades, particularly driven by improvements in total grain, whole grain, total fat, saturated fatty acids, trans-fatty acids, and cholesterol intake.” [1]

However, with regard to the actual chronic diseases that the Dietary Guidelines were specifically created to prevent, the Guidelines have been a resounding failure. If public health officials are going to attribute victory on some fronts to Americans adopting dietary changes in line with the Guidelines, I’m not sure how to avoid the conclusion that those same dietary changes also played a part in the dramatic increases in obesity, diabetes, stroke, and congestive heart failure.

If the Dietary Guidelines are a failure, why have policy makers failed to change them?

It is not as if there is an overwhelming body of scientific evidence supporting the recommendations in the Guidelines. Their weak scientific underpinnings made the 1977 Dietary Goals controversial from the start. The American Society for Clinical Nutrition issued a report in 1979 that found little conclusive evidence for linking the consumption of fat, saturated fat, and cholesterol to heart disease and found potential risks in recommending a diet high in polyunsaturated fats [5]. Other experts warned of the possibility of far-reaching and unanticipated consequences that might arise from basing a one-size-fits-all dietary prescription on such preliminary and inconclusive data: “The evidence for assuming that benefits to be derived from the adoption of such universal dietary goals . . . is not conclusive and there is potential for harmful effects from a radical long-term dietary change as would occur through adoption of the proposed national goals” [6]. Are the alarming increases in obesity and diabetes examples of the “harmful effects” that were predicted? It does look that way. But at this point, at least one thing is clear: in the face of the deteriorating health of Americans and significant scientific evidence to the contrary, the USDA and HHS have continued to doggedly pursue a course of dietary recommendations that no reasonable assessment would determine to be effective.

But what does this have to do with race?

Maintaining the myth that a one-size diet approach works for everyone is fine if that one-size works for you—socially, financially, and in terms of health outcomes. The single positive health outcome associated with the Dietary Guidelines has been a decrease in heart attacks—but only for white people.

And if that one-size diet doesn’t fit in terms of health, if you end up with one of the other numerous adverse health effects that have increased in the past 35 years, and if you’re a member of the mostly-white, well-educated, middle/upper-middle class demographic—you know, the one represented in the datasets that we continue to use as the backbone for our nutrition policy—you are likely to have the financial and social resources to eat differently from the Guideline recommendations should you choose to do so, to exercise as much as you need to, and to demand excellent healthcare if you get sick anyway. Even if you accept that these foods are Guidelines-recommended “healthy” foods, you are not stuck with the commodity crop-based processed foods for which our nutrition programs have become a convenient dumping ground.

In the meantime, low-income women, children, minorities, and older adults with limited incomes—you know, the exact population not represented in those datasets—remain the primary recipients of federal nutrition programs. Black, Hispanic, and American Indian kids are more likely to qualify for free or reduced-price school lunches; non-white participants make up 68% of the Special Supplemental Nutrition Program for Women, Infants, and Children enrollment. These groups have many fewer social, financial, and dietary options. If the food they’re given doesn’t lead to good health—and there is evidence that it does not—what other choices do they have?

When it comes to health outcomes in minorities and low-income populations, the “healthier” you eat, the less likely you are to actually be healthy. Among low-income children, “healthy eaters” were more likely to be obese than “less-healthy eaters,” despite similar amounts of sedentary screen time. Among low-income adults, “healthy eaters” were more likely to have health insurance, to watch less television, and not to smoke. Yet the “healthy eaters” had the same rates of obesity as the “less-healthy eaters” and increased rates of diabetes, even after adjustment for age.

These associations don’t necessarily indicate a cause-effect relationship between healthy eating and health problems. But there are other indications that being a “healthy eater” according to US Dietary Guidelines does not result in good health. Despite adherence to “healthy eating patterns” as determined by the USDA Food Pyramid, African American children remain at higher risk for development of diabetes and prediabetic conditions, and African American adults gain weight at a faster pace than their Caucasian counterparts [7,8].

Adjusted 20-year mean weight change according to low or high Diet Quality Index (DQI) scores [8]

In this landmark study by Zamora et al., “healthy eaters” (with a high DQI) were compared to “less-healthy eaters” (with a low DQI). Everyone (age 18-30 at baseline) gained weight over time; the slowest gainers—white participants who were “healthy eaters”—still gained a pound a year. More importantly, however, for blacks, being a “healthy eater” according to our current high-carbohydrate, low-fat recommendations actually resulted in more weight gain over time than being a “less healthy eater,” an outcome predicted by known differences in carbohydrate metabolism between blacks and whites [9].

Clearly, we need to expand our knowledge of how food and nutrients interact with different genetic backgrounds by specifically studying particular racial and ethnic subpopulations. Social equality does not negate small but significant differences in biology. But it won’t matter how much diversity we build into our study populations if the conclusions arrived at through science are discarded in favor of maintaining public health nutrition messages created when most human beings studied were of the adult, mostly white, mostly male variety.

Right now, the racial demographics of the participants in an experimental trial or an observational study dataset don’t matter, and the reason they don’t is that the science doesn’t matter. What really matters? Maintaining a consistent public health nutrition message—regardless of its effect on the health of the population—because a consistent message means never having to say you’re sorry for 35 years of failed nutritional guidance.

*ARIC – Atherosclerosis Risk In Communities (1987), 73% white; MESA – Multi Ethnic Study of Atherosclerosis (2000), 38% white, 28% African American, 12% Chinese, 22% Hispanic; CARDIA – Coronary Artery Risk Development in Young Adults (1985), 50% black, 50% white; SHS – Strong Heart Study (1988), 100% Native American; BWHS – Black Women’s Health Study (1995), 100% black women.

References:

1. Lee S, Harnack L, Jacobs DR Jr, Steffen LM, Luepker RV, Arnett DK. Trends in diet quality for coronary heart disease prevention between 1980-1982 and 2000-2002: The Minnesota Heart Survey. J Am Diet Assoc. 2007 Feb;107(2):213-22.

2. Hu FB, Stampfer MJ, Manson JE, Grodstein F, Colditz GA, Speizer FE, Willett WC. Trends in the incidence of coronary heart disease and changes in diet and lifestyle in women. N Engl J Med. 2000 Aug 24;343(8):530-7.

3. Fung TT, Chiuve SE, McCullough ML, Rexrode KM, Logroscino G, Hu FB. Adherence to a DASH-style diet and risk of coronary heart disease and stroke in women. Arch Intern Med. 2008 Apr 14;168(7):713-20. Erratum in: Arch Intern Med. 2008 Jun 23;168(12):1276.

4. Briefel RR, Johnson CL. Secular trends in dietary intake in the United States. Annu Rev Nutr. 2004;24:401-31.

5. Broad, WJ. NIH Deals Gingerly with Diet-Disease Link. Science, New Series, Vol. 204, No. 4398 (Jun. 15, 1979), pp. 1175-1178.

6. American Medical Association. Dietary goals for the United States: statement of The American Medical Association to the Select Committee on Nutrition and Human Needs, United States Senate. R I Med J. 1977 Dec;60(12):576-81.

7. Lindquist CH, Gower BA, Goran MI. Role of dietary factors in ethnic differences in early risk of cardiovascular disease and type 2 diabetes. Am J Clin Nutr. 2000 Mar;71(3):725-32.

8. Zamora D, Gordon-Larsen P, Jacobs DR Jr, Popkin BM. Diet quality and weight gain among black and white young adults: the Coronary Artery Risk Development in Young Adults (CARDIA) Study (1985-2005). American Journal of Clinical Nutrition. 2010 Oct;92(4):784-93.

9. Hite AH, Berkowitz VG, Berkowitz K. Low-carbohydrate diet review: shifting the paradigm. Nutr Clin Pract. 2011 Jun;26(3):300-8. Review.

Why Fat is Still a Feminist Issue

Sing along when the chorus rolls around (with apologies to Helen Reddy):

Yes I ate brown rice
And anything whole grain
Yes I’ve exercised
And look how much I’ve gained
If I have to, I won’t eat anything
I am fat
I am invisible
I am WOMAAAAAAAN!

The United Nations declared 1975 to be International Women’s Year. Unfortunately, we haven’t really come a long way, baby, since then. Right now, I’m going to sidestep the whole media-generated body image issue, the glass labyrinth, the mommy wars, the “strong is the new sexy” idea (which somehow won out over my own personal favorite “smart is the new sexy” with campaign ads of slightly-unwashed-looking ladies without pedicures huddled over lab benches) and all the other complexities of contemporary feminist theory, and just focus on one little segment of how our national nutrition recommendations might have sucked the life out of women in general for the past 30-plus years.

We’ve been acting like the whole low-fat/low-glycemic/low-carb/paleo/whatever nutrition argument is a PubMed duel between scientists, and the fact that we are surrounded by lousy, nutrient-poor, cheap food is the fault of the Big Evil Food Industry. Let’s focus our attention on where responsibility for the current health crisis in America really belongs: on short-sighted, premature, poorly-designed (albeit well-intentioned) public health recommendations that were legitimized with the 1977 Dietary Goals for Americans and institutionalized as US policy beginning with the 1980 Dietary Guidelines for Americans.  Yes, fat is still a feminist issue.  But I’m not talking about body fat.

The scientific underpinnings for these recommendations came primarily from studies done with white men. And although the science conducted on these white guys was generally inconclusive, the white guys in Washington—in an attempt to prevent what they saw as a looming health crisis in America—recommended that Americans consume a diet high in carbohydrates and low in fat. And although these premature recommendations have certainly not prevented any health crises in America (the opposite appears to be true; see Public Health Nutrition’s Epic Fail), they’ve also had serious repercussions in other respects for the rest of us, i.e., those of us who are not white men. [Please don’t take this as an “I hate white guys” thing; I love white guys. I gave birth to two of them.] I’m going to get into the “not white” part of the equation in another post (perhaps unimaginatively titled Why Nutrition Is a Racial Issue), but let me focus just on the “not men” part.

For those of us who are not men (and mostly not poor and not part of a minority group), the 1970s brought us Charlie’s Angels and the Bionic Woman. Women were given the message that we should be able to do and have “it all” (whatever “it all” was). The expectation was that you could “bring home the bacon, fry it up in a pan” and be thin, gorgeous, and sexy (and white) while you did it.

[circa 1980]

Only now bacon (and eggs for that matter) was forbidden, and as the eighties evolved into the nineties, breakfast became granola bars or rice cakes, nibbled virtuously while we drove the kids to school on our way to the job where we got paid less than the men with whom we worked. All the while, we were convinced that we could continue to fit into our tailored power suits by eating a diet that wasn’t designed with our health in mind.

[bacon eggs frowny face, circa 1984]

As with nearly every other aspect in the fight for equal opportunities and treatment, our health as women was based on a single shiny little myth: success would come to those who were willing to work hard, sacrifice, and follow the rules. Airbrushed media images of buns of steel and boobies of plastic sold a diet-exercise message based on an absurdly crude formula—”calories in, calories out”— with one simple rule that would guarantee success: “eat less and move more.”

So we did. We ate less and exercised more and got tired and hungry and cranky—and when all that work didn’t really work in terms of giving us the bodies we were told we should have, we bought treadmills and diet pills, Lean Cuisines and leg warmers. We got our health advice from Jane (“feel the burn”) Fonda and Marie (“I’m a little bit country”) Osmond. We flailed through three decades of frustration, culminating— unsurprisingly enough—in the self-flagellation of Spanx® and the aptly-named Insanity®.

[Jane Fonda circa 1982]

Some of us “failed” by eating more (low-fat, high-carb) food and getting fat, and some of us “succeeded” by developing full-blown eating disorders, and some of us fought the battle and won sometimes and lost other times and ended up with closets full of size 6 (“lingering illness”) to size 26 (“post pregnancy number 3”) clothes. Most of us—no matter what the result—ended up spending a great deal of time, money, and energy trying to follow the rules to good health with the deck stacked against us. If we got fat, we blamed ourselves, and if we didn’t get fat it was because we turned our lives into micromanaged, most-virtuous eater/exerciser contests. Either way, our lives were reduced, distracted, and endlessly unsatisfying.  We were hungry for more in so many ways and aching for rest in so many others, but our self-imposed denial and exhaustion allowed us to control, at least for a bit, the one thing we felt like we could control, that we’d fought to be able to control:  our bodies.

We stopped cooking and started counting. We stopped resting and playing and started exercising. We stopped seeing food as love and started seeing it as the enemy. We didn’t embrace these bodies that were finally, tenuously, ours; we fought them too.

Access to high quality nutrition has always been divided along gender lines [1].  There was a time–not that long ago–in our world when men, by virtue of their size, stature, place as breadwinner (i.e. because of their “man-ness”) were entitled to a larger piece of meatloaf than their sisters (a practice that persists in many cultures still).  How many of us (of a certain age) have heard, “Let your brother have the last piece of chicken, he’s a growing boy”?  Now–conveniently–women would do their own restricting.  Gloria Steinem, with a fair amount of prescience that seems to anticipate the epigenetic contributions of diet to obesity, noted in her 1980 essay The Politics of Food:*

“Millions of women on welfare eat a poor and starchy diet that can permanently damage the children they bear, yet their heavy bodies are supposed to signify indulgence.  Even well-to-do women buy the notion that males need more protein and more strength.  They grow heavy on sugar and weak on diets . . . Perhaps food is still the first sign of respect–or the lack of it–that we pay to each other and to our bodies.”

Dieting and exercising not only provided a massive distraction and timesuck for women; they also helped maintain a social order that the feminist movement otherwise threatened to undermine, one where women were undernourished and overworked, in a word: weak.

And when the scientists finally got around to testing the whole low-fat thing on (80% white) women? The verdict, published in 2006, looked like this:

The results, published in the Journal of the American Medical Association, showed no benefits for a low-fat diet. Women assigned to this eating strategy did not appear to gain protection against breast cancer [2], colorectal cancer [3], or cardiovascular disease [4]. And after eight years, their weights were generally the same as those of women following their usual diets [5].

But it was too late. We’d raised a generation of daughters who look at us and don’t want to be us, but they don’t know how to cook and they don’t know what to believe about nutrition and they too are afraid of food. Some end up drinking the same Kool-Aid we did, except that—in the hubris of a youth that doesn’t contain hallucination-inducing sleep deprivation from babies and/or stress and/or a career on life-support, where diet and exercise and rest are, like Peter Frampton’s hair, a dim memory—they think they will succeed where we failed. Or maybe they’ve found the vegan-flavored or paleo-flavored Kool-Aid. But they are still counting and exercising and battling.

White women have been [irony alert] scientifically proven to be more likely to closely follow the high-carb, low-fat dietary ideal set forth by the Dietary Guidelines than any other demographic [6]. (Black guys—who may not be all that convinced that rules created by the US government are in their best interests, given some history lessons—are likely to have the lowest adherence.) White women apparently are really good at following rules that were not written with them in mind and which have not been shown to offer them any health benefits whatsoever (but which have proven immensely beneficial for the food and fitness—not to mention pharmaceutical—industries). The best little rule-followers of all are the dietitians of the Academy of Nutrition and Dietetics (87% white women), who heartily endorsed the 2010 Dietary Guidelines, which reinforced and reiterated 30 years of low-fat, high-carb dogma despite the Harvard-based science that demonstrated that it offered no benefits to women. (Interesting tidbit: The Academy of Nutrition and Dietetics has elected two male presidents in the past decade despite the fact that men make up only 5% of the membership. My husband thinks the organization has “daddy issues.”)

In 2010, a study in the Journal of the American Medical Association concluded that women of normal weight (that’s less than 40% of us, by the way) who wanted to stay that way “while consuming their usual diet” (i.e., low-fat, high-carb) would have to exercise for an hour a day.

[Other reassuring conclusions from that study: There was an overall weight gain over the 13-year time frame. Exercising for anything less than 7 hours per week was associated with weight gain over time. If a woman was already fat, increased exercise was more likely to be related to increased weight than weight loss.  If these messages don’t scream to women all over America, “GIVE UP NOW!!!” I don’t know what would. By the way, those of us who go out and skip and jump and run because we like to and it makes our hearts truly happy are not exercising. We’re playing. I love to wave at those women from my couch.**]

But let’s get back to that hour a day for just a second.

Take a look at a recent study by Dr. David Ludwig, out of Harvard. It demonstrated that people who had recently been dieting (something that would apply to almost every woman in America), and were eating a low-fat diet, had to add an hour a day of exercise in order to keep their “calories in, calories out” balanced, while those on a reduced-carbohydrate diet expended that same amount of energy just going about their business.
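To see how that claim cashes out, here is a minimal back-of-the-envelope sketch. Both inputs (an energy-expenditure gap of roughly 300 calories a day between the diets, and a net burn of about 5 calories per minute of moderate exercise) are my own illustrative assumptions, not numbers quoted from the study:

```python
# Back-of-the-envelope sketch of the "hour a day" claim. Both inputs are
# illustrative assumptions, not figures taken from the study itself.
tee_gap_kcal_per_day = 300   # assumed expenditure gap: low-fat vs. reduced-carb
net_burn_kcal_per_min = 5    # assumed net cost of moderate exercise (brisk walking)

minutes_needed = tee_gap_kcal_per_day / net_burn_kcal_per_min
print(f"Extra exercise needed on the low-fat diet: ~{minutes_needed:.0f} minutes/day")
# ~60 minutes/day: the low-fat eater works out for an hour just to match
# what the reduced-carb eater expends going about her business.
```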

What if all the women in the world who have been unsuccessfully battling the bulge woke up tomorrow morning and said:

I want my hour a day back?

Those of us who do not want to exercise for an hour a day just to maintain our weight, or for whom exercise isn’t doing a damn thing except making us hungry and cranky and tired while we gain weight, don’t have to. Instead, we can eat fewer of those USDA/HHS/dietitian-pushed, nutritionally-pathetic, low-fat whole-grain carbohydrate foods and more truly nourishing food and do whatever we please with that extra hour.

Who knows what changes we can make to a world that desperately needs our help?  In America alone, this would mean giving around–ooh let’s just say–50 million adult women an extra hour a day. That’s an extra 365 hours a year per woman, an extra 18 billion hours of womanpower a year total.

We could stop exercising and start playing. Stop counting calories and start enjoying feeling nourished. Start putting the love back into our food and embracing the bodies we have and the bodies of the men, women, and children all around us. I know that some of us would find that hour well spent just napping. Others of us might use that hour to figure out how to dismantle the system that stole it from us in the first place.

I can bring home the bacon, fry it up in a pan. And eat it.

******************************************************************************

In my own personal celebration of Asskicking Women of Food, I think (I hope) my next post will be:  The Grande Dames (Goddesses? Queens?) of Nutrition

*Thanks to Gingerzingi for bringing this to my attention.  What a great essay–look for it in a collection entitled Outrageous Acts and Everyday Rebellions.

**I have absolutely nothing against activities that bring inner/outer strength and happiness.  But exercise in the 80s and 90s was not about being happy or strong–it was about punishing ourselves (feel the burn? seriously?) in order to win at a game–being in total control of everything in our lives from babies to bodies to boardrooms–whose rules were created within the very social construct we were trying to defeat.

References:

1.  Bentley, Amy (1996) Islands of Serenity: Gender, Race, and Ordered Meals during World War II. Food and Foodways 6(2):131-156.

2. Prentice RL, Caan B, Chlebowski RT, et al. Low-fat dietary pattern and risk of invasive breast cancer: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:629-42.

3. Beresford SA, Johnson KC, Ritenbaugh C, et al. Low-fat dietary pattern and risk of colorectal cancer: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:643-54.

4. Howard BV, Van Horn L, Hsia J, et al. Low-fat dietary pattern and risk of cardiovascular disease: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:655-66.

5. Howard BV, Manson JE, Stefanick ML, et al. Low-fat dietary pattern and weight change over 7 years: the Women’s Health Initiative Dietary Modification Trial. JAMA. 2006; 295:39-49.

6.  Sijtsma FP, Meyer KA, Steffen LM et al.  Longitudinal trends in diet and effects of sex, race, and education on dietary quality score change: the Coronary Artery Risk Development in Young Adults study. Am J Clin Nutr. 2012 Mar;95(3):580-6. Epub 2012 Feb 1.

Just Asking the Question

[Chart: prevalence of overweight, obesity, and extreme obesity among US adults, 1976–1980 through 2007–2008]

So wouldn’t it be cool if we could ask folks on the street what they think caused the obesity crisis, and then show them this and ask them again?

Now back to your regularly scheduled blog.

Data from:  Centers for Disease Control and Prevention (CDC).  National Center for Health Statistics, Division of National Health and Nutrition Examination Surveys.  Prevalence of Overweight, Obesity, and Extreme Obesity Among Adults: United States, Trends 1976–1980 Through 2007–2008. 

N of 1 Nutrition Part 2: Biochemistry and Nutrition Policy – The Great Divorce

Full disclosure: I happen to love biochemistry. I have a favorite transcription factor (ChREBP) and a favorite neurotrophic factor (BDNF). I think proteins are beautiful. If I were a biochemist who had discovered a novel protein, I would carry a picture of it around with me in my wallet.

An absolutely fabulous (looking) protein.

The animal and cell models used in biochemistry are great for looking at genetics, epigenetics, and biological mechanisms, and at how these things interact. We can manipulate these models in ways that we can’t with humans, and this has given us some crucial insights into mechanisms, especially neural and epigenetic ones—critical to understanding the effects of nutrition—that would be virtually impossible to study in humans.

Nutritional biochemistry can also wear the mantle of “objective-er than thou” when it comes to science. As one of the biochem profs at UNC noted: If you have to use statistics to discuss the results of your experiment, you need to redesign your experiment. Sure, the questions asked, the interpretation of results, and what gets published in biochem are influenced by funding sources, social/scientific contexts and dominant paradigms. But unless you are a truly bad scientist, you can’t make the experimental results come out in a way that supports your hypothesis.

(This is in marked contrast to observational studies in nutrition epidemiology where the whole point of the data analysis “experiment” is to find results that support your hypothesis. Sometimes you don’t find them, and those findings should be reported, although they may not be because who’s to know?  Just you and your SAS files. My point is that you are actively seeking results that confirm a particular idea, and this just might influence what “results” are found. More on this in another post.)
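For the skeptics, here is a toy simulation (entirely my own construction, not anyone’s actual analysis) of why fishing through a big observational dataset will reliably turn up supportive associations even in pure noise:

```python
# A toy demonstration of hypothesis-hunting in pure noise: test enough
# unrelated "exposure"/"outcome" pairs and some will look significant.
import random

random.seed(42)

def noise_pvalue(n=200, permutations=200):
    """Correlate two unrelated random variables; return an approximate
    p-value for the correlation via a permutation test."""
    exposure = [random.gauss(0, 1) for _ in range(n)]
    outcome = [random.gauss(0, 1) for _ in range(n)]

    def corr(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den

    observed = abs(corr(exposure, outcome))
    hits = sum(
        abs(corr(exposure, random.sample(outcome, len(outcome)))) >= observed
        for _ in range(permutations)
    )
    return hits / permutations

tests = 100  # say, 100 food-frequency items against one disease outcome
false_hits = sum(noise_pvalue() < 0.05 for _ in range(tests))
print(f"{false_hits} of {tests} pure-noise 'associations' reached p < 0.05")
# Expect roughly 5 "findings" per 100 tests even when no real relationship
# exists--and those are the ones that get written up.
```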

But beyond the utility and elegance of nutritional biochemistry, the problems with regard to health policy are two-fold.

The first problem: In many ways, nutrition policy has become almost completely divorced from the basic science investigations done in biochemistry. The Dietary Guidelines Advisory Committee (DGAC)—the committee of scientists that, at least theoretically, reviews the science upon which the US Dietary Guidelines are based—started in 1985 as mostly MDs and biochemistry professors. As time went on, the DGAC became more heavily populated with epidemiologists. This would be fine if epidemiology were meant to generate conclusive (or even semi-conclusive) results. It isn’t. Epidemiology gives us associations and relationships that are meant to be understood through a reasonably plausible, preferably known, biological mechanism. Note these interesting conclusions from the 2010 DGAC Report and the 2010 Dietary Guidelines policy document with regard to dietary cholesterol:

Here’s our mechanism: “Exogenous, or dietary, cholesterol down-regulates cholesterol synthesis in the liver to maintain cholesterol balance.”
[D3-1, Reference 1, emphasis mine]

Here’s our epidemiology: “Traditionally, because dietary cholesterol has been shown to raise LDL cholesterol and high intakes induce atherosclerosis in observational studies, the prevailing recommendation has been to restrict dietary cholesterol intake, including otherwise healthy foods such as eggs.”
[D3-2, Reference 1, emphasis mine, “induce”? really? how does one “observe” that cholesterol “induces” atherosclerosis? I’m assuming committee fatigue had set in at this point because that word should have been “are associated with”]

Here’s our policy recommendation: “Consume less than 300 mg per day of dietary cholesterol.”
[Ch. 3, p. 21, Reference 2]

See, wasn’t that easy?

This brings me to the second problem, which is sort of the flip-side of the first: Biochemical processes that are understood primarily through mouse or cell models only work as the basis for dietary recommendations for chronic disease if you’re making them for cells or mice.

As one of my favorite professors in the Nutrition department likes to quip, “We know how to cure obesity—in mice. We know how to cure diabetes—in mice. We have all the knowledge we need to keep our rodent population quite healthy.” Obviously this knowledge has not been translatable to humans. In some ways, basic nutrition biochemistry should be divorced from public health policy.

The reason for this is that the equivalency of animal models to humans is limited in ways that go beyond simple biological comparisons—although the biological differences are significant.

Mouse large intestinal tract, courtesy of Comparative Anatomy and Histology: A Mouse and Human Atlas, edited by Piper M. Treuting, Suzanne M. Dintzis

My knowledge of comparative physiology is limited at best, but my understanding is that most rodents used in nutrition biochemistry work (rats included) have a cecum (an intestinal pouch that facilitates the breakdown of cellulose), an adaptation that would be necessary in a diet composed of hard-to-digest plant material such as seeds and grains. Because this process is not terribly efficient, many rodents also recycle nutrients by eating their feces. Humans don’t have a functional cecum for fermentation; we don’t tend to reingest our own poops (or anyone else’s poop, unless you’re starring in a John Waters film) in order to extract further nutrition from them as our bodies are already very efficient at this during the first go-round.

Furthermore, due to inherent differences in physiology, animals may not accurately model the physiological conditions that produce disease in humans. For example, in some species of rodents, a high-fat diet will induce insulin resistance, but there is no definitive evidence that higher fat intake per se impairs insulin sensitivity in humans [3]. Why this is so is not entirely clear, but it likely has something to do with the diet each species has consumed throughout its evolution. In a natural setting, rodents may do well on a diet of mostly grains. On the other hand, humans in a natural setting would do okay on a diet of mostly rodents.

What is more critical is that animal and cell life can’t imitate the complex environmental inputs that humans encounter throughout their lives and during each day. Animals and cells only get to consume what they are given. If you’ve ever been at a conference where the breakfast is low-fat muffins, whole grain bagels, fat-free yogurt, orange juice, and fruit, you know what that feels like. But typically our food choices are influenced by a multitude of factors. Mice, unlike humans, cannot be adversely affected by labeling information on a box of Lucky Charms.

Mice don’t know that whole grains are supposed to be good for you.
Bad on them.

Does that matter? You bet it does.

Where do most Americans get their nutrition information these days? From media sources including the internet, from their grocery stores, from the packages holding the food they buy. People who have never read a nutrition book, much less the actual Dietary Guidelines, still “know” fat is bad and whole grain is good [4, 5]. These environmental exposures affect food choices. Whether or not the person still decides to consume food with a high fat content depends on another set of cultural factors that might include socioeconomic status, education, race or ethnicity, age, gender—in other words, things we can’t even begin to replicate in animal or cell models.

Human biochemistry is unique and complex, as are our social and cultural conditions, making it very difficult to study how these primary contributors to health and food choices are related to each other.

Can we do a better job with nutritional epidemiology? I know you’re on the edge of your seat waiting for the next episode in the unfolding drama, N of 1 Nutrition, when we get to hear Walter Willett say:

“I never met a statistical man I didn’t like.”

Stay tuned.

References:

1. U.S. Department of Agriculture. Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans 2010. Accessed July 15, 2010. http://www.cnpp.usda.gov/DGAs2010-DGACReport.htm

2. U.S. Department of Agriculture and U.S. Department of Health and Human Services. Dietary Guidelines for Americans, 2010. http://www.cnpp.usda.gov/DGAs2010-PolicyDocument.htm Accessed January 31, 2010

3. Report of the Panel on Macronutrients, Subcommittees on Upper Reference Levels of Nutrients and Interpretation and Uses of Dietary Reference Intakes, and the Standing Committee on the Scientific Evaluation of Dietary Reference Intakes. Dietary Reference Intakes for Energy, Carbohydrate, Fiber, Fat, Fatty Acids, Cholesterol, Protein, and Amino Acids (Macronutrients). Washington, DC: The National Academies Press; 2005.

4. Eckel RH, Kris-Etherton P, Lichtenstein AH, Wylie-Rosett J, Groom A, Stitzel KF, Yin-Piazza S. Americans’ awareness, knowledge, and behaviors regarding fats: 2006-2007. J Am Diet Assoc. 2009 Feb;109(2):288-96.

5. Marquart L, Pham AT, Lautenschlager L, Croy M, Sobal J. Beliefs about whole-grain foods by food and nutrition professionals, health club members, and special supplemental nutrition program for women, infants, and children participants/State fair attendees. J Am Diet Assoc. 2006 Nov;106(11):1856-60.

N of 1 Nutrition Part 1: Same Old Tools

I’ve been thinking a lot about tools lately.  This actually has nothing to do with the ongoing fascinating-in-a-train-wreck-sort-of-way paleo soap opera, although I have been reading Audre Lorde’s essay “The Master’s Tools Will Never Dismantle the Master’s House” and loving it.  I have all kinds of things to say about feminism and nutrition (yeah, I’m going to go there), but there are all kinds of tools and we’re going to have to talk about all of them eventually.  Today, I’ll start with the scientific kind.

At Ancestral Health Symposium 2012 there was, among other things, a great deal of discussion about what diet works “best”: primal, paleo, neopaleo (my friend Andrea invented that one), safe starch, low-carb, no-carb, etc. The reality is that, in terms of being able to make sweeping generalizations about which dietary pattern will work best for everyone, we as nutrition scientists and clinicians actually sorta suck. Other than describing very general recommendations for essential nutrition—amino acids, fatty acids, vitamins, and minerals, and even these have wide variability in individual requirements—we simply do not have the skills, the tools, or the knowledge to make sweeping dietary recommendations that do not come with the very real possibility of unintended negative consequences for an individual who might follow them.

Choline is a great example of what happens when you mix individual variation with universal recommendations:

Although our body makes some choline, we still require a dietary supply of this important nutrient.* Eggs are a primary source of dietary choline. The past 30 years of Dietary Guidelines have frightened us into reducing egg consumption and/or using egg substitutes that replace the yolk (where the choline is) with soybean oil in order to prevent heart disease, even though dietary cholesterol has little effect on serum cholesterol [1] and our average cholesterol intake is below recommended levels and has been for 40 years [2]. Nevertheless, egg yolks, a recent headline screamed, are as bad for you as cigarettes.

In response to these scare tactics, Americans have dramatically reduced their egg consumption [3]. As a result, average choline consumption does not meet current recommended standards; less than 4% of women even reach adequate intake levels [4, 5].

This is bad enough, but these adequate intake levels were based on a small study done on adult white males; standards for everyone else, including children, were extrapolated from those results [6]. Post-menopausal females, pregnant women, children, and people with certain genetic polymorphisms (which may exist in more than 50% of the population) may actually have increased needs for choline above and beyond the adequate intake level [7].

It’s hard to say exactly how large the gap between intake and actual needs is for these subpopulations, but I can hazard a guess that as long as whole eggs are discouraged as part of our diets, it will only continue to widen. The fact that dietary choline is needed for the development of brain cells seems rather ironic in the face of such goofiness.
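To put some rough numbers on that gap, here is a quick sketch; the inputs (the IOM adequate intake values, USDA’s approximate choline figure for a large egg, and a hypothetical average intake) are mine, supplied for illustration, not data from the studies cited above:

```python
# Rough scale of the egg-choline connection; all inputs are illustrative.
ai_mg = {"adult women": 425, "adult men": 550, "pregnant women": 450}
egg_choline_mg = 147       # approximate choline in one large egg, yolk included
assumed_intake_mg = 270    # hypothetical average daily intake, illustration only

for group, ai in ai_mg.items():
    gap = ai - assumed_intake_mg
    eggs = gap / egg_choline_mg
    print(f"{group}: ~{gap} mg/day short of the AI, or about {eggs:.1f} eggs/day")
# One or two whole eggs a day closes most of this assumed gap--which is
# exactly the food three decades of cholesterol warnings pushed off the plate.
```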

Brain food? Or death by cholesterol?

When dietary guidance shifted from being about provision of basic nutrition to prevention of chronic disease, we found ourselves using tools that were designed to examine diseases of nutrition deficiency (i.e. diseases with one fairly straightforward cause), to now make recommendations about chronic diseases with long, complex, multi-factorial origins [8]. Everyone deprived of Vitamin C will eventually develop scurvy, but not everyone who avoids cholesterol will also avoid heart disease.  Chronic diseases that result from a complex interplay between the individual and environment are difficult—if not impossible—to examine using our current tools and methods, and assessing an individual’s risk of heart disease and tailoring dietary guidance accordingly is much different from making population-wide recommendations to avoid a food–in this case, eggs–that is a primary source of an essential nutrient.

Our current approach takes the complex reality that is one individual human living his/her life and

  • dials into a discrete mechanism within this complex unit using cell cultures and animal models that can’t even begin to describe the physiological, psychological, and cultural context of a whole complicated individual (nutritional biochemistry), or
  • lumps a complicated individual into a pile with a lot of other complicated individuals and uses a fancy schmantzy computer program or a highly-controlled artificial experimental protocol to paint a simplified, homogenized broad brush stroke of a picture that bears little resemblance to the reality of any of the specific individuals it is supposed to describe (nutrition epidemiology), and then
  • turns these overly-simplified, homogenized descriptions into one-size-fits-all nutrition policy that has never actually been shown to work.

From reality to policy: Four perspectives on nutrition

Everyone is subject to the same biochemical rules—and it’s great to learn more about how these rules work on a mechanistic level—but how those rules play out in any given individual is difficult to predict. Is there a way to use the focus of an experimental intervention without losing the environmental influences present in observational studies, and still create something that will eventually translate into meaningful policy?

Maybe. In the next few posts, I’ll take on some of the shortcomings in our current methodology and explore an approach that may help move nutrition science, and thus nutrition policy, into the 21st century.
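As a teaser, here is a minimal sketch of the kind of single-subject experiment the phrase “N of 1” points toward: an ABAB crossover in which one person alternates two diets in blocks and compares a biomarker across blocks. The design and the readings are my own illustration, not a protocol from any of these posts:

```python
# An ABAB crossover for one person: alternate two diets in blocks and
# compare a biomarker across blocks. The subject is her own control.
import statistics

# Hypothetical daily fasting-glucose readings (mg/dL), 7-day blocks:
# A = usual diet, B = test diet, each repeated once.
blocks = {
    "A1": [104, 101, 106, 103, 105, 102, 107],
    "B1": [97, 95, 98, 96, 94, 97, 95],
    "A2": [103, 105, 102, 104, 106, 103, 105],
    "B2": [96, 94, 97, 95, 96, 94, 95],
}

a_days = blocks["A1"] + blocks["A2"]
b_days = blocks["B1"] + blocks["B2"]
effect = statistics.mean(a_days) - statistics.mean(b_days)
print(f"Within-person difference (A minus B): {effect:.1f} mg/dL")
# Repeating the A and B phases is what separates a real diet effect from a
# trend that would have happened anyway--no population average required.
```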

*Choline acts as a methyl donor in pathways involving gene expression and other metabolic functions; as an important contributor to structural integrity and signaling function in cell membranes, especially those involved in nervous tissue and brain development; as a necessary constituent of lipid metabolism and transport, including VLDL required for the export of fat from the liver; and as the precursor to the neurotransmitter, acetylcholine.

References:

1. Willett W. Nutritional Epidemiology. 2nd ed. New York: Oxford University Press; 1998.

2. U.S. Department of Agriculture. Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans 2010. Accessed July 15, 2010. http://www.cnpp.usda.gov/DGAs2010-DGACReport.htm

3. U.S. Dept. of Agriculture, Office of Communications. 2001-2002 Agriculture Fact Book. Washington, DC:2003.

4. Jensen H. Choline in the diets of the US population: NHANES, 2003-2004. The FASEB journal: official publication of the Federation of American Societies for Experimental Biology. 2007;21(Meeting Abstract Supplement):lb219.

5. Moshfegh A. Usual Nutrient Intakes of Americans. USDA Whitten Building; 2009.

6. Dietary Reference Intakes for Thiamin, Riboflavin, Niacin, Vitamin B6, Folate, Vitamin B12, Pantothenic Acid, Biotin, and Choline [Internet]. [cited 2012 May 21]. Available from: http://books.nap.edu/openbook.php?record_id=6015

7. Zeisel SH, da Costa K-A. Choline: An Essential Nutrient for Public Health. Nutr Rev. 2009 Nov;67(11):615–23.

8. Harper AE. Killer French Fries. Sciences 1988, 28 (Jan/Feb): 21-27.


From Paleo to Public Health: We have met the enemy and we are them

Believe it or not, when I started this blog post, I wasn’t even thinking about the current sturm und drang in the paleo community. If you follow the paleo world gossip, you already know about it; if you’re not, this cartoon from xkcd.com says it all:

So—speaking of drama—social change stories are often built around drama triangles—also called triangles of power. In these triangles, there are three roles: victim, perpetrator, and rescuer. These roles can morph and change over time and depending on who is telling the story or who the audience is. In addition, a person or entity can be in more than one role at a time. [Note: This doesn’t mean that anyone actually IS a victim, perpetrator, or rescuer; this is a construct used to describe a social dynamic, not enforce one.]

From the works of Eric Berne and Stephen Karpman.

We can think about this model in regard to the current commotion in the paleo community, but–more to my point–also in regard to the work we may be able to do as a community should we decide to get our collective act together and worry about something larger than ourselves for a while. (Perhaps we’ll need social media group therapy, culminating in a giant Skype conference call, where everybody joins twitter feeds and sings Kumbaya?)

There is value in the power of story-telling; the drama is part of what makes us want to be involved in a cause. We can typically identify with the victim or the rescuer, or both; the perpetrator gives us a bad guy in an undeniably black hat on which to focus our things-we-love-to-hate passion. Policymakers often prefer stories to logical arguments; many of us do. But stories can also create false simplicity and black-and-white reasoning. They can create artificial walls and boundaries. Most dangerously for the nutrition reform movement, these stories can create a lack of respect for those we are trying to help (“We know what is best for you”) and a lack of humility with regard to our own fallibility (“We have the ‘right’ answers this time”).

As nutrition reformers—from paleo to public health—what story are we going to tell?

We must be sensitive in our choice of who we place in the “victim” role. The “victim” is the one that pulls at our heartstrings, that gives the story its emotional weight. I think the real victims in the nutrition reform story are our next generation, the children who are not yet born but who will bear the burdens of a broken food-health system as much of the American public gets caught in a cycle of being misled, misfed, misdiagnosed, and mistreated. These are children who will grow up in a nation where the dream of good health belongs to a fortunate few and slips from the grasp of everyone else despite all good intentions and efforts otherwise. And because these particular victims don’t exist (yet), it saves us from the awkward position of “rescuing” people who don’t consider themselves to be victims.

Some people who are suffering from obesity and poor health today (some of us even) may see themselves as victims and choose to use the sense of outrage at being put in that position to help change the system. But not everyone will choose that role, and I suggest we not take the stance that “poor fat sick people” out there need our help.

It isn’t as if we have a shortage of casualties from the past 30-40 years of USDA/HHS dietary guidance. How about the environment, small farmers, taxpayers, or maybe the scientific integrity of a whole generation of nutrition scientists? In 1978, Dr. Al Harper, from the University of Wisconsin-Madison, warned that the Dietary Goals’ promise of better health for all with no risks, only benefits, had “great potential for undermining both the science of nutrition and nutrition education” [1]. It would seem that to a large extent, he was right. As a nation, we’ve lost a lot in thirty years.

So who is to blame? Hmm. Good question.

Government?

Policymakers doing what policymakers do: making policy.

Well, it is hard to pin this all on a disembodied “government” because the government does what we allow it to do. As long as we the people allowed segregation, it continued. When we decided that segregation was no longer tolerable, laws were created to end it. Changing attitudes will change the institutions that in turn shape attitudes.

It doesn’t make a lot of sense to blame “the government,” when the general public has not developed a mature sense of healthy skepticism towards the government’s ability to protect us from ourselves. When the first Dietary Goals were released by the McGovern Committee in 1977 and the first Dietary Guidelines released by the USDA in 1980, the public could have refused to believe the low-fat-jello-pie-in-the-sky promises, but they didn’t—for reasons that may be more cultural than scientific in nature. I’m not convinced we would do so under similar circumstances today. Although we may now be more wary of the government’s ability to solve our problems, we tend to still hold out a childish hope that it will anyway. [Funny, to me anyway, story: It seems that a number of us who showed up for the paleo-libertarian dinner at AHS2012 were there less because of our libertarian ideals and more because we were happy to have someone else choosing our dinner destination and making reservations for us. Just a touch of irony there.]

In 1977 and in 1980, policymakers were applying the information that they had at the time to a well-intentioned goal of improving the health of all Americans; this is just the type of thing we expect from our policymakers. Did they seem to favor one side of the argument? Sure, but do we really think that—if we were in their position—we could work with complete objectivity? We couldn’t; there is no such thing. As we try to change public opinion and government policy, we will be working under the same constraints of humanness they were, with the only added advantage being that we can learn from the unintended consequences of these good intentions.

Industry?

Low-fat, whole grain, fiber-filled box of food: more nutrition information than actual nutrition.

Should we blame “the food industry”? We could.

Gary Taubes tells the story of one of the staff members of the McGovern Committee being approached by an industry analyst who tells him, “if you think people are going to start eating more broccoli and more kale and spinach because you’ve now put together dietary goals, you’re crazy. What you’ve said is people should eat less fat so the industry is going to jump on this and they’re going to create low fat products and they’re going to label them as heart healthy or whatever and they’re going to be able to carve out a portion of the market for their new products and everyone else is going to have to play catch-up and that’s what they’re going to do and the next thing you know you’re going to have shelf after shelf in the supermarket of junk foods that claim to be low fat and good for your heart.” As Gary Taubes points out, that’s exactly what happened. But is this the fault of industry?

Industry follows the laws of supply and demand, using government recommendations as a marketing tool. Americans were happy to consume the products designed to lower our cholesterol and prevent heart disease then, because we thought doing so would contribute to good health. Now we, as a community hoping to expand our influence out to the rest of America, are happy to consume gluten-free snacks, grass-fed beef, and pemmican—for the exact same reason: we think doing so will contribute to good health. We might have been sold a bill of goods by the food industry in the past 30 years, but by golly, we bought it.

Addressing the economic engines that make our food-health system go around is part of our challenge in shifting the paradigm. Working with the producers, especially the ones at the bottom of the industrialized food chain, and with the retailers, who must meet changing consumer demands—rather than lumping everyone together and clamping a big black hat on the whole thing—is a lot more likely to lead to success.

If there is a lesson to be learned here, maybe it is that we should be cautious about what health information we allow to be used on packaging and marketing, no matter what the nutrition paradigm. I don’t agree with Marion Nestle on much, but I agree with her that a box of food is no place for a tutorial on nutrition.

Science?

The only really bad scientists I know.

What about “bad science”? Isn’t that what got us into this mess?

I get the impression that a lot of us would like to blame “mainstream” nutrition—whoever or whatever that is—and the “bad science” it produces. I would offer some strong caution against this.

We want a different nutrition paradigm—specifically “our” paradigm, whatever that will be—to be “mainstream” one day, but it is a very tenuous position to say “they got it all wrong, but don’t worry, this time we got it right.” All scientists are both trying to make a living and trying to improve the health of Americans. No scientist can control how his/her work is used (or misused) for public health policy. The scientists who have contributed to our current nutritional paradigm have been working—as all scientists do—within a framework shaped by personal experiences, cultural forces, financial pressures, political and career concerns, powerful individuals, and media soundbites. The next generation of scientists will be no different. When scientists are asked to work on committees that create policy, they do, of course, bring to that work a more comprehensive understanding of their own area of study than of an area that offers a competing view. It is the policy-making process, not the scientists themselves, that is responsible for making sure such views are balanced.

In the early years of the Goals and Guidelines, a number of scientists did complain about the prematurity of those recommendations. I think most of us would like to think we’d have been among those skeptics, but I’m not sure that we would. For the most part, the people who then worked in the field of nutrition—dietitians, clinicians, young scientists—embraced these new dietary recommendations as progressive and much needed. Dr. Joanne Slavin told me the story of how the younger generation in her Department of Nutrition at the University of Minnesota thought Dr. Harper (see the quote above) was “behind the times” because he didn’t think it was such a great idea to tell everyone to reduce their fat intake. When we established policy to give an institutional framework to an ideal that was waiting for the science to catch up with it, we failed to prepare for the possibility that we might be wrong. If there is one lesson to learn from the past 30 years of interaction between nutrition science and public health policy, it is that we should prepare for that possibility.

Us?

To a large extent, the cultural forces that shaped our thinking about nutrition (and which in turn helped carry the scientific, policy, and industrial forces forward) were an extension of the culture wars of the 60s and 70s: suits vs. hippies. The suits (maybe the “lab coats”?) were the stodgy pinhead scientists, fiddling away in their labs, waiting to get the science “right,” while the country went to hell in a hamburger. The “hippies” of the McGovern Committee—along with popular figures like Frances Moore Lappé, author of the wildly popular vegetarian cookbook Diet for a Small Planet—saw changing the diet of Americans as a moral imperative that eclipsed concerns over the weak associations between diet and disease outcomes. This gave the low-fat diet an Age of Aquarius glow that offered a shiny new hope for ending chronic disease, and we swallowed it hook, line, and sinker.

Labeled the “barefoot boys of nutrition,” the creators of our first national dietary recommendations were a team of young, energetic, long-haired (for DC anyway)—and not coincidentally, white, well-educated, upper/middle class, and male—idealists hoping to convince Americans to eat a more “natural” diet. That vision belonged to the group’s lead writer, Nick Mottern, who remains a staunch advocate of minimally-processed foods (and who has never, by the way, been a vegetarian) [2,3]. With the exception of the food-from-animals vs. food-from-plants orientation (and I think we have more women in places of influence), how different is the paleo community from these origins?

Or, in the immortal words of Pogo: We have met the enemy and he is us. “Us” is (upper) middle class, well-educated, young white people with an idealistic plan to change the world for the better. Of course I don’t mean you or me personally. We can all find ways to excuse ourselves from this stereotype (I for one can claim that I’m not young—but otherwise, the description pretty much fits me exactly). But there is a lesson to be learned here: in creating an “enemy” to fight in the nutrition revolution, we had better choose very carefully. Let’s choose an “enemy” we actually want to eliminate permanently (i.e., not us).

I suggest that we not make any person, group, entity, or institution either a scapegoat or the enemy. Then who or what is to blame? What do we want to get rid of entirely?

Well, how about poorly-designed policy? Maybe one-size-fits-all guidelines (assuming we can agree that this concept should be eliminated)? Maybe a food-health system that lacks transparency, public involvement, and checks and balances? Maybe we could get rid of the framework that excludes the concept of food culture from any discussions about food policy?

If we can do that, it opens up the last piece of the triangle—the “rescuers”—to anyone who cares about the health of Americans: policymakers, health professionals, the public, food producers and manufacturers, scientists (even the nutrition epidemiologists whose science many of us love to hate), or, umm, maybe even each other. If we can see a place for all of these groups—and for all of us already in the “alternative nutrition” community—in shifting the future of America away from policies that have created little hope for the health of our next generation, we may begin to see them as allies (or at least future allies), rather than enemies. As such, we can enlist their help rather than trying to blame them or defeat them.

Right now I’m thinking we may need to try this out in our own little paleo/low-carb/WAPF/etc. communities first.

1. Harper AE. Dietary goals—a skeptical view. Am J Clin Nutr. 1978;31(2):310-321.

2. Broad WJ. Jump in Funding Feeds Research on Nutrition. Science. 1979;204(4397):1060-1061, 1063-1064.

3. Mottern N. Correspondence.