Dietary Guidelines for Americans: We don’t need no stinkin’ science

I know, I know. I never post. I never call. I don’t bring you flowers. It’s a wonder we’re still together. I have the usual list of excuses:


But before I disappear off the face of the interwebz once again, I thought I'd share with you a quickie post on the science behind our current Dietary Guidelines. Even as we speak, the USDA and DHHS are busy working on the creation of the new 2015 Dietary Guidelines for Americans, which are shaping up to be the radically conservative documents we count on them to be.

For just this purpose, the USDA has set up a very large and impressive database called the Nutrition Evidence Library (NEL), where it conducts “systematic reviews to inform Federal nutrition policy and programs.” NEL staff collaborate with stakeholders and leading scientists using state-of-the-art methodology to objectively review, evaluate, and synthesize research to answer important diet-related questions in a manner that allows them to reach a conclusion that they’ve previously determined is the one they want.

It’s a handy skill to master. Here’s how it’s done.

The NEL question:

What is the effect of saturated fat intake on increased risk of cardiovascular disease or type 2 diabetes?

In the NEL, they break the evidence up into “cardiovascular” and “diabetes” so I’ll do the same, which means we are really asking: What is the effect of saturated fat (SFA) intake on increased risk of cardiovascular disease?

Spoiler alert–here’s the answer: “Strong evidence” indicates that we should reduce our intake of saturated fat (from whole foods like eggs, meat, whole milk, and butter) in order to reduce risk of heart disease. As Gomer Pyle would say, “SUR-PRIZE, SUR-PRIZE.”

Aaaaaaaand . . . here’s the evidence:

The 8 studies rated “positive quality” are in blue; the 4 “neutral quality” studies are in gray. The NEL ranks the studies as positive and neutral (less than positive?), but treats them all the same in the review. Fine. Whateverz.

According to the exclusion criteria for this question, any study with a dropout rate of more than 20% should be eliminated from the review. These 4 studies have dropout rates of more than 20%. They should have been excluded. They weren’t, so we’ll exclude them now.
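Just to make the rule concrete, here is a minimal sketch of what applying that exclusion criterion would look like in code. The study names and dropout rates below are made up for illustration; they are not the actual NEL studies.

```python
# Hypothetical sketch of the NEL exclusion rule described above: any study
# with a dropout rate over 20% should be dropped before the review begins.
# Study names and dropout rates here are illustrative, not real NEL data.

EXCLUSION_THRESHOLD = 0.20  # per the NEL criteria for this question

studies = [
    {"name": "Study A", "dropout_rate": 0.12},
    {"name": "Study B", "dropout_rate": 0.27},  # over threshold: should be excluded
    {"name": "Study C", "dropout_rate": 0.08},
    {"name": "Study D", "dropout_rate": 0.31},  # over threshold: should be excluded
]

included = [s for s in studies if s["dropout_rate"] <= EXCLUSION_THRESHOLD]
excluded = [s for s in studies if s["dropout_rate"] > EXCLUSION_THRESHOLD]

print("Included:", [s["name"] for s in included])
print("Excluded:", [s["name"] for s in excluded])
```

The point, of course, is that the rule is trivially easy to apply; the NEL just didn't apply it.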

Also, according to NEL exclusion criteria for this question, any studies that substituted fat with carbohydrate or protein, instead of comparing types of fat, should be excluded. Furtado et al 2008 does not address the question of varying levels of saturated fat in the diet. In fact, saturated fat levels were held constant–at 6% of calories–for each experimental diet group. So, let’s just exclude this study too.

One study–Azadbakht et al 2007–was conducted on teenage subjects with hypercholesterolemia, a hereditary condition that affects about 1% of the population. Since the U.S. Dietary Guidelines are not meant to treat medical conditions and are meant for the entire population, this study should not have been included in the analysis. So let’s take care of that for those NEL folks.


In one study–Buonacorso et al 2007–total cholesterol levels did not change when dietary saturated fat was increased: “Plasma TC [total cholesterol] and triacylglycerol levels were NS [not significantly] changed by the diets, by time (basal vs. final test), or period (fasting vs. post-prandial) according to repeated-measures analysis.” This directly contradicts the conclusion of the NEL. Hmmmm. So let’s toss this study and see what’s left.

In these four studies, higher levels of saturated fat in the diet made some heart disease risk factors get worse, but other risk factors got better. So the overall effect on heart disease risk was mixed or neutral. As a result, these studies do not support the NEL conclusion that saturated fat should be reduced in order to reduce risk of heart disease.


That leaves one lone study. A meta-analysis of eleven observational studies. Seeing as the whole point of a meta-analysis is to combine studies with weak effects to see if you end up with a strong one, if saturated fat were really strongly associated with heart disease, we should see that, right? Right. What this meta-analysis found was that among women over 60, there is no association between saturated fat and coronary events or deaths. Among adult men of any age, there is no association between saturated fat and coronary events or deaths. Only in women under the age of 60 is there a small inverse association between risk of coronary events or deaths and the reduction of saturated fat in the diet. That sounds like it might be bad news—at least for women under 60—but this study also found a positive association between monounsaturated fats—you know, the “good fat,” like you would find in olive oil—and risk of heart disease. If you take the results of this study at face value–which I wouldn’t recommend–then olive oil is as bad for you as butter.

So there’s your “strong” evidence for the conclusion that saturated fat increases risk of heart disease.


Just recently, Frank Hu of the 2015 Dietary Guidelines Advisory Committee was asked what we should make of the recent media attention to the idea that saturated fat is not bad for you after all (see this video at 1:06:00). Dr. Hu reassured us that, no, saturated fat still kills. He went on to say that the evidence to prove this, provided primarily by a meta-analysis created by USDA staffers (and we all know how science-y they can be), is MUCH stronger than that used by the 2010 Committee.

Well, all I can say is:  it must be.  Because it certainly couldn’t be any weaker.



As the Calories Churn (Episode 2): Honey, It’s Not the Sugar

In the previous episode of As the Calories Churn, we looked at why it doesn’t really make sense to compare the carbohydrate intake of Americans in 1909 to the carbohydrate intake of Americans in 1997. [The folks who read my blog, who always seem to be a lot smarter than me, have pointed out that, besides not being able to determine differing levels of waste and major environmental impacts such as a pre- or early-industrial labor force and transportation, there would also be significant differences in: distribution and availability; what was acquired from hunted/home-grown foods; what came through the markets and ended up as animal rather than human feed; what other ingredients these carbohydrates would be packaged and processed with; and many other issues. So in other words, we are not comparing apples and oranges; we are comparing apples and Apple Jacks (TM).]

America in 1909 was very different from America in 1997, but America in 1970 was not so much, certainly with regard to some of the issues above that readers have raised.  By 1970, we had begun to settle into post-industrial America, with TVs in most homes and cars in most driveways.  We had a wide variety of highly-processed foods that were distributed through a massive transportation infrastructure throughout the country.

Beginning in the mid-1960s, availability of calories in the food supply, specifically from carbohydrates and fats, had begun to creep up. So did obesity. It makes sense that this would be cause for concern for public health professionals and policymakers, who saw a looming health crisis ahead if measures weren’t taken–although others contended that our food supply was safer and more nutritious than it had ever been and that public health efforts should be focused on reducing smoking and environmental pollutants.

What emerged from the political and scientific tug-of-war that ensued (a story for another blog post) were the 1977 Dietary Goals for Americans.  These goals told us to eat more grains, cereals and vegetable oils and less fat, especially saturated fat.

Then, around 1977 – 1980, in other words around the time of the creation of the USDA’s recommendations to increase our intake of grains and cereals (both carbohydrate foods) and to decrease our intake of fatty foods, we saw the slope of availability of carbohydrate calories increase dramatically, while the slope of fat calories flattened–at least until the end of the 1990s (another story for another blog post).

[From food availability data, not adjusted for losses.]

The question is:  How did the changes in our food supply relate to the national dietary recommendations we were given in 1977?  Let’s take a closer look at the data that we have to work with on this question.

Dear astute and intelligent readers: From this point on, I am primarily using loss-adjusted food availability data rather than food availability data. Why? Because it is there, and it is a better estimate of actual consumption than unadjusted food availability data. It only goes back to around 1970, so you can’t use it for century-spanning comparisons, but if you are trying to do that, you’ve probably got another agenda besides improving estimation anyway. [If the following information makes you want to go back and make fun of my use of unadjusted food availability data in the previous post, go right ahead. In case you didn't catch it, I think it is problematic to the point of absurdity to compare food availability data from the early 1900s to our current food system—too many changes and too many unknowns (see above).  On the other hand, while there are some differences, I think there are enough similarities in lifestyle and environment (apart from food) between 1970 and 2010 to make a better case for changes in diet and health being related to things apart from those influences.]

Here are the differences in types of food availability data: 

Food availability data: Food availability data measure the use of basic commodities, such as wheat, beef, and shell eggs, for food products at the farm level or an early stage of processing. Highly processed foods–such as bakery products, frozen dinners, and soups–are not measured directly in their finished form, but the data include their less processed ingredients, such as sugar, flour, fresh vegetables, and fresh meat.

Loss-Adjusted Food Availability: Because food availability data do not account for all spoilage and waste that accumulates in the marketing system and is discarded in the home, the data typically overstate actual consumption. Food availability is adjusted for food loss, including spoilage, inedible components (such as bones in meat and pits in fruit), plate waste, and use as pet food.

The USDA likes to use unadjusted food availability data and call it “consumption” because, well: They CAN and who is going to stop them?

The USDA—and some bloggers too, I think—prefer unadjusted food availability data.  I guess they have decided that if American food manufacturers make it, then Americans MUST be eating it, loss-adjustments be damned. Our gluttony must somehow overcome our laziness, at least temporarily, as we dig the rejects and discards out of the landfills and pet dishes—how else could we get so darn fat?

I do understand the reluctance to use dietary intake data collected by NHANES, as all dietary intake data can be unreliable and problematic (and not just the kind collected from fat people). But I guess maybe if you’ve decided that Americans are being “highly inaccurate” about what they eat, then you figure it is okay to be “highly inaccurate” right back at Americans about what you’ve decided to tell them about what they eat. Because using food availability data and calling it “consumption” is, to put it mildly, highly inaccurate, by a current difference of over 1000 calories.

On the other hand, it does sound waaaaaay more dramatic to say that Americans consumed 152 POUNDS (if only I could capitalize numbers!) per person of added sweeteners in 2000 (as it does here), than it does to say that we consumed 88 pounds per person that year (which is the loss-adjusted amount). Especially if you are intent on blaming the obesity crisis on sugar.

Which is kinda hard to do looking at the chart below.

Loss adjusted food availability:

Calories per day 1970 2010 Change
Total 2076 2534 +458
Added fats and oils 338 562 +224
Flour and cereal products 429 596 +167
Poultry 75 158 +83
Added sugars and sweeteners 333 367 +34
Fruit 65 82 +17
Fish 12 14 +2
Butter 29 26 -3
Veggies 131 126 -5
Eggs 43 34 -9
Dairy 245 232 -13
Red meat* 349 267 -82
Plain whole milk 112 24 -88

*Red meat: beef, veal, pork, lamb
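For anyone who'd rather let a script do the squinting, here is a quick sketch that recomputes the per-category changes from the table above and sorts them, biggest gainers first.

```python
# Recomputing the per-category changes from the loss-adjusted availability
# table above (daily calories per person, 1970 vs. 2010).

availability = {  # category: (1970, 2010)
    "Added fats and oils":         (338, 562),
    "Flour and cereal products":   (429, 596),
    "Poultry":                     (75, 158),
    "Added sugars and sweeteners": (333, 367),
    "Fruit":                       (65, 82),
    "Fish":                        (12, 14),
    "Butter":                      (29, 26),
    "Veggies":                     (131, 126),
    "Eggs":                        (43, 34),
    "Dairy":                       (245, 232),
    "Red meat":                    (349, 267),
    "Plain whole milk":            (112, 24),
}

changes = {k: y2010 - y1970 for k, (y1970, y2010) in availability.items()}

# Biggest movers first: added fats and oils, then flour and cereal products.
for category, delta in sorted(changes.items(), key=lambda kv: -kv[1]):
    print(f"{category:28s} {delta:+d}")

# Grains and cereals added roughly five times the calories that sweeteners did.
ratio = changes["Flour and cereal products"] / changes["Added sugars and sweeteners"]
print(f"Grains-to-sweeteners ratio: {ratio:.1f}")
```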

Anybody who thinks we did not change our diet dramatically between 1970 and the present either can’t read a dataset or is living in a special room with very soft bouncy walls. Why we changed our diet is still a matter of debate. Now, it is my working theory that the changes that you see above were precipitated, at least in part, by the advice given in the 1977 Dietary Goals for Americans, which was later institutionalized, despite all kinds of science and arguments to the contrary, as the first Dietary Guidelines for Americans in 1980.

Let’s see if my theory makes sense in light of the loss-adjusted food availability data above (and which I will loosely refer to as “consumption”).  The 1977 [2nd Edition] Dietary Goals for Americans say this:

#1 – Did we increase our consumption of grains? Yes. Whole? Maybe not so much, but our consumption of fiber went from 19 g per day in 1970 to 25 g per day in 2006 which is not much less than the 29 grams of fiber per day that we were consuming back in 1909 (this is from food availability data, not adjusted for loss, because it’s the only data that goes back to 1909).

The fruits and veggies question is a little more complicated. Availability data (adjusted for losses) suggests that veggie consumption went up about 12 pounds per person per year (sounds good, but that’s a little more than a whopping half an ounce a day), but that calories from veggies went down. Howzat? Apparently Americans were choosing less caloric veggies, and since reducing calories was part of the basic idea for insisting that we eat more of them, hooray on us. Our fruit intake went up by about an ounce a day; calories from fruit reflect that. So, while we didn’t increase our vegetable and fruit intake much, we did increase it. And just FYI, that minuscule improvement in veggie consumption didn’t come from potatoes. Combining fresh and frozen potato availability (adjusted for losses), our potato consumption declined ever so slightly.

#2 – Did we decrease our consumption of refined sweeteners? No. But we did not increase our consumption as much as some folks would like you to think. Teaspoons of added (caloric) sweeteners per person in our food supply (adjusted for waste) went from 21 in 1970 to 23 in 2010.  It is very possible that some people were consuming more sweeteners than other people since those numbers are population averages, but the math doesn’t work out so well if we are trying to blame added sweeteners for 2/3 of the population gaining weight.  It doesn’t matter how much you squint at the data to make it go all fuzzy, the numbers pretty much say that the amount of sweeteners in our food supply has not dramatically increased.
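A quick sanity check on those teaspoons: at roughly 4.2 grams of sugar per teaspoon and about 4 kcal per gram (approximate figures, not official conversion factors), the 2-teaspoon increase works out to about 34 extra calories a day, which squares with the loss-adjusted table.

```python
# Rough check on the sweetener numbers: 21 vs. 23 teaspoons per person per
# day. A teaspoon of sugar is about 4.2 g at roughly 4 kcal/g -- approximate
# figures, not exact USDA conversion factors.

KCAL_PER_TEASPOON = 4.2 * 4  # ~4.2 g sugar per teaspoon at ~4 kcal/g

kcal_1970 = 21 * KCAL_PER_TEASPOON
kcal_2010 = 23 * KCAL_PER_TEASPOON

print(f"1970: ~{kcal_1970:.0f} kcal/day from added sweeteners")
print(f"2010: ~{kcal_2010:.0f} kcal/day from added sweeteners")
print(f"Difference: ~{kcal_2010 - kcal_1970:.0f} kcal/day")
```

About 34 calories a day is not nothing, but it is a strange place to pin a two-thirds-of-the-population obesity epidemic.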

#3 – Did we decrease our consumption of total fat? Maybe, maybe not—depends on who you want to believe. According to dietary intake data (from our national food monitoring data, NHANES), in aggregate, we increased calories overall, specifically from carbohydrate food, and decreased calories from fat and protein. That’s not what our food supply data indicate above, but there you go.

Change in amount and type of calories consumed from 1971 to 2008
according to dietary intake data

There is general agreement, however, from both food availability data and from intake data, that we decreased our consumption of the saturated fats that naturally occur in red meat, eggs, butter, and full-fat milk (see below), and that we increased our consumption of “added fats and oils”–a category that consists almost exclusively of vegetable oils, which are predominantly polyunsaturated and which were added during processing (hence the category title) to foods such as those inexpensive staples, grains and cereals.

#4 – Did we decrease our consumption of animal fat, and choose “meat, poultry, and fish which will reduce saturated fat intake”? Why yes, yes we did. Calories from red meat—the bearer of the dreaded saturated fat and all the curses that accompany it—declined in our food system, while poultry calories went up.

(So, I have just one itty-bitty request: Can we stop blaming the rise in obesity rates on burgers? Chicken nuggets, yes. KFC, yes. The buns the burgers come on, maybe. The fries, quite possibly. But not the burgers, because burgers are “red meat” and there was less red meat—specifically less beef—in our food supply to eat.)

Michael Pollan–ever the investigative journalist–insists that after 1977, “Meat consumption actually climbed” and that “We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.”   In the face of such a concrete and well-proven assumption, why bother even  looking at food supply data, which indicate that our protein from meat, poultry, fish, and eggs  “climbed” by just half an ounce?

In fact, there’s a fairly convenient balance between the calories from red meat that left the supply chain and the calories of chicken that replaced them. It seems we tried to get our animal protein from the sources that the Dietary Goals said were “healthier” for us.

#5 – Did we reduce our consumption of full-fat milk? Yes. And for those folks who contend this means we just started eating more cheese, well, it seems that’s pretty much what we did. However, overall decreases in milk consumption meant that overall calories from dairy fat went down.

#6 – Did we reduce our consumption of foods high in cholesterol? Yes, we did that too. Egg consumption had been declining since the relative affluence of post-war America made meat more affordable and as cholesterol fears began percolating through the scientific and medical community, but it continued to decline after the 1977 Goals.

#7 – Salt? No, we really haven’t changed our salt consumption much and perhaps that’s a good thing. But the connections between salt, calorie intake, and obesity are speculative at best and I’m not going to get into them here (although I do kinda get into them over here).

[Chart: food supply and Dietary Goals]

What I see when I look at the data is a good faith effort on the part of the American people to try to consume more of the foods they were told were “healthy,” such as grains and cereals, lean meat, and vegetable oils. We also tried to avoid the foods that we were told contained saturated fat—red meat, eggs, butter, full-fat milk—as these foods had been designated as particularly “unhealthy.” No, we didn’t reduce our sweetener consumption, but grains and cereals have added nearly 5 times more calories than sweeteners have to our food supply/intake.

Although the America of 1970 is more like the America of today than the America of 1909, some things have changed. Probably the most dramatic change between the America of the 1970s and the America of today is our food-health system. Women in the workplace, more suburban sprawl, changing demographics, increases in TV and other screen time—those were all changes that had been in the works for a long time before the 1977 Dietary Goals came along. But the idea that meat and eggs were “bad” for you? That was revolutionary.

And the rapid rises in obesity and chronic diseases that accompanied these changes? Those were pretty revolutionary as well.

One of my favorite things to luck upon on a Saturday morning in the 70s—aside from the Bugs Bunny-does-Wagner cartoon, “What’s Opera, Doc?“—were the public service announcements featuring Timer, an amorphous yellow blob with some sing-along information about nutrition:

You are what you eat

From your head down to your feet

Things like meat and eggs and fish you

Need to build up muscle tissue

Hello appetite control?

More protein!

Meat and eggs weren’t bad for you. They didn’t cause heart disease. You needed them to build up muscle tissue and to keep you from being hungry!

But in 1984, when this showed up on the cover of Time magazine (no relation to Timer the amorphous blob), I—along with a lot of other Americans—was forced to reconsider what I’d learned on those Saturday mornings not that long ago:

My all-time favorite Timer PSA was this one:

When my get up and go has got up and went,

I hanker for a hunk of cheese.

When I’m dancing a hoedown

And my boots kinda slow down,

Or any time I’m weak in the knees . . .

I hanker for a hunk of

A slab or slice or chunk of–

A snack that is a winner

And yet won’t spoil my dinner–

I hanker for a hunk of CHEESE!

In the 80s, when I took up my low-fat, vegetarian ways, I would still hanker for a hunk of cheese, but now I would look for low-fat, skim, or fat-free versions—or feel guilty about indulging in the full-fat versions that I still loved.

I’m no apologist for the food industry; such a dramatic change in our notions about “healthy food” clearly required some help from them, and they appear to have provided it in abundance.  And I’m not a fan of sugar-sweetened beverages or added sweeteners in general, but dumping the blame for our current health crisis primarily on caloric sweeteners is not only not supported by the data at hand, it frames the conversation in a way that works to the advantage of the food industry and gives our public health officials a “get out of jail free card”  for providing 35 years worth of lousy dietary guidance.

Next time on As the Calorie Churns, we’ll explore some of the interaction between consumers, industry, and public health nutrition recommendations. Stay tuned for the next episode, when you’ll get to hear Adele say: “Pollanomics: An approach to food economics that is sort of like the Field of Dreams—only with taco-flavored Doritos.”

The NaCl Debacle Part 2: We don’t need no stinkin’ science!

Sodium-Slashing Superheroes Low-Sodium Larry and his bodacious side-kick Linda “The Less Salt the Better” Van Horn team up to protect Americans from the evils lurking in a teaspoon of salt!
(Drawings courtesy of Butcher Billy)

Yesterday, we found our Sodium-Slashing Superheroes Larry and Linda determined to make sure that no American endangered his/her health by ingesting more than ¾ of a teaspoon of salt a day. But recently, an Institute of Medicine report determined that recommendations to reduce sodium intake to such low levels provided no health benefits and could be detrimental to the health of some people. [In case you missed it and your job is really boring, you can read Part 1 of the NaCl Debacle here.]

Our story picks up as the 2010 USDA/HHS Dietary Guidelines Advisory Committee, fearlessly led by Linda and Larry, arrives at the foregone conclusion that most, if not all, US adults would (somehow) benefit from reducing their sodium intake to 1500 mg/day.  The American Heart Association, in a report written by—surprise!—Larry and Linda, goes on to state that “The health benefits [of reducing sodium intake to 1500 mg/day] apply to Americans in all groups, and there is no compelling evidence to exempt special populations from this public health recommendation.”

Does that mean there is “compelling evidence” to include special populations, or for that matter ordinary populations, in this 1500 mg/day recommendation? No, but who cares?

Does that mean there is science to prove that “excess” sodium intake (i.e. more than ¾ of a teaspoon of salt a day) leads to high blood pressure and thus cardiovascular disease, or that salt makes you fat, or that sodium consumption will eventually lead to the zombie apocalypse? No, no, and no—but who cares?

Larry and Linda KNOW that salt is BAD. Science? They don’t need no stinkin’ science.

Because the one thing everyone seems to be able to agree on is that the science on salt does indeed stink. The IOM report has had to use many of the same methodologically-flawed studies available to the 2010 Dietary Guidelines Advisory Committee, full of the same confounding, measurement error, reverse causation and lame-ass dietary assessment that we know and love about all nutrition epidemiology studies.  But the 2010 Dietary Guidelines Advisory Committee didn’t actually bother to look at these studies.

Why not?  (And let me remind you that the Dietary Guidelines folks usually <heart> methodologically-flawed study designs, full of confounding, measurement error, reverse causation and lame-ass dietary assessment.)

First, a little lesson in how the USDA/HHS folks create dietary guidance meant to improve the health and well-being of the American people:

  1. Take a clinical marker, whose health implications are unclear, but whose levels we can measure cheaply and easily (like blood pressure, cholesterol, weight).
  2. Suggest that this marker—like Carnac the Magnificent—can somehow predict risk of a chronic disease whose origins are multiple and murky (like obesity, heart disease, cancer).
  3. Use this suggestion to establish some arbitrary clinical cut offs for when this marker is “good” and “bad.” (Note to public health advocacy organizations: Be sure to frequently move those goalposts in whichever direction requires more pharmaceuticals to be purchased from the companies that sponsor you.)
  4. Find some dietary factor that can easily and profitably be removed from our food supply, but whose intake is difficult to track (like saturated fat, sodium, calories).
  5. Implicate the chosen food factor in the regulation of the arbitrary marker, the details of which we don’t quite understand. (How? Use observational data—see methodological flaws above—but hunches and wild guesses will also work.)
  6. Create policy that insists that the entire population—including people who, by the way, are not (at least at this point) fat, sick or dead—attempt to prevent this chronic disease by avoiding this particular dietary factor. (Note to public health advocacy organizations: Be sure to offer food manufacturers the opportunity to have the food products from which they have removed the offensive component labeled with a special logo from your organization—for a “small administrative fee,” of course.)
  7. Commence collecting weak, inconclusive, and inconsistent data to prove that yes indeedy this dietary factor we can’t accurately measure does in fact have some relationship to this arbitrary clinical marker, whose regulation and health implications we don’t fully understand.
  8. Finally—here’s the kicker—measure the success of your intervention by whether or not people are willing to eat expensive, tasteless, chemical-filled food devoid of the chosen food factor in order to attempt to regulate the arbitrary clinical marker.
  9. Whatever you do, DO NOT EVER measure the success of your intervention by looking at whether or not attempts to follow your intervention have made people fat, sick, or dead in the process.
  10. Ooops. I think I just described the entire history of nutrition epidemiology of chronic disease.

Blood pressure is easy to measure, but we don’t always know what causes it to go up (or down). There is no real physiological difference between having a blood pressure reading of 120/80, which will get you a diagnosis of “pre-hypertension” and a fistful of prescriptions, and a reading of 119/79, which won’t.  Blood pressure is not considered to be a “distinct underlying cause of death,” which means that, technically, no one ever dies of blood pressure (high or low). We certainly don’t know how to disentangle the effects of lowering dietary sodium on blood pressure from other effects (like weight loss) that may be related to dietary changes that are a part of an attempt to lower sodium (and we have an embarrassingly hard time collecting accurate dietary intake information from Food Fantasy Questionnaires anyway). We also know that individual response to sodium varies widely.
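To see just how arbitrary that cutoff is, here is a toy classifier using the pre-hypertension floor of 120/80 (the JNC7-era convention). This is a simplified illustration, not clinical code.

```python
# Toy illustration of the point above: the "pre-hypertension" label flips on
# a one-point difference in a reading. Cutoffs follow the JNC7-era convention
# (120 systolic / 80 diastolic as the pre-hypertension floor); simplified.

def classify(systolic, diastolic):
    if systolic >= 120 or diastolic >= 80:
        return "pre-hypertension (or worse)"
    return "normal"

print(classify(120, 80))  # one reading gets you a diagnosis
print(classify(119, 79))  # a physiologically indistinguishable reading does not
```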

So doesn’t it make perfect sense that the folks at the USDA/HHS should ignore science that investigates the relationship between sodium intake and whether or not a person stayed out of the hospital, had a heart attack, or up and died? Well, it doesn’t to me, but nevertheless the USDA/HHS has remained obsessively fixated on one thing and one thing only: what effects reducing sodium has on blood pressure. They pay not one whit of attention to what effects reducing sodium has on, say, aliveness.

So let’s just get this out there and agree to agree: reducing sodium in most cases will reduce blood pressure.  But then, just to be clear, so will dismemberment, dysentery, and death.  We can’t just assume that lowering sodium will only affect blood pressure or will only positively affect health (I mean, we can’t unless we are Larry or Linda). Recent research, which prompted the IOM review, indicates that reducing sodium will also increase triglyceride levels, insulin resistance, and sympathetic nervous system activity. For the record, clinicians generally don’t consider these to be good things.

This may sound radical but in their review of the evidence, the IOM committee decided to do a few things differently.

First, they gave more weight to studies that determined sodium intake levels through multiple high-quality 24-hour urine collections. Remember, this is Low-Sodium Larry’s favorite way of estimating intake.

Also, they did not approach the data with a predetermined “healthy” range already established in their brains. Because of the extreme variability in intake levels among population groups, they decided to—this is crazy, I know—let the outcomes speak for themselves.

Finally, and most importantly, in the new IOM report, the authors, unlike Larry and Linda, focused on—hold on to your hats, folks!—actual health outcomes, something the Dietary Guidelines Have. Never. Done. Ever.

The IOM committee found, in a nutshell:

“that evidence from studies on direct health outcomes is inconsistent and insufficient to conclude that lowering sodium intakes below 2,300 mg per day either increases or decreases risk of CVD outcomes (including stroke and CVD mortality) or all-cause mortality in the general U.S. population.”

In other words, there is no science to indicate that we all need to be consuming less than ¾ of a teaspoon of salt a day. Furthermore, while there may be some subpopulations that may benefit from sodium reduction, reducing sodium intake to 1500 mg/day may increase risk of adverse health outcomes for people with congestive heart failure, diabetes, chronic kidney disease, or heart disease. (If you’d like to wallow in some of the studies reviewed by the IOM, I’ve provided the Reader’s Digest Condensed Version at the bottom of the page.)

Of course, the American Heart Association, eager to provide the public with the most up-to-date recommendations about heart health as long as they don’t contradict outdated recommendations of which the AHA is fond, responded to the IOM report by saying, “The American Heart Association is not changing its position. The association rejects the Institute of Medicine’s conclusions because the studies on which they were based had methodological flaws.”

Um, hello AHA? Exactly what completely non-existent, massive, highly-controlled and yet highly-generalizable randomized controlled trials about sodium intake and health effects were you planning on using to make your case? I believe it was the AHA that mentioned that “It is well-known, however, that such trials are not feasible because of logistic, financial, and often ethical considerations.” Besides, I don’t know what the AHA is whining about. The quality of the science hardly matters if you are not going to pay any attention to it in the first place.

No, folks, that giant smacking sound you hear is not my head on my keyboard. That was the sound of science crashing into a giant wall of Consistent Public Health Message. Apparently, those public health advocates at the AHA seem to think that changing public health messages—even when they are wrong—confuses widdle ol’ Americans. The AHA—and the USDA/HHS team—doesn’t want us to have to worry our pretty little heads about all that crazy scientifical stuff with big scary words and no funny pictures or halftime shows.

Frankly, I appreciate that. I hate to have my pretty little head worried. But there’s one other problem with this particular Consistent Public Health Message. Not only is there no science to back it up; not only is it likely to be downright detrimental to the health of certain groups of people; not only is it likely to introduce an arsenal of synthetic chemical salt-replacements that will be consumed at unprecedented levels without testing for negative interactions or toxicities (remember how well that worked out when we replaced saturated fat with partially-hydrogenated vegetable oils?)—it is, apparently, incompatible with eating food.

Researchers set out to find what would really happen if Americans were muddle-headed and sheep-like enough to actually try to reduce their sodium intake to 1500 mg/day. They discovered that, “the 2010 Dietary Guidelines for sodium were incompatible with potassium guidelines and with nutritionally adequate diets, even after reducing the sodium content of all US foods by 10%.” Way to go, Guidelines.

While these researchers suggested that a feasibility study (this is a scientifical term for “reality check”) should precede the issuing of dietary guidelines to the public, I have a different suggestion.

How about we just stop with the whole 30-year-long dietary experiment to prevent chronic disease by telling Americans what not to eat? I hate to be the one to point this out, but it doesn’t seem to be working out all that well.  It’s hard to keep assuming that the AHA and the USDA/HHS mean well when, if you look at it for what it is, they are willing to continue to jeopardize the health of Americans just so they don’t have to admit that they might have been wrong about a few things.  I suppose if a Consistent Public Health Message means anything, it means never having to say you’re sorry for 30 years-worth of lousy dietary advice.

Marion Nestle has noted that, up until now, “every single committee that has dealt with this question [of sodium-reduction] says, ‘We really need to lower the sodium in the food supply.’ Now either every single committee that has ever dealt with this issue is delusional, which I find hard to believe—I mean they can’t all be making this up—[or] there must be a clinical or rational basis for the unanimity of these decisions.”

Weeeell, I got some bad news for you, Marion. Believe it. They have been delusional. They are making this up. And no, apparently there is no clinical or rational basis for the unanimity of these decisions.

But, thanks to the IOM report, perhaps we can no longer consider these decisions to be unanimous.

Praise the lard and pass the salt.

Read ‘em and weep: The Reader’s Digest Condensed Version of the science from the IOM report. Studies marked with an asterisk (*) are studies that were available to the 2010 Dietary Guidelines Advisory Committee.

Studies that looked at Cardiovascular Disease, Stroke, and Mortality

*Cohen et al. (2006)

When intakes of sodium less than 2300 mg per day were compared to intakes greater than 2300 mg per day, the “lower sodium intake was statistically significantly associated with increased risk of all-cause mortality.”

*Cohen et al. (2008)

When a fully-adjusted (for confounders) model was used, “there was a statistically significant higher risk of CVD mortality with the lowest vs. the highest quartile of sodium intake.”

Gardener et al. (2012)

Risk of stroke was positively related to sodium intake when comparing the highest levels of intake to the lowest levels of intake. There was no statistically significant increase in risk for those consuming between 1500 and 4000 mg of sodium per day.

*Larsson et al. (2008)

“The analyses found no significant association between dietary sodium intake and risk of any stroke subtype.”

*Nagata et al. (2004)

“Among men, a 2.3-fold increased risk of stroke mortality was associated with the highest tertile of sodium intake.” That sounds bad, but the average sodium intake in the high-risk group was 6613 mg per day. The lowest risk group had an average intake of 4070 mg per day. “Thus, the average sodium intake in the US would be within the lowest tertile of this study.”

Stolarz-Skrzypek et al. (2011)

“Overall, the authors found that lower sodium intake was associated with higher CVD mortality.”

Takachi et al. (2010)

The authors found “a significant positive association between sodium consumption at the highest compared to the lowest quintile and risk of stroke.” As with the Nagata (2004) study, this sounds bad, but the average sodium intake in the high-risk group was 6844 mg per day. The lowest risk group had an average intake of 3084 mg per day. “Thus, the average sodium intake in the US would be close to the lowest quintile of this study.”

*Umesawa et al. (2008)

“The authors found an association between greater dietary sodium intake and greater mortality from total stroke, ischemic stroke, and total CVD.” However, as with the Nagata and the Takachi studies (above), lower quintiles—in this case, quintiles one and two—would be comparable to average US intake.

Yang et al. (2011)

Higher usual sodium intake was found to be associated with all-cause mortality, but not cardiovascular disease mortality or ischemic heart disease mortality. “However, the finding that correction for regression dilution increased the effect on all-cause mortality, but not on CVD mortality, is inconsistent with the theoretical causal pathway.”  In other words, high sodium intake might be bad for health, but not because it raises blood pressure and leads to heart disease.

Studies in Populations 51 Years of Age or Older

*Geleijnse et al. (2007)

“This study found no significant difference between urinary sodium level and risk of CVD mortality or all-cause mortality.” Relative risk was lowest in the medium intake group, with an average estimated intake of 2,415 mg/day.


“Five of the nine reported studies in the general population listed above also analyzed the data on health outcomes by age and found no interaction (Cohen et al., 2006, 2008; Cook et al., 2007; Gardener et al., 2012; Yang et al., 2011).”

Studies in Populations with Chronic Kidney Disease

Dong et al. (2010)

“The authors found that the lowest sodium intake was associated with increased mortality risk.”

Heerspink et al. (2012)

“Results from this study suggest that ARBs were more effective at decreasing CKD progression and CVD when sodium intake was in the lowest tertile” which had an estimated average sodium intake of about 2783 mg/day.

Studies on Populations with Cardiovascular Disease

Costa et al. (2012)

“Dietary sodium intake was estimated from a 62-item validated FFQ. . . . Significant correlations were found between sodium intake and percentage of fat and calories in daily intake. . . . Overall, for the first 30 days and up to 4 years afterward, total mortality was significantly associated with high sodium intake.”

Kono et al. (2011)

“Cumulative risk analysis found that a salt intake of greater than the median of 4,000 mg of sodium was associated with higher stroke recurrence rate. Univariate analysis of lifestyle management also found that poor lifestyle, defined by both high salt intake and low physical activity, was significantly associated with stroke recurrence.”

O’Donnell et al. (2011)

“For the composite outcome, multivariate analysis found a U-shaped relationship between 24-hour urine sodium and the composite outcome of CVD death, MI, stroke, and hospitalization for CHF.” In other words, both higher (>7,000 mg per day estimated intake) and lower (<2,990 mg per day estimated intake) intakes of sodium were associated with increased risk of heart disease and mortality.

Studies on Populations with Prehypertension

*Cook et al. (2007)

In a randomized trial comparing a low sodium intervention with usual intake, lower sodium intake did not significantly decrease risk of mortality or heart disease events.

*Cook et al. (2009)

No significant increase in risk of adverse cardiovascular outcomes was associated with increased sodium excretion levels.


“Several other studies discussed in this chapter analyzed data on health outcomes by blood pressure and found no statistical interactions (Cohen et al., 2006, 2008; Gardener et al., 2012; O’Donnell et al., 2011; Yang et al., 2011).”

Studies on Populations with Diabetes

Ekinci et al. (2011)

Higher sodium intakes were associated with decreased risk of all-cause mortality and heart disease mortality.

Tikellis et al. (2013)

“Adjusted multivariate regression analysis found urinary sodium excretion was associated with incident CVD, with increased risk at both the highest [> 4,401 mg/day] and lowest [<2,346 mg/day] urine sodium excretion levels. When analyzed as independent outcomes, no significant associations were found between urinary sodium excretion and new CVD or stroke after adjustment for other risk factors.”


“Two other studies discussed in this chapter analyzed the data on health outcomes by diabetes prevalence and found no interaction (Cohen et al., 2006; O’Donnell et al., 2011).”

Studies in Populations with Congestive Heart Failure

Arcand et al. (2011)

High sodium intake levels (≥2,800 mg per day) were significantly associated with acute decompensated heart failure, all-cause hospitalization, and mortality.

Lennie et al. (2011)

“Results for event-free survival at a urinary sodium of ≥3,000 mg per day varied by the severity of patient symptoms.” In people with less severe symptoms, sodium intake greater than 3,000 mg per day was correlated with a lower disease incidence compared to those with a sodium intake less than 3,000 mg per day. Conversely, people with more severe symptoms who had a sodium intake greater than 3,000 mg per day had a higher disease incidence than those with sodium intakes less than 3,000 mg per day.

Parrinello et al. (2009)

“During the 12 months of follow-up, participants receiving the restricted sodium diet [1840 mg/day] had a greater number of hospital readmissions and higher mortality compared to those on the modestly restricted diet [2760 mg/day].”

*Paterna et al. (2008)

The lower sodium intake group [1840 mg/day] experienced a significantly higher number of hospital readmissions compared to the normal sodium intake group [2760 mg/day].

*Paterna et al. (2009)

A significant association was found between low sodium intake [1,840 mg per day] and hospital readmissions. The group with the normal sodium diet [2,760 mg/day] also had fewer deaths compared to all groups receiving a low-sodium diet combined.

The NaCl Debacle Part 1: Salt makes you fat?

Don’t look now, but I think the Institute of Medicine’s new report on sodium just bitch-slapped the USDA/HHS 2010 Dietary Guidelines.

In case you have a life outside of the nutritional recommendation roller derby, the IOM recently released a report that comes to the conclusion that restricting sodium intake to 1500 mg/day may increase rather than reduce health risks. Which is a little weird, since the 2010 Dietary Guidelines did a great job of insisting that any American with high blood pressure, all blacks, and every middle-aged and older adult—plus anyone who has ever eaten bacon or even thought about eating bacon, i.e. nearly everybody—should limit their salt intake to 1500 mg of sodium a day, or less than ¾ of a teaspoon of salt. The American Heart Association was, of course, aghast. The AHA thinks EVERYBODY should be limited to less than ¾ teaspoon of salt a day, including people who wouldn’t even think about thinking about bacon.
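Since the Guidelines lean so hard on that ¾-teaspoon figure, here’s a quick back-of-the-envelope sanity check. This sketch assumes roughly 6 g of salt per teaspoon (a common kitchen approximation, not a number from the IOM report) and the standard molar masses of sodium and chlorine:

```python
# Sanity check: how much table salt (NaCl) contains 1500 mg of sodium?
# Assumptions (mine, not the IOM's): ~6 g of salt per teaspoon;
# molar masses Na = 22.99 g/mol, Cl = 35.45 g/mol.
NA, CL = 22.99, 35.45
sodium_fraction = NA / (NA + CL)            # ~0.393: salt is ~39% sodium by weight

sodium_limit_mg = 1500
salt_mg = sodium_limit_mg / sodium_fraction # ~3813 mg of salt
teaspoons = salt_mg / 6000                  # ~0.64 teaspoon

print(f"{salt_mg:.0f} mg of salt ≈ {teaspoons:.2f} teaspoon")
```

So 1500 mg of sodium does indeed work out to a bit under ¾ of a teaspoon of salt, which at least means the Guidelines can do arithmetic, whatever we think of the rest.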

Why are the AHA and USDA/HHS so freaked out about salt?  And how did the IOM reach such a vastly different conclusion than that promoted by the AHA and the Dietary Guidelines?  Fasten your seat belts folks, it’s gonna be a bumpy blog.

First, it is helpful to examine why the folks at AHA and USDA/HHS are so down on salt.  The truth: we have no freakin’ idea. Salt has been around since what, the dawn of civilization maybe? It is an essential nutrient, and it plays an important role in preserving food and preventing microbial growth (especially on bacon). But Americans could still be getting too much of a good thing. Everybody at the AHA seems to think that Americans consume “excessive amounts” of sodium. (Of course, just about anything looks excessive compared to less than ¾ of a teaspoon.) But do we really consume too much sodium?

Back in 2010, Dr. Laurence I-Know-More-About-Sodium-Than-Your-Kidneys-Do Appel (or as his friends call him, “Low-Sodium Larry”), one of the leading advocates for a salt-free universe, acknowledged that “The data is quite murky. We just don’t have great data on sodium trends over time. I wish that we did. But I can’t tell you if there’s been an increase or decrease.”

Well, Low-Sodium Larry, I can, and I am about to make your wish come true.

According to recent research done by that wild bunch of scientific renegades at Harvard, in the past 60 years sodium intake levels have . . . drumroll, please . . . not done much of anything.

Hey, that doesn’t sound right! Everyone knows that it is virtually impossible to get an accurate measure of sodium intake from dietary questionnaires; people are probably just “under-reporting” their salt intake like they “under-report” everything else. Low-Sodium Larry has previously insisted that one of the reasons the data is so murky is that few epidemiological studies measure sodium intake accurately and that, “really, you should do 24-hour urinary sodium excretions to do it right.”

The guys at Harvard looked at studies that did it right. This systematic analysis of 38 studies from the 1950s to the present found that 24-hour urinary sodium excretion (the “gold” standard—omg, I could not resist that—of dietary sodium intake estimation) has neither increased nor decreased, but has remained essentially stable over time. Despite the fact that Americans are apparently hoovering up salt like Kim Kardashian hoovers up French fries—and with much the same results, i.e. puffing up like a Macy’s Thanksgiving Day balloon—for whatever reason we simply aren’t excreting more of it in our urine.

According to that same study however, despite the lack of increase in sodium excretion (which is supposed to accurately reflect intake—but that can’t be right), high blood pressure rates in the population have been increasing. Duh. Everyone knows that eating lots of salt makes your blood pressure go up. But have the rates of high blood pressure in America really been going up?

Age-Adjusted Prevalence of Hypertension (2009 NIH Chart Book)

Well, no.  Not really. The Harvard dudes cite a report that goes back to 1988-1994 data, and yes, rates of high blood pressure have been creeping slowly back up since then. This is because from 1976-1980 to 1988-1994, rates of high blood pressure plummeted for most segments of the American population.

We don’t know why rates of high blood pressure fell during the 70s and early 80s. It may have been that the Dietary Guidelines told people to eat more potassium-filled veggies and people actually tried to follow the Dietary Guidelines, which would have had a positive effect on high blood pressure. On the other hand, it could have been largely due to the sedating influence of the soft rock music of that era blanketing the airwaves with the mellow tones of England Dan and John Ford Coley, Christopher Cross, Ambrosia, and the like (youtube it, you young whippersnappers out there). We also don’t know why rates are going back up. Rising rates of obesity may be part of the problem, but it is also entirely possible that piping the Monsters of Lite Rock through every PA system in the country might save our health care system a lot of time and trouble.

This is what we (think we) know:

  • High-sodium diets might possibly maybe sometimes be a contributor to high blood pressure.
  • Rates of high blood pressure are going (back) up.
  • Obesity rates are definitely going up.

Ergo pro facto summa cum laude, it is clear—using the logic that seems to undergird the vast majority of our public health nutrition recommendations—salt makes you fat. The USDA/HHS has been faced with rapidly rising rates of obesity which, until now, they have only been able to pin on the laziness and gluttony of Americans. But if salt makes us fat, that might explain why the USDA/HHS doesn’t want us to eat it.

After all, the biomechanics of this is pretty straightforward. If you eat too much sodium (which we must be), but you don’t pee it out (which we aren’t), you must be retaining it and this is what makes your blood pressure and your weight both go way up. They didn’t really cover the physics of this in my biochemistry classes so you’ll have to ask Dr. Appel how this works because he knows more about sodium than your kidneys do. But I think it must be true. After all, this is the mechanism that explains the weight loss behind carbohydrate-reduced diets, right? I myself reduced my carb intake and lost 60 pounds of water weight!

And besides, taking the salt out of our food will give food manufacturers the opportunity to make food more expensive and tasteless while adding synthetic ingredients whose long-term effects are unknown—just what the American consumer wants!

For a while there, we thought the whole idea was to reduce sodium in order to reduce blood pressure in order to reduce diseases of the circulatory system, like heart failure, stroke, and coronary heart disease. That didn’t seem to work out so well, because the whole time that sodium intake was staying stable (if we want to believe the urinary sodium excretion data) and high blood pressure rates were going down (although they are starting to go back up), rates of those diseases have gone up:

Age-Adjusted Prevalence of Heart Failure (2009 NIH Chart Book)

Age-Adjusted Prevalence of Stroke (2009 NIH Chart Book)

Age-Adjusted Prevalence of Coronary Heart Disease (2007 NIH Chart Book)

So if reducing blood pressure to reduce cardiovascular disease isn’t the answer, then we must need to reduce blood pressure to reduce obesity! By jove, I think we’ve got it!

The USDA/HHS must have known the “salt makes you fat” notion would be a tough sell, I mean, what with the lack of any shred of supporting science and all that. (But then, the “salt causes high blood pressure which causes cardiovascular disease” argument hasn’t exactly been overburdened by evidence either, and that never seemed to stop anyone.) So the 2010 Dietary Guidelines brought together the American Heart Association’s Superheroes of Sodium Slashing, Low-Sodium Larry and his bodacious salt-subduing sidekick, Linda Van Horn, both of whom had been preaching the gospel of sodium-reduction as a preventive health measure with little conclusive evidence to support their recommendations.  The USDA/HHS knew that with Linda and Larry on the team, it didn’t matter how lame the science, how limited the data, or how ludicrous the recommendation, these two could be counted on to review any and all available evidence and reliably come up with the exact same concrete and well-proven assumptions they’d been coming up with for years.

The Sodium-Slashing Superheroes–Drs. Lawrence Appel and Linda Van Horn– ready to make the world safe for bland, unappetizing food everywhere! (Drawings courtesy of Butcher Billy)

So here’s the cliffhanger:  Will Linda and Larry be able to torture the science on salt into confessing its true role in the obesity crisis?

Tune in tomorrow, when you’ll hear Linda and Larry say: “Science? We don’t need no stinkin’ science.”

Not Just Science: How nutrition got stuck in the past

Nostalgia for a misremembered past is no basis for governing a diverse and advancing nation.

David Frum

The truth is that I get most of my political insight from Mad Magazine; they offer the most balanced commentary by far. However, I’ve been very interested in the fallout from the recent election, much more so than I was in the election itself; it’s like watching a Britney Spears meltdown, only with power ties. I kept hearing the phrase “epistemic closure” and finally had to look it up. Now, whether or not the Republican party suffers from it, I don’t care (and won’t bother arguing about), but it undeniably describes the current state of nutrition. “Epistemic closure” refers to a type of close-mindedness that precludes any questioning of the prevailing dogma to the extent that the experts, leaders, and pundits of a particular paradigm:

“become worryingly untethered from reality”

“develop a distorted sense of priorities”

and are “voluntarily putting themselves in the same cocoon”

Forget about the Republicans. Does this not perfectly describe the public health leaders that are still clinging blindly to the past 35 years of nutritional policy?  The folks at USDA/HHS live in their own little bubble, listening only to their own experts, pretending that the world they live in now can be returned to an imaginary 1970s America, where children frolicked outside after downing a hearty breakfast of sugarless oat cereal and grown-ups walked to their physically-demanding jobs toting homemade lunches of hearty rye bread and shiny red apples.

Remember when all the families in America got their exercise playing outside together—including mom, dad, and the maid? Yeah, me neither.

So let me rephrase David Frum’s quote above for my own purposes: Nostalgia for a misremembered past is no basis for feeding a diverse and advancing nation.

If you listen to USDA/HHS, our current dietary recommendations are a culmination of science built over the past 35 years on the solid foundation of scientific certainty translated into public health policy. But this misremembered scientific certainty wasn’t there then and it isn’t here now; the early supporters of the Guidelines were very aware that they had not convinced the scientific community that they had a preponderance of evidence behind them [1]. Enter the first bit of mommy-state* government overreach. When George McGovern’s (D) Senate Select Committee came up with the 1977 Dietary Goals for Americans, it was a well-meaning approach to not only reduce chronic disease, a clear public health concern, but to return us all to a more “natural” way of eating. This last bit of ideology reflected a secular trend manifested in the form of the Dean Ornish-friendly Diet for a Small Planet, a vegetarian cookbook that smushed the humanitarian and environmental concerns of meat-eating in with some flimsy nutritional considerations, promising that a plant-based diet was the best way to feed the hungry, save the planet, safeguard your health, and usher in the Age of Aquarius.  This was a pop culture warm-fuzzy with which the “traditional emphasis on the biochemistry of disease” could not compete [2].

If you listen to some folks, the goofy low-fat, high-carb, calories in-calories out approach can be blamed entirely on this attempt of the Democrats to institutionalize food morality. But, let’s not forget that the stage for the Dietary Guidelines fiasco was set earlier by Secretary of Agriculture Earl Butz, an economist with many ties to large agricultural corporations who was appointed by a Republican president. He initiated the “fencerow to fencerow” policies that would start the shift of farm animals from pastureland to feed lots, increasing the efficiency of food production because what corn didn’t go into cows could go into humans, including the oils that were a by-product of turning crops into animal feed.

When Giant Agribusiness—they’re not stupid, y’know—figured out that industrialized agriculture had just gotten fairydusted with tree-hugging liberalism in the form of the USDA Guidelines, they must have been wetting their collective panties. The oil-refining process became an end in itself for the food industry, supported by the notion that polyunsaturated fats from plants were better for you than saturated fats from animals, even though evidence for this began to appear only after the Guidelines were already created and only through the status quo-confirming channels of nutrition epidemiology, a field anchored solidly in the crimson halls of Harvard by Walter Willett himself.

Between Earl Butz and McGovern’s “barefoot boys of nutrition,” somehow corn oil from refineries like this became more “natural” than the fat that comes, well, naturally, from animals.

And here we are, 35 years later, trying to untie a Gordian knot of weak science and powerful industry cemented together by the mutual embarrassment of both political orientations. The entrenched liberal ivory-tower interests don’t want to look stupid by having to admit that the 3 decades of public health policy they created and have tried to enforce have failed miserably. The entrenched big-business-supporting conservative interests don’t want to look stupid by having to admit that Giant Agribusiness, whose welfare they protect, is now driving up government spending on healthcare by acting like the cigarette industry did in the past and for much the same reasons.

These overlapping/competing agendas have created the schizophrenic, conjoined twins of a food industry-vegetarian coalition, draped together in the authority of government policy. Here the vegans (who generally seem to be politically liberal rather than conservative, although I’m sure there are exceptions) play the part of a vocal minority of food fundamentalists whose ideology brooks no compromise. (I will defend eternally the right for a vegan–or any fundamentalist–to choose his/her own way of life; I draw the line at having it imposed on anyone else–and I squirm a great deal if someone asks me if that includes children.) The extent to which vegan ideology and USDA/HHS ideology overlap has got to be a strange bedfellow moment for each, but there’s no doubt that the USDA/HHS’s endorsement of vegan diets is a coup for both. USDA/HHS earns a politically-correct gold star for their true constituents in the academic-scientific-industrial complex, and vegans get the nutritional stamp of approval for a way of eating that, until recently, was considered by nutritionists to be inadequate, especially for children.

Like this chicken, the USDA/HHS loves vegans—at least enough to endorse vegan diets as a “healthy eating pattern.”

But if the current alternative nutrition movement is allegedly representing the disenfranchised eaters all over America who have been left out of this bizarre coalition, let us remember that, in many ways, the “alternative” is really just more of the same. The McGovern hippies gave us “eat more grains and cereals, less meat and fat,” now the Republican/Libertarian-leaning low-carb/primaleo folks have the same idea only the other way around—and with the same justification.  “Eat more meat and fat, fewer grains and cereals;” it’s a more “natural” way to eat.

As counterparts to the fundamentalist vegans, we have the Occupy Wall Street folks of the alternative nutrition community—raw meaters who sleep on the floor of their caves and squat over their compost toilets after chi running in their Vibrams. They’re adorably sincere, if a little grubby, and they have no clue how badly all the notions they cherish would get beaten in a fight with the reality of middle-Americans trying to make it to the PTA meeting.

How paleo might look from the outside.

To paraphrase David Frum again, the way forward in food-health reform is collaborative work, and although we all have our own dietary beliefs, food preferences, and lifestyle idiosyncrasies, the immediate need is for a plan with just this one goal: we must emancipate ourselves from prior mistakes and adapt to contemporary realities.

Because the world in which we live is not the Brady Bunch world that many of us in nutrition seem to think it is.

Frum makes the point that in 1980, when the Dietary Guidelines were first officially issued from the USDA, this was still an overwhelmingly white country. “Today, a majority of the population under age 18 traces its origins to Latin America, Africa, or Asia. Back then, America remained a relatively young country, with a median age of exactly 30 years. Today, over-80 is the fastest-growing age cohort, and the median age has surpassed 37.” Yet our nutrition recommendations have not changed from those originally created on a weak science base of studies done on middle-aged white people. To this day, we continue to make nutrition policy decisions on outcomes found in databases that are 97% white. The food-health needs of our country are far more diverse now, culturally and biologically. And another top-down, one-size-fits-all approach from the alternative nutrition community won’t address that issue any more adequately than the current USDA/HHS one.

For those who think the answer is to “just eat real food,” here’s another reality check: “In 1980, young women had only just recently entered the workforce in large numbers. Today, our leading labor-market worry is the number of young men who are exiting.” That means that unless these guys are exiting the workforce to go home and cook dinner, the idea that the solution to our obesity crisis lies in someone in each American household willingly taking up the mind-numbingly repetitive and eternally thankless chore of putting “real food” on the table for the folks at home 1 or more times a day for years on end—well, it’s as much a fantasy as Karl Rove’s Ohio outcome.

David Frum points out that “In 1980, our top environmental concerns involved risks to the health of individual human beings. Today, after 30 years of progress toward cleaner air and water, we must now worry about the health of the whole planetary climate system.” Today, our people and our environment are both sicker than ever. We can point our fingers at meat-eaters, but saying we now grow industrialized crops in order to feed them to livestock is like saying we drill for oil to make Vaseline. The fact that we can use the byproducts of oil extraction to make other things—like Vaseline or livestock feed—is a happy value-added efficiency in the system, no longer its raison d’être. Concentrated vertical integration has undermined the once-proud tradition of land stewardship in farming. Giving this power back to farmers means taking some power away from Giant Agribusiness, and neither party has the political will to do that, especially when together they can demonize livestock-eating while promoting corn oil refineries.

If we all just stopped eating meat, then we wouldn’t have to plant so much corn, right? Right?

And it’s not just our food system that has changed: “In 1980, 79 percent of Americans under age 65 were covered by employer-provided health-insurance plans, a level that had held constant since the mid-1960s. Back then, health-care costs accounted for only about one 10th of the federal budget. Since 1980, private health coverage has shriveled, leaving some 45 million people uninsured. Health care now consumes one quarter of all federal dollars, rapidly rising toward one third—and that’s without considering the costs of Obamacare.”  That the plant-based diet that was institutionalized by liberal forces and industrialized by conservative ones is a primary part of this enormous rise in healthcare costs is something no one on either side of the table wants to examine. Diabetes—the symptoms of which are fairly easily reversed by a diet that excludes most industrialized food products and focuses on meat, eggs, and veggies—is the nightmare in the closet of both political ideologies.

David Frum quotes the warning from  British conservative, the Marquess of Salisbury, “The commonest error in politics is sticking to the carcass of dead policies.”

Right now, it is in the best interest of both parties to stick to our dead nutrition policies and dump the ultimate blame on the individuals (we gave you sidewalks and vegetable stands–and you’re still fat! cry the Democrats; we let the food industry have free rein so you could make your own food choices–and you’re still fat! cry the Republicans). It’s a powerful coalition, resistant to change no matter who is in control of the White House or Congress.

What can be done about it, if anything? To paraphrase Frum once again, a 21st century food-health system must be economically inclusive, environmentally responsible, culturally modern, and intellectually credible.

We can start the process by stopping with the finger-pointing and blame game, shedding our collective delusions about the past and the present, and recognizing the multiplicity of concerns that must be addressed in our current reality. Let’s begin by acknowledging that—for the most part—the people in the spotlight on either side of the nutrition debate don’t represent the folks most affected by federal food-health policies. It is our job as leaders, in any party and for any nutritional paradigm, to represent those folks first, before our own interests, funding streams, pet theories, or personal ideologies. If we don’t, each group—from the vegetarians to the folks at Harvard to the primaleos—runs the risk of suffering from its own embarrassing form of epistemic closure.

Let’s quit bickering and get to work.


*This was too brilliant to leave buried in the comments section:

“Don’t you remember the phrase “wait til your father gets home”? You want to know what the state is? It’s Big Daddy. Doesn’t give a damn about the day to day scut, just swoops in to rescue when things get out of hand and then takes all the credit when the kids turn out well, whether it’s deserved or not. Equates spending money with parenting, too.”–from Dana

Henceforth, all my “mommy-state” notions are hereby replaced with “Big Daddy,” a more accurate and appropriate metaphor. And I never metaphor I didn’t like.


1. See Select Committee on Nutrition and Human Needs of the United States Senate. Dietary Goals for the United States. 2nd ed. Washington, DC: US Government Printing Office; 1977. Dr. Mark Hegsted, Professor of Nutrition at Harvard School of Public Health and an early supporter of the 1977 Goals, acknowledged their lack of scientific support at the press conference announcing their release: “There will undoubtedly be many people who will say we have not proven our point; we have not demonstrated that the dietary modifications we recommend will yield the dividends expected . . . ”

2. Broad WJ. Jump in Funding Feeds Research on Nutrition. Science, New Series, Vol. 204, No. 4397 (June 8, 1979), pp. 1060-1061, 1063-1064. In a series of articles in Science in 1979, William Broad details the political drama that allowed the “barefoot boys of nutrition” from McGovern’s committee to put nutrition in the hands of the USDA.

Why Race Doesn’t Matter in Nutrition Policy

This is the first of a series looking at what does and doesn’t matter when it comes to nutrition policy. When I started out on this adventure, I thought that science would give me the answers to the questions I had about why public health and clinical recommendations for nutrition were so limited. Silly me. The science part is easy. But policy, politics, economics, industry, media framing, the scientific bureaucracy, cultural bias—now that stuff is crazy complicated. It’s like an onion: when you start peeling back the layers, you just want to cry. I am also honored to say that this post is part of the Diversity in Science Carnival on Latino / Hispanic Health: Science and Advocacy

When we began investigating relationships between diet and chronic disease, we didn’t pay much attention to race. The longest-running study of the relationship between dietary factors and chronic disease is the Framingham Heart Study, a study made up entirely of white, middle-class participants. Since 1951, the Framingham study has generated more than 2,000 journal articles and retains a central place in the creation of public health nutrition policy recommendations for all Americans.

More recent datasets—especially the large ones—are nearly as demographically skewed.

The Nurses’ Health Study is 97% Caucasian and consists of 122,000 married registered nurses who were between the ages of 30 and 55 when the study began in 1976. An additional 116,686 nurses ages 25-42 were added in 1989, but the racial demographics remained unchanged.

The Health Professionals’ Follow-up Study began in 1986, as a complementary dataset to the Nurses’ Health Study. It is 97% Caucasian and consists, as the name suggests, of 51,529 men who were health professionals, aged 40-75, when the study began.

The Physicians’ Health Study began in 1982, with 29,071 men between the ages of 40 and 84. The second phase started in 1997, adding men who were then over 50. Of participants whose race is indicated, 91% are Caucasian, 4.5% are Asian/Pacific Islander, 2% are Hispanic, and less than 1% are African-American or American Indian. I have detailed information about the racial subgroups of this dataset because I had to write the folks at Harvard and ask for them. Race was of such little interest that the racial composition of the participants is never mentioned in the articles generated from this dataset.

Over the years, these three mostly-white datasets have generated more journal articles than five of the more diverse datasets all put together.* These three datasets, all administered by Harvard, have been used to generate some of the more sensationalist nutrition headlines of the past few years–red meat kills, for instance–with virtually no discussion of the fact that the findings apply to a population–mostly white, middle to upper middle class, well-educated, health professionals, most of whom were born before the atomic bomb–to which most of us do not belong.

[Figure: shift in US demographics in the past 50 years; predicted shift in the next 50 years]

Although we did begin to realize that race and other characteristics might actually matter with regard to health (hence the existence of datasets with more diversity), we can’t really fault those early researchers for creating such lopsided datasets. At that point, not only was the US more white than it is now, but the scientific advances that would reveal more about how our genetic background might affect health had not yet been developed. We had not yet mapped the human genome; epigenetics (the study of the interaction between environmental inputs and the expression of genetic traits) was in its infancy, and biochemical individuality was little more than a glimmer in Roger Williams’ eye.

Socially, culturally, and I think, scientifically, we were all inclined to want to think that everyone was created equal, and this “equality” extended to how our health would be affected by food. Stephen Jay Gould’s 1981 book, The Mismeasure of Man, critiqued the notion that “the social and economic differences between human groups—primarily races, classes, and sexes—arise from inherited, inborn distinctions and that society, in this sense, is an accurate reflection of biology.” In the aftermath of the civil rights movement, with its embarrassingly racist behavior on the part of some representatives of the majority race and the heartbreaking violence over differences in something as superficial as skin color, it was patently unhip to suggest that racial differences were anything more than just skin deep.

But does that position still serve us now?

In the past 35 years, our population has become more diverse and nutrition science has become more nuanced—but our national nutrition recommendations have stayed exactly the same. The first government-endorsed dietary recommendations to prevent chronic disease were given to the US public in 1977. These Dietary Goals for Americans told us to reduce our intake of dietary saturated fat and cholesterol and increase our intake of dietary carbohydrates, especially grains and cereals, in order to prevent obesity, diabetes, heart disease, cancer, and stroke.

Since 1980, the decreases in hypertension and serum cholesterol—health biomarkers—have been linked to Guidelines-directed dietary changes in the US population [1, 2, 3, 4].

“Age-adjusted mean Heart Disease Prevention Eating Index scores increased in both sexes during the past 2 decades, particularly driven by improvements in total grain, whole grain, total fat, saturated fatty acids, trans-fatty acids, and cholesterol intake.” [1]

However, with regard to the actual chronic diseases that the Dietary Guidelines were specifically created to prevent, they have been a resounding failure. If public health officials are going to attribute victory on some fronts to Americans adopting dietary changes in line with the Guidelines, I’m not sure how to avoid the conclusion that those same changes also played a part in the dramatic increases in obesity, diabetes, stroke, and congestive heart failure.

If the Dietary Guidelines are a failure, why have policy makers failed to change them?

It is not as if there is an overwhelming body of scientific evidence supporting the recommendations in the Guidelines. Their weak scientific underpinnings made the 1977 Dietary Goals controversial from the start. The American Society for Clinical Nutrition issued a report in 1979 that found little conclusive evidence for linking the consumption of fat, saturated fat, and cholesterol to heart disease and found potential risks in recommending a diet high in polyunsaturated fats [5]. Other experts warned of the possibility of far-reaching and unanticipated consequences that might arise from basing a one-size-fits-all dietary prescription on such preliminary and inconclusive data: “The evidence for assuming that benefits to be derived from the adoption of such universal dietary goals . . . is not conclusive and there is potential for harmful effects from a radical long-term dietary change as would occur through adoption of the proposed national goals” [6]. Are the alarming increases in obesity and diabetes examples of the “harmful effects” that were predicted? It does look that way. But at this point, at least one thing is clear: in the face of the deteriorating health of Americans and significant scientific evidence to the contrary, the USDA and HHS have continued to doggedly pursue a course of dietary recommendations that no reasonable assessment would determine to be effective.

But what does this have to do with race?

Maintaining the myth that a one-size diet approach works for everyone is fine if that one-size works for you—socially, financially, and in terms of health outcomes. The single positive health outcome associated with the Dietary Guidelines has been a decrease in heart attacks—but only for white people.

And if that one-size diet doesn’t fit in terms of health, if you end up with one of the other numerous adverse health effects that have increased in the past 35 years, if you’re a member of the mostly-white, well-educated, middle/upper-middle class demographic—you know, the one represented in the datasets that we continue to use as the backbone for our nutrition policy—you are likely to have the financial and social resources to eat differently from the Guideline recommendations should you choose to do so, to exercise as much as you need to, and to demand excellent healthcare if you get sick anyway. Even if you accept that these foods are Guidelines-recommended “healthy” foods, you are not stuck with the commodity crop-based processed foods for which our nutrition programs have become a convenient dumping ground.

In the meantime, low-income women, children, and minorities, and older adults with limited incomes—you know, the exact population not represented in those datasets—remain the primary recipients of federal nutrition programs. Black, Hispanic, and American Indian kids are more likely to qualify for free or reduced-price school lunches; non-white participants make up 68% of the Special Supplemental Nutrition Program for Women, Infants, and Children enrollment. These groups have far fewer social, financial, and dietary options. If the food they’re given doesn’t lead to good health—and there is evidence that it does not—what other choices do they have?

When it comes to health outcomes in minorities and low-income populations, the “healthier” you eat, the less likely you are to actually be healthy. Among low-income children, “healthy eaters” were more likely to be obese than “less-healthy eaters,” despite similar amounts of sedentary screen time. Among low-income adults, “healthy eaters” were more likely to have health insurance, watch less television, and not smoke. Yet the “healthy eaters” had the same rates of obesity as the “less-healthy eaters” and increased rates of diabetes, even after adjustment for age.

These associations don’t necessarily indicate a cause-effect relationship between healthy eating and health problems. But there are other indications that being a “healthy eater” according to US Dietary Guidelines does not result in good health. Despite adherence to “healthy eating patterns” as determined by the USDA Food Pyramid, African American children remain at higher risk for development of diabetes and prediabetic conditions, and African American adults gain weight at a faster pace than their Caucasian counterparts [7,8].

Adjusted 20-year mean weight change according to low or high Diet Quality Index (DQI) scores [8]

In this landmark study by Zamora et al, “healthy eaters” (with a high DQI) were compared to “less-healthy eaters” (with a low DQI). Everyone (ages 18-30 at baseline) gained weight over time; the slowest gainers—white participants who were “healthy eaters”—still gained a pound a year. More importantly, however, for blacks, being a “healthy eater” according to our current high-carbohydrate, low-fat recommendations actually resulted in more weight gain over time than being a “less healthy eater,” an outcome predicted by known differences in carbohydrate metabolism between blacks and whites [9].

Clearly, we need to expand our knowledge of how food and nutrients interact with different genetic backgrounds by specifically studying particular racial and ethnic subpopulations. Social equality does not negate small but significant differences in biology. But it won’t matter how much diversity we build into our study populations if the conclusions arrived at through science are discarded in favor of maintaining public health nutrition messages created when most human beings studied were of the adult, mostly white, mostly male variety.

Right now, the racial demographics of the participants in an experimental trial or an observational study dataset don’t matter, and the reason they don’t is that the science doesn’t matter. What really matters? Maintaining a consistent public health nutrition message—regardless of its effect on the health of the population—because consistency means never having to say you’re sorry for 35 years of failed nutritional guidance.

*ARIC – Atherosclerosis Risk In Communities (1987), 73% white; MESA – Multi-Ethnic Study of Atherosclerosis (2000), 38% white, 28% African American, 12% Chinese, 22% Hispanic; CARDIA – Coronary Artery Risk Development in Young Adults (1985), 50% black, 50% white; SHS – Strong Heart Study (1988), 100% Native American; BWHS – Black Women’s Health Study (1995), 100% black women.


1. Lee S, Harnack L, Jacobs DR Jr, Steffen LM, Luepker RV, Arnett DK. Trends in diet quality for coronary heart disease prevention between 1980-1982 and 2000-2002: The Minnesota Heart Survey. J Am Diet Assoc. 2007 Feb;107(2):213-22.

2. Hu FB, Stampfer MJ, Manson JE, Grodstein F, Colditz GA, Speizer FE, Willett WC. Trends in the incidence of coronary heart disease and changes in diet and lifestyle in women. N Engl J Med. 2000 Aug 24;343(8):530-7.

3. Fung TT, Chiuve SE, McCullough ML, Rexrode KM, Logroscino G, Hu FB. Adherence to a DASH-style diet and risk of coronary heart disease and stroke in women. Arch Intern Med. 2008 Apr 14;168(7):713-20. Erratum in: Arch Intern Med. 2008 Jun 23;168(12):1276.

4. Briefel RR, Johnson CL. Secular trends in dietary intake in the United States. Annu Rev Nutr. 2004;24:401-31.

5. Broad, WJ. NIH Deals Gingerly with Diet-Disease Link. Science, New Series, Vol. 204, No. 4398 (Jun. 15, 1979), pp. 1175-1178.

6. American Medical Association. Dietary goals for the United States: statement of The American Medical Association to the Select Committee on Nutrition and Human Needs, United States Senate. R I Med J. 1977 Dec;60(12):576-81.

7. Lindquist CH, Gower BA, Goran MI. Role of dietary factors in ethnic differences in early risk of cardiovascular disease and type 2 diabetes. Am J Clin Nutr. 2000 Mar;71(3):725-32.

8. Zamora D, Gordon-Larsen P, Jacobs DR Jr, Popkin BM. Diet quality and weight gain among black and white young adults: the Coronary Artery Risk Development in Young Adults (CARDIA) Study (1985-2005). American Journal of Clinical Nutrition. 2010 Oct;92(4):784-93.

9. Hite AH, Berkowitz VG, Berkowitz K. Low-carbohydrate diet review: shifting the paradigm. Nutr Clin Pract. 2011 Jun;26(3):300-8. Review.

Why Fat is Still a Feminist Issue

[circa 1975]

Sing along when the chorus rolls around (with apologies to Helen Reddy):

Yes I ate brown rice
And anything whole grain
Yes I’ve exercised
And look how much I’ve gained
If I have to, I won’t eat anything
I am fat
I am invisible

The United Nations declared 1975 to be International Women’s Year. Unfortunately, we haven’t really come a long way, baby, since then. Right now, I’m going to sidestep the whole media-generated body image issue, the glass labyrinth, the mommy wars, the “strong is the new sexy” idea (which somehow won out over my own personal favorite “smart is the new sexy” with campaign ads of slightly-unwashed-looking ladies without pedicures huddled over lab benches) and all the other complexities of contemporary feminist theory, and just focus on one little segment of how our national nutrition recommendations might have sucked the life out of women in general for the past 30-plus years.

We’ve been acting like the whole low-fat/low-glycemic/low-carb/paleo/whatever nutrition argument is a PubMed duel between scientists, and the fact that we are surrounded by lousy, nutrient-poor, cheap food is the fault of the Big Evil Food Industry. Let’s put our attention to the current health crisis in America where it really belongs: on short-sighted, premature, poorly-designed (albeit well-intentioned) public health recommendations that were legitimized with the 1977 Dietary Goals for Americans and institutionalized as US policy beginning with the 1980 Dietary Guidelines for Americans. Yes, fat is still a feminist issue. But I’m not talking about body fat.

The scientific underpinnings for these recommendations came primarily from studies done with white men. And although the science conducted on these white guys was generally inconclusive, the white guys in Washington—in an attempt to prevent what they saw as a looming health crisis in America—recommended that Americans consume a diet high in carbohydrates and low in fat. And although these premature recommendations have certainly not prevented any health crises in America (if anything, just the opposite; see: Public Health Nutrition’s Epic Fail), they’ve also had serious repercussions in other respects for the rest of us, i.e., those of us who are not white men. [Please don't take this as an "I hate white guys" thing; I love white guys. I gave birth to two of them.] I’m going to get into the “not white” part of the equation in another post (perhaps unimaginatively titled Why Nutrition Is a Racial Issue), but let me focus just on the “not men” part.

For those of us who are not men (and mostly not poor and not minority), the 1970’s brought us Charlie’s Angels and the Bionic Woman. Women were given the message that we should be able to do and have “it all” (whatever “it all” was). The expectation was that you could “bring home the bacon, fry it up in a pan” and be thin, gorgeous, and sexy (and white) while you did it.

[circa 1980]

Only now bacon (and eggs for that matter) was forbidden, and as the eighties evolved into the nineties, breakfast became Special K with skim milk, slopped into a bowl which we ate from while we drove the kids to school on our way to the job where we got paid less than the men with whom we worked. All the while, we were convinced that we could continue to fit into our tailored power suits by eating a diet that wasn’t designed with our health in mind.

[bacon eggs frowny face, circa 1984]

As with nearly every other aspect in the fight for equal opportunities and treatment, our health as women was based on a single shiny little myth: success would come to those who were willing to work hard, sacrifice, and follow the rules. Airbrushed media images of buns of steel and boobies of plastic sold a diet-exercise message based on an absurdly crude formula—”calories in, calories out”— with one simple rule that would guarantee success: “eat less and move more.”

So we did. We ate less and exercised more and got really tired and hungry and cranky—and when that didn’t really work, we bought treadmills and diet pills, Lean Cuisines and leg warmers. We got our health advice from Jane (“feel the burn”) Fonda and Marie (“I’m a little bit country”) Osmond. We flailed through three decades of frustration, culminating— unsurprisingly enough—in the self-flagellation of Spanx® and the aptly-named Insanity®.

[Jane Fonda circa 1982]

Some of us “failed” by eating more (low-fat, high-carb) food and getting fat, and some of us “succeeded” by developing full-blown eating disorders, and some of us fought the battle and won sometimes and lost other times and ended up with closets full of size 6 (“lingering illness”) to size 26 (“post pregnancy number 3″) clothes. All of us—no matter what the result—ended up spending a great deal of time, money, and energy trying to follow the rules to good health with the deck stacked against us. If we got fat, we blamed ourselves, and if we didn’t get fat it was because we turned our lives into micromanaged most-virtuous eater/exerciser contests. Either way, our lives were reduced, distracted, and endlessly unsatisfying.  We were hungry for more in so many ways and aching for rest in so many others, but our self-imposed denial and exhaustion allowed us to control, at least for a bit, the one thing we felt like we could control, that we’d fought to be able to control:  our bodies.

We stopped cooking and started counting. We stopped resting and playing and started exercising. We stopped seeing food as love and started seeing it as the enemy. We didn’t embrace these bodies that were finally, tenuously, ours; we fought them too.

Access to high quality nutrition has always been divided along gender lines [1]. There was a time–not that long ago–in our world when men, by virtue of their size, stature, and place as breadwinner (i.e. because of their “man-ness”), were entitled to a larger piece of meatloaf than their sisters (a practice that persists in many cultures still). How many of us (of a certain age) have heard, “Let your brother have the last piece of chicken, he’s a growing boy”? Now–conveniently–women would do their own restricting. Gloria Steinem, with a fair amount of prescience that seems to predict the epigenetic contributions of diet to obesity, noted in her 1980 essay The Politics of Food:*

“Millions of women on welfare eat a poor and starchy diet that can permanently damage the children they bear, yet their heavy bodies are supposed to signify indulgence.  Even well-to-do women buy the notion that males need more protein and more strength.  They grow heavy on sugar and weak on diets . . . Perhaps food is still the first sign of respect–or the lack of it–that we pay to each other and to our bodies.”

Dieting and exercising not only provided a massive distraction and timesuck for women; they also helped maintain a social order, one of weak and undernourished women, that the feminist movement otherwise threatened to undermine.

And when the scientists finally got around to testing the whole low-fat thing on (80% white) women? The verdict, published in 2006, looked like this:

The results, published in the Journal of the American Medical Association, showed no benefits for a low-fat diet. Women assigned to this eating strategy did not appear to gain protection against breast cancer [2], colorectal cancer [3], or cardiovascular disease [4]. And after eight years, their weights were generally the same as those of women following their usual diets [5].

But it was too late. We’d raised a generation of daughters who look at us and don’t want to be us, but they don’t know how to cook and they don’t know what to believe about nutrition and they are afraid of food. Some end up drinking the same Kool-Aid we did, except that—in the hubris of a youth that doesn’t contain hallucination-inducing sleep deprivation from babies and/or stress and/or a career on life-support, where diet and exercise and rest are, like Peter Frampton’s hair, a dim memory—they think they will succeed where we failed. Or maybe they’ve found the vegan-flavored or paleo-flavored Kool-Aid. But they are still counting and exercising and battling.

White women have been [irony alert] scientifically proven to follow the high-carb, low-fat dietary ideal set forth by the Dietary Guidelines more closely than any other demographic [6]. (Black guys—who may not be all that convinced that rules created by the US government are in their best interests, given some history lessons—are likely to have the lowest adherence.) White women apparently are really good at following rules that were not written with them in mind and which have not been shown to offer them any health benefits whatsoever (but which have proven immensely beneficial for the food and fitness—not to mention pharmaceutical—industries). The best little rule-followers of all are the dietitians of the Academy of Nutrition and Dietetics (87% white women), who heartily endorsed the 2010 Dietary Guidelines, which reinforced and reiterated 30 years of low-fat, high-carb dogma despite the Harvard-based science that demonstrated that it offered no benefits to women. (Interesting tidbit: The Academy of Nutrition and Dietetics has elected two male presidents in the past decade despite the fact that men make up only 5% of the membership. My husband thinks the organization has “daddy issues.”)

In 2010, the American Medical Association recommended that women of normal weight (that’s less than 40% of us, by the way) who wanted to stay that way “while consuming their usual diet” (i.e. low-fat, high-carb) would have to exercise for an hour a day.

[Other reassuring conclusions from that study: There was an overall weight gain over the 13-year time frame. Exercising for anything less than 7 hours per week was associated with weight gain over time. If a woman was already fat, increased exercise was more likely to be related to increased weight than weight loss.  If these messages don't scream to women all over America, "GIVE UP NOW!!!" I don't know what would. By the way, those of us who go out and skip and jump and run because we like to and it makes our hearts truly happy are not exercising. We're playing. I love to wave at those women from my couch.**]

But let’s get back to that hour a day for just a second.

Take a look at a recent study by Dr. David Ludwig, out of Harvard. It demonstrated that people who had recently been dieting (something that would apply to almost every woman in America), and were eating a low-fat diet, had to add an hour a day of exercise in order to keep their “calories in, calories out” balanced, while those on a very low-carb diet expended that same amount of energy just going about their business.

What if all the women in America who have been unsuccessfully battling their bulge woke up tomorrow morning and said:

I want my hour a day back?

Those of us who do not want to exercise for an hour just to maintain our weight, or for whom exercise isn’t doing a damn thing except making us hungry and cranky and tired while we gain weight, don’t have to. Instead, we can eat fewer of those USDA/HHS/dietitian-pushed, nutritionally pathetic, low-fat whole-grain carbohydrate foods and more truly nourishing food, and do whatever we please with that extra hour.

Who knows what changes we can make to a world that desperately needs our help when around–ooh let’s just say–50 million adult women in America have an extra hour a day? That’s an extra 365 hours a year per woman, an extra 18 billion hours of womanpower a year total.
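The womanpower arithmetic above is easy to sanity-check; here's a quick back-of-the-envelope sketch using the post's own round numbers (50 million women, one reclaimed hour per day, 365 days a year):

```python
# Back-of-the-envelope check of the reclaimed-hours estimate.
# The 50 million figure is the post's own round number, not census data.
women = 50_000_000               # adult women in America (illustrative)
hours_per_woman_per_year = 365   # one reclaimed hour per day

total_hours = women * hours_per_woman_per_year
print(f"{total_hours:,} hours per year")  # 18,250,000,000 hours per year
```

A bit over 18 billion hours a year, which matches the figure in the text.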

We could stop exercising and start playing. Stop counting calories and start enjoying feeling nourished. Start putting the love back into our food and embracing the bodies we have and the bodies of the men, women, and children all around us. I know that some of us would find that hour well spent just napping. Others of us might use that hour to figure out how to dismantle the system that stole it from us in the first place.

I can bring home the bacon, fry it up in a pan. And eat it.


In my own personal celebration of Asskicking Women of Food, I think (I hope) my next post will be:  The Grande Dames (Goddesses? Queens?) of Nutrition

*Thanks to Gingerzingi for bringing this to my attention.  What a great essay–look for it in a collection entitled Outrageous Acts and Everyday Rebellions.

**I have absolutely nothing against activities that bring inner/outer strength and happiness.  But exercise in the 80s and 90s was not about being happy or strong–it was about punishing ourselves (feel the burn? seriously?) in order to win at a game–being in total control of everything in our lives from babies to bodies to boardrooms–whose rules were created within the very social construct we were trying to defeat.


1. Bentley A. Islands of Serenity: Gender, Race, and Ordered Meals during World War II. Food and Foodways. 1996;6(2):131-156.

2. Prentice RL, Caan B, Chlebowski RT, et al. Low-fat dietary pattern and risk of invasive breast cancer: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:629-42.

3. Beresford SA, Johnson KC, Ritenbaugh C, et al. Low-fat dietary pattern and risk of colorectal cancer: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:643-54.

4. Howard BV, Van Horn L, Hsia J, et al. Low-fat dietary pattern and risk of cardiovascular disease: the Women’s Health Initiative Randomized Controlled Dietary Modification Trial. JAMA. 2006; 295:655-66.

5. Howard BV, Manson JE, Stefanick ML, et al. Low-fat dietary pattern and weight change over 7 years: the Women’s Health Initiative Dietary Modification Trial. JAMA. 2006; 295:39-49.

6. Sijtsma FP, Meyer KA, Steffen LM, et al. Longitudinal trends in diet and effects of sex, race, and education on dietary quality score change: the Coronary Artery Risk Development in Young Adults study. Am J Clin Nutr. 2012 Mar;95(3):580-6. Epub 2012 Feb 1.