The Real Paleo Challenge Redux

The original title of my presentation for the Ancestral Health Symposium 2013 was:

But now I feel like it should be more like:

What’s going on in Paleoland? Well, you can see Melissa McEwan’s take on it here, or itsthewoo’s take on it here. My concerns about paleo are wrapped up in the presentation below, and going into AHS 2013 I was more than a little nervous about saying what it is I wanted to say. See, I don’t consider myself “paleo” (or “low carb” or “insert whatever diet therapy you think I adhere to here”); I consider myself a nutritionist, a public health professional, and a work in progress. I do recognize the fact that a lot of people who do consider themselves “paleo” attend AHS–and I consider a lot of them my friends and colleagues. While I see promising things in the group of people who have chosen a paleo path, I also agree with a great deal of what both Melissa McEwan and itsthewoo have to say. (I admit to some sadness over the demise of Paleodrama. Other people binge-watch House of Cards. Me, after a long week of rhetorical theory and critical studies, I would grab a tumbler of sangria and binge-read Paleodrama. To each her own.) The presentation would, I hoped, put some of the “issues” that I see happening in Paleoland on the table, without throwing out the potential for paleo to grow into something more than itself. Well.

Without further ado, here’s the presentation as it was in August. Updates and commentary that did not appear in the original are in [brackets].

It is an honor to be here at AHS and I am delighted to be in such esteemed company. I hope that I can bring to our conversations this weekend a little something to offend everyone.

The primary misconception that I deal with in public health nutrition is that our current policy is the same thing as science. Conversely, a primary misconception regarding reforming this policy is the idea that “If only we could get the right information to the public and to policymakers, things would be different.” Having the evidence to support a movement’s agenda is important, but public perceptions and national policies are shaped as much by social, political, and cultural forces as by science.

As we have seen in other movements, cultural change drives policy change, which in turn drives cultural change. The current mainstream definition of what constitutes a “healthy” diet is an excellent example of this. At one point in the not-too-distant past, a low-fat, low-calorie, plant-based diet was considered a “fad” – just as the stereotypical paleo diet is today. But it was not science alone—or even primarily—that shifted the public’s perceptions.

In fact, the science supporting this dietary guidance has been and remains weak, but that didn’t stop it from becoming policy. George McGovern’s Senate Select Committee, a group of young white liberal men full of well-meaning social concern, wanted to create a plan to reduce chronic disease (a reasonable public health goal), as well as lengthen the lifespan of their committee. They did their work against a backdrop of post-World War 2 wealth, comfort, and suburban complacency that was rapidly crumbling in the face of social movements that would polarize the population: civil rights, women’s liberation, and anti-war protests. Television brought bombings, riots, assassinations, and Watergate into middle class living rooms and shook middle class faith in government and social order. Middle class complacency was quickly turning into anxiety and cynicism.

Some of this anxiety took shape specifically around matters related to food and health. Ancel Keys taught the public his theories about heart disease–a “disease of success” brought on by too much animal fat. Rachel Carson raised awareness of environmental toxins. Ralph Nader and the Center for Science in the Public Interest raised the alarm about chemicals in our food supply put there by corporate greed—a force which also was accused of contributing to hunger in America. Many groups, from feminists to Beatles fans, picked up on these issues—along with ethical concerns about animal welfare—by turning towards vegetarian diets. McGovern’s committee—as they said back then—was hip to all of this.

This is clear in their choice of reference material for the Dietary Goals, which included—of all things—a cookbook called Diet for a Small Planet. As much vegetarian manifesto as a source for recipes, it proposed that a plant-based diet was the best way to feed the hungry, save the Earth, protect our health, and usher in the Age of Aquarius. [It still does.] This cookbook assured middle-class America that what was good for us was also good for the world. Its influence is felt throughout the 1977 Goals, which counseled Americans to reduce consumption of meat, eggs, butter, and full-fat dairy, and increase intake of grains, cereals, and vegetable oils, recommendations that have changed very little in nearly 40 years.

McGovern’s committee wanted to return America to a more “natural” way of eating—and what could be wrong with that? This “back to nature” stance earned the Committee the nickname “the barefoot boys of nutrition.” This “back to nature” idea not only recalled the “physical culture movement” that had long been a part of American life, it resonated with Puritan ethics that suggested that self-discipline and a little suffering—which Americans were going to need for such a radical change in diet—were a mark of moral goodness. Barefoot and back to nature, fresh air, sunshine and a little suffering—does any of this sound familiar?

Those initial Dietary Goals did not embed themselves in American culture based on the strength of their science—to say the least. They grabbed the attention of the media and the middle class because they played on the existential anxieties that cultural turmoil creates. They substantiated a notion that by changing their diets, Americans could control some of the frightening things in the world—hunger, pollution, disease. We could demonstrate just how much we cared about these issues, and we could do it from the comfort and safety of our own dinner table. We are still trying to do that even now.

Our current calls for reform in the areas of food, nutrition, and health reflect the same set of complex social problems, the same inescapable environmental problems, the same threats to our food supply that the creators of the 1977 Goals faced—only compounded by time, technological advances, and a distinct turn for the worse in the country’s (and the world’s) health.

The paleo community emerged as a protest against dietary guidance that seems to many to be scientifically shoddy, shallow, limited, and ineffective. The attention to calorie balance as the only way to maintain health seems to be especially—and unnecessarily—restrictive and unhelpful. But “paleo” in its stereotyped form takes a shape that is little different from the one to which it stands in opposition.

Both approaches to nutrition are stuck in the past in two primary respects:

First, both suggest a linear and mechanistic approach to the food-health relationship: “Eat this/don’t eat that and all will be well.”

Second, and more subtly, both approaches reflect the cultural values and social power of those doing the reforming, but may not reflect the realities of the most vulnerable in our population, the ones who might benefit most from genuine changes to the system.

People have been burnt once already by a “nutrition revolution” – they are confused, skeptical, and wary. They don’t want to get fooled again. Right now, paleo is not offering much that is truly revolutionary in terms of a new way to approach food and health. Unless and until we are ready to give up some of the same concepts that we criticize the mainstream approach for using–it’s really just “meet the new boss, same as the old boss.”

We can’t generate the outrage we need to change the public’s world view, because we have not decided what our own priorities are: Do we care only about our own food and health, or do we care about everyone’s food and health?

With regard to food, current nutrition policies are a barrier to the growth of local food systems.

Farmers have difficulty expanding the market for locally-produced animal products because of dietary guidance that limits saturated fat and cholesterol intake. Meanwhile the paleo community is in upheaval for days—weeks, months?—debating the worthiness of butter from cows that are only 90% and not 100% grass-fed. How can we support long-term sustainable growth in local systems when our own standards are incoherent and possibly unreasonable?

We want our meat, eggs, and butter to come from happy, healthy cows and chickens. But what attention are we willing to spare for the health and happiness of farm workers—or the workers up and down the food supply chain?

With regard to health, nutrition is a civil rights issue.

We don’t want our wellness determined by an arbitrary marker like LDL, but are we willing to go to bat for someone else whose wellness is determined by an arbitrary marker like BMI?

The paleo community spends its energy debating how various sugars and starches may or may not be paleo. This is fascinating, but will it help people with diabetes who are never offered an alternative to a low-fat diet—despite the science that demonstrates the benefits of carbohydrate reduction in treating this disease?

The current nutrition paradigm uses moldy datasets normed on white female healthcare professionals born during the first half of the last century to inform the dietary health of dark-skinned young males all over America. But is suggesting they return to their caveman roots any more appropriate?

These are huge issues—wicked problems—and we can’t fix them by replacing the old rules with some new ones. In order to be a leading force in the kind of social movement that might create authentic change in the system, paleo is going to have to move beyond the limited perspective that perpetuates many of the mistakes of the current nutrition paradigm.

I propose that we consider the idea of ancestral health—as distinct from “paleo”–as a way of framing food and nutrition reform to address both the cultural and the scientific limitations of previous approaches.

In terms of science, anthropology and evolutionary biology have shown us that diet is idiosyncratic and variable within and between populations, but not chaotic; there are certain nutritional requirements, but there are many ways to meet them.

Research into the human microbiome has shown us that we are not alone; and that the health of the microbial communities within and around us is a critical aspect of our own health.

Epigenetics, genomics, and other aspects of systems biology have begun to reveal the complexity of interactions between our genetic material and our environment, with food being a primary, but by no means the only, environmental exposure.

All of these concepts can and should be part of the ancestral health framework.

But as I said at the beginning, science is not enough. There are three critical components that turn a protest into a movement.

1) Development of widely-shared cultural norms, the violation of which is perceived as injustice. In order to develop those norms, we’re going to have to do some GROWING UP.

2) Development of a repertoire of actions that demonstrate that conditions can be altered. In order to create the sense of agency and change that we want, we are going to have to start DIGGING IN.

3) Development of dense social networks that can work collectively against a common target. In order to create these alliances, we are going to have to begin REACHING OUT.

Growing up for paleo—as for many things—will need to start with a little makeover. Like all good makeovers, this doesn’t mean abandoning the paleo identity completely, but it means looking—and moving—beyond it. There are precedents for this from other nutrition reform arenas.

For many people, hearing the term “vegan” brings a knee-jerk—and negative—reaction; the term “vegetarian” does not. People who promote a vegan diet know this and can frequently be found using the term “vegetarian” instead. So that’s a marketing strategy, and a fairly wise one.

Now, take the phrase “Atkins diet” which can also elicit a negative, knee-jerk reaction. But scientists who study such diets have learned to use the phrase “reduced-carbohydrate” not only for PR purposes, but because the phrase “Atkins diet” does not encompass the different approaches to carbohydrate reduction that scientists are interested in.

How about paleo? It also elicits a negative, knee-jerk reaction from many and calls up stereotypes of privileged white males eating big hunks of meat on a stick—even though, as Hamilton Stapell showed us, those stereotypes may be somewhat inaccurate. As such, the term “paleo” limits what we can expect to accomplish as a framing device for conversations about food, health, and lifestyle. From this point forward I will use the term “paleo” to refer to the stereotyped and limited perspective and “ancestral health” to refer to an expanded and comprehensive approach to food-health reform.

By shifting the shared norms of our community towards an ancestral health framework—rather than being limited to paleo—we can move beyond the outdated concepts that we share with the current approach to nutrition and the problems that they create. We can—if we choose to—use an ancestral health framework to challenge those assumptions in a truly radical way.

[What follows is what I call the Top Ten Reasons Paleo Pisses Me Off, but my hubby, ever the diplomat, said not to say that.]

[Reason 10:]  So let’s just get this out there: The first assumption we need to challenge is the one that equates body size with health, which is interesting since according to Dr. Stapell, both of these are primary reasons to become part of the paleo community.

Mainstream approaches indicate that overweight and obese Americans need to eat less and move more to achieve a healthy weight according to an arbitrary cut-off on a simplistic measuring tool.
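[An illustrative aside: the “simplistic measuring tool” here is BMI. Below is a minimal sketch of the entire calculation, using the conventional CDC/WHO cut-offs; the height and weights are hypothetical, chosen to show how a single kilogram can move a person from “normal” to “overweight.”]

```python
# Illustrative sketch only: the standard BMI formula and the conventional
# CDC/WHO category cut-offs. The example height and weights are hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / (height_m ** 2)

def bmi_category(value: float) -> str:
    """Map a BMI value onto the conventional cut-offs."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal weight"
    if value < 30:
        return "overweight"
    return "obese"

# Same hypothetical height, one kilogram apart, on either side of the 25 line.
for weight in (79.0, 80.0):
    value = bmi(weight, 1.78)
    print(f"{weight} kg at 1.78 m -> BMI {value:.1f} ({bmi_category(value)})")
```

[That one-kilogram jump across an otherwise unchanged body is exactly the kind of arbitrary cut-off at issue here.]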

The paleo approach suggests that maybe strong is the new skinny. Or maybe “strong” is just another superficial way of assessing another’s worth.

The problem is that attention to body size rather than health and functionality can lead to a moralizing and pathologizing perspective that doesn’t reflect reality. Not only can this approach foster disordered eating behaviors and judgment calls about food, character, and lifestyle choices, it tells us little about overall health. We have no way of knowing, looking at these two women (Brittany on the left and Jennifer on the right—no headless women here), who eats what kind of food, who is healthy now, or who is going to live a long and functional life.

Our challenge is to use the ancestral health framework to recognize that a multiplicity of body shapes can be healthy and functional, and to acknowledge that much of body shape and size is determined genetically and can be influenced by factors other than diet and exercise. De-emphasizing body shape/size brings our focus to health, and especially for women, inter-generational health.

Women can—and do—have bellies, butts, and bingo flaps. Sisters who rock the paleo hardbody look—more power to you. Sisters who are more the Venus of Willendorf type—more power to you too. We can all meet at the pool and compare muscles & bra sizes & bingo flaps—and just get over ourselves and any fear of somebody tweeting about our butts.

[Reason 9:] Growing up also means moving beyond the idea that food and nutrition are the same thing.

Typical nutrition guidance discusses food as if all food choices are based only on nutrition.

Yeah, we tend to do the exact same thing.

The Problem: People are concerned about a lot of other things besides nutrition. Usually cost, convenience, and taste come first–

–followed by a host of other considerations, only one of which is nutrition.

An ancestral approach to food can embrace all of the factors that impact our food choices because it can look at food in its cultural—as well as biological—context. It can highlight the role of environmental stressors in overall health–including economic and time pressures that also impact food choices. Acknowledgement of food communities allows us to explore the role food beliefs and preferences play in food choices; these too are part of an anthropological and evolutionary perspective on food-health relationships.

[Reason 8:]  We need to move past the idea that food is medicine.

Mainstream nutrition has promised that a low-saturated-fat, low-cholesterol, low-calorie, low-sodium, whole-grain diet will prevent chronic illnesses like heart disease, cancer, and diabetes.

Us: Same promise, different food.

Now, I’m not going to say that the paleo paradigm doesn’t have some better biochemistry behind it; in many [but not all] respects, it does. The problem is that food is still not medicine.

A nutritionally-appropriate diet should be the foundation of good health, but it doesn’t guarantee it. Both groups are making promises they can’t keep & this leads to skepticism, cynicism, and disillusionment. Most importantly, this framework takes a complex social construct and a biological necessity—food—and reduces it to a mechanistic and simplistic intervention–medicine.

Medicine is for sick people and food is for everyone. We may use food as part of a therapy to “heal” a particular condition at a particular point in time, but that is not the same thing as a public health paradigm. We put casts on broken legs, but we don’t recommend that everyone wear casts in order to prevent legs from breaking.

An ancestral health approach offers an opportunity to move away from the view of the human condition as one of potential “illness” to be “avoided” to one of wellness to be maintained.  By focusing first and foremost on essential nutrition—and the many appropriate ways that it can be acquired–the emphasis is on having health, not preventing chronic disease. The recognition of the complexities of what we know and don’t know about the relationships between food and health brings into the public health forum other important aspects of lifestyle—sleep, stress, play, activity—that can contribute to health and well being.

[Reason 7:] There is no small irony in the fact that both plant-based and paleo ideology emphasize a return to “a more natural way of eating.” How does that happen? Because the notion of “a more natural way of eating” is not something that is easy to define. [More generally, the emphasis on a more "natural" way of doing things is a rhetorical device that implies "goodness" and fails to evaluate the issue at hand on its own terms.]

Mainstream nutrition suggests that returning to a “more natural” diet means eating a lot of foods that our ancestors DIDN’T eat—either in the near or distant past—like vegetable oils, and avoiding a lot of foods they DID eat, like butter, eggs, meat, and lard.

Paleo suggests that returning to a “more natural” diet means NOT eating a lot of foods that our ancestors DID eat—at least in the not too distant past—like bread, legumes, and dairy, [and eating a lot of foods they DIDN'T eat, e.g., coconut milk, unless your ancestors were Thai].

The problem is that “natural” is a term useful for marketing, but not much else. It isn’t a scientific concept, or even one that makes a lot of sense culturally. We don’t really have a lot of solid information about what was “natural” for our distant ancestors—and the gene/environment interactions that may have occurred since then may make that information less relevant than how our more-recent ancestors lived, ate, and worked.

Here’s our challenge: Ancestral health principles got their start by focusing on paleolithic times—and that perspective is a valuable one—but we don’t have to be limited to that. An ancestral health framework can also allow us to look to the near-past for clues about our health now, should we choose to. Here’s the beauty of this approach: It’s already been sanctioned by mainstream nutrition, and by two of the leaders in nutrition reform, Michael Pollan and Gary Taubes.

In his landmark 1985 article, Sick Individuals and Sick Populations, epidemiologist Geoffrey Rose called for “The restoration of biological normality by the removal of,” among other things, “recently-acquired dietary deviations.” Gary Taubes cites Weston A. Price’s work on the health impacts of introducing new foods into native diets as the “most influential” thing he read in researching Good Calories, Bad Calories. Michael Pollan’s suggestion that we eat the way our great-grandparents ate has become a rallying cry for many people interested in food reform.

[The pie chart above] is a pretty reasonable picture of an “ancestral diet” from 1955 America: we got about half of our calories from plant-based starches and sugars—only 10% of those as fruits and vegetables—and about half from mostly animal-based proteins and fats. I’m not saying this is a perfect diet, but it does seem to be the one we were eating before the rapid rise in obesity and diabetes.

An ancestral framework can help us analyze how this food environment may be similar to or different from our current one, without having to invoke a past that didn’t exist, as the plant-based folks must in light of this information—or a past that is so distant that it’s hard to say what we really know about it [as the paleo folks must]. On the other hand, the 1955-style 50/50 diet looks remarkably familiar. It’s not that hard. Or is it?

[Reason 6:] Well, we make it hard by invoking food rules that don’t always make a lot of sense. Everyone’s current favorite, on all sides of the nutrition issue, is: Avoid processed foods.

Michael Pollan says avoid processed foods unless you are talking about vegetable oils.

Paleoista says avoid processed foods unless you are talking about hydrolyzed fish protein powder.

Problem: Food rules mean splitting hairs, drawing lines in the sand, and creating arbitrary divisions—and they usually end up making the food rule makers look silly at best and hypocritical at worst. Food rules are the easiest things to dismiss, discount, or disprove. We’re already enmeshed in a set of arbitrary, unreasonable, and incoherent standards [called the Dietary Guidelines for Americans]; no one is interested in a new and different one.

Skip the food rules. What we need are guiding principles from an ancestral health perspective that can apply to individuals, industry, and policymaking processes. For instance, if we frame concerns around the “recently acquired dietary deviations” I just mentioned, we have a guiding principle—upon which Geoffrey Rose, Gary Taubes and Michael Pollan all agree—for looking at the current scientific literature and for conducting future investigations. We might go back a few generations or many generations; either way we can remain true to our generational perspective of health without limiting ourselves to a particular set of food rules.

[Reason 5:] The politics of responsibility are a no-win situation for the public.

Mainstream nutrition assures folks that, if the low-fat, low-calorie diet isn’t working for you, you’re not doing it right. Paleo people assure newbies that if the high-fat, no-calorie-counting paleo diet isn’t working for you, you’re not doing it right.

And when that logic doesn’t fly, both groups blame the “obesogenic” environment.

Problem: Both approaches assume that “If only that poor sick, fat person had the “right” food or the “right” information or the “right” environment, they’d stop being so fat and sick.” These approaches call for policy reforms that will force industry to make “the healthy choice the easy choice” for people apparently deemed too irresponsible or stupid to make the healthy choice otherwise. But industry is responsible to the public, not for the public. That’s the job of public health.

Challenge: An ancestral health approach recognizes that poor health may be as much an outcome of environmental impacts and generational health—especially prenatal health–as food choices and activity. This shifts the focus away from the politics of responsibility and puts the attention on food industry and policy reform where it belongs, not on a product—which the consumer may or may not choose—but on the processes over which consumers have little control: federal approval of food additives, food and farm workers’ rights, food safety and food waste, environmental impacts of our current agricultural practices, and many other food-related practices, programs, and policies that have been ignored in favor of telling people what to eat and do and blaming them when it doesn’t work.

[Reason 4:]  This one is a real “I’m rubber, you’re glue” thing. We complain about all those mainstream nutrition articles making sweeping generalizations about how animal fats will kill you—then we turn around and make sweeping generalizations about how vegetable oils will kill you. The vast majority of these claims—on both sides of the table–are unproven and even untested; in many cases they are untestable. [The science for both claims is primarily observational; other science may be experimental, but based on animal models and cell cultures. The few randomized, controlled dietary trials that exist are just that, highly controlled. The populations may or may not be generalizable to larger populations; the methods may or may not translate to the "real world."]

Problem is, we don’t know what we think we know about the relationships between diet and health. Plus, there’s a really good chance we will never know what we think we need to know about the relationships between diet and health.

Science and medicine as they have been practiced in America for the past half a century (or more) have relied on a mechanistic approach to these relationships that is now rapidly giving way to more complex thinking. The mechanistic approach has served the industries of research, medicine, food and pharmaceuticals–because what is simplified can be controlled–but it hasn’t served the health of humans.

Ancestral health principles can help us think about science differently. Nutrition science as it is practiced now is backwards looking—especially nutrition epidemiology which relies upon ancient datasets gleaned from populations which are hardly representative of our current world. It ignores the complex relationships between ourselves, our environment, and our heredity that science has more recently uncovered. Despite its name, ancestral health represents a forward-looking framework. As an approach to public health, it can herald a shift to a more holistic, yet evidence-based focus that recognizes individual, community, environment, and generational impacts on health. Consider the ancestral health community’s active encouragement of n of 1 experimentation. It is a perspective that can go beyond Joe Paleo fiddling with his macronutrient ratios to a place of leveraging new biomedical technology, new ways of modeling complex relationships, and a new focus on patient-centered outcomes to create a revolution in how we approach the science of diet and health. This is not anti-science, but an embrace of science in all its complexity. Such an approach brings us to our biggest philosophical challenge:

[Reason 3:] Can we acknowledge that one diet will not be right for everybody?

Right now, mainstream nutrition asserts that everyone will benefit from eating a low-fat, low-calorie diet.

At the same time, the paleo community asserts that everyone will benefit from eating a paleo diet.

The problem with a top-down, unilateral imposition of one-size-fits-all dietary recommendations is the same as it was in 1977: Who asked you to come up with a diet for me that might or might not help prevent a condition that I may or may not be concerned about? Remember that a skeptical public doesn’t want to get fooled again. New arrivals to our country, who aren’t yet aware of the abysmal failure of our current nutrition system, are being greeted with admonitions to give up traditional foods like eggs and meat—but then paleo doesn’t have a much different message to offer, except that instead they should give up traditional foods like bread and beans.

Ancestral health principles embrace the notion of change. Ancestral health acknowledges complexity. It only makes sense that an ancestral health approach to public health would recognize diverse paths to acquiring appropriate nutrition, with a focus on foods high in nutrient value, and frame dietary information in terms of the diversity of individual, cultural, environmental, and generational contexts. But will it?

[Reason 2:]  Many of the assumptions I’ve mentioned are deeply embedded in our thinking, and reflect the concerns, values, and social power of the mostly white, well-educated, well-paid, predominantly female thirty-somethings who make up the paleo community. Not that there’s anything wrong with that—information from other datasets has shown that white, well-educated women are also the ones who most closely adhere to the Dietary Guidelines food pattern, so the presence of this demographic in paleo may reflect an overall concern not only for weight and appearance, but for family and health. This is a good thing. This particular demographic also has a long history of being the backbone of successful social reform movements—from child labor to drunk driving laws.

But ladies—and gentlemen—we are going to have to do more than vote with our forks or food dollars.

Both paleo and plant-based reform efforts seem to believe that your financial support of the food you’d like to see other people eating is the best way to change the food-health system. You can just munch your way to a better world without ever having to encounter anyone who doesn’t appreciate the change you’re creating for them.

For paleo eaters, increased demand may increase production, making some foods more affordable for some people. It may support some farmers—as long as they keep up with and adhere to all of the “appropriate” [and possibly contradictory, unrealistic, and/or absurd] paleo food rules—but it isn’t necessarily going to change the status quo for the most vulnerable in our population, the ones most subject to the effects of dietary policy as it stands now. Me buying my eggs locally doesn’t help the low-income mothers who would like to spend their federal assistance farmers market vouchers on local eggs too, which they are not allowed to do. Face it, in the “vote with your food dollar” approach, some folks have a lot more votes than others. Changing your diet is not enough to change the world. We are going to have to put down our forks and dig in.

One of the things any successful social change effort has is a story, where the victims of injustice can be rescued from evil by the heroes. A successful social change effort also has a way for everyone—from individuals to the government—to be a hero. This takes the form of a repertoire of actions for changing conditions. These concrete actions give a sense of agency and urgency to the cause; they say to the world: come join us, we are being the change we want to see.

Being a hero and acting from a place of our own food-health values, however, does not mean going out into the world and trying to impose those values on someone who hasn’t asked for our help. Instead, it means sharing the privilege of health we have in a useful way [and this is a privilege based much more on social class than diet], so that others may have the food and the health that they want—just as we wish to have the food and health that we want. How can we do that?

For example: An ancestral health framework recognizes the importance of protein as essential to a nutritionally-adequate diet. But protein is also the single most expensive food source to provide to the less fortunate. Because it is so expensive, protein is also the food source most lacking in the diets of those most in need.

The state of Illinois has established a program to encourage hunters and anglers to donate deer and Asian carp—which is an invasive species in the Great Lakes–for processing into healthy, ready-to-serve meals. I don’t know what their standards for that are, but if you work to build a similar program in your area—or maybe you’ll head up a protein food drive for a local shelter–you get to help set the standards, remembering that the goal is not necessarily following all the “right” food rules, but feeding the hungry essential nutrition.

[A number of states have programs--with various names, but often called "Hunters for the Hungry"--that bring hunters, processors, state inspectors, and hunger relief organizations together to help supply sources of all-important high-quality protein to those in need.]

Community level programs can ripple outward and upward – and if they are organized with an ancestral framework in mind, those ideas ripple outward and upward as well.

Farm to Family initiatives bring food from local farmers to local, low-income families at prices they can afford—an effort that supports local farmers as well as community members at risk for hunger and poor nutrition. These initiatives typically focus on fresh produce, but some include meat and eggs—and wouldn’t the world be a better place if even more of them did? College students with mad social networking skills can mobilize volunteers and connect resources to get the program off the ground. Local public health agencies and faith-based organizations can raise awareness so that families at highest risk can be reached—and so their wants and needs can be heard and honored. Individuals and families can donate time and money, while businesses can facilitate logistics with donations of materials or space. Feedback from the community can support policy change at local, state, and federal levels.

The ancestral health community has the sort of talent to pull an effort like this off, but it involves not just getting out of the house, but getting out of our comfort zones.

[Reason 1:] The lack of diversity that often comes with being part of a community of like-minded people presents both an epistemic challenge and a logistical one. It can lead not only to closed minds, but to closed doors. Being able to act from a place of ancestral health principles—rather than paleo rules—can make it easier to reach out to others–the final thing needed to build a social movement.

Confirmation bias has been a pervasive aspect of mainstream nutrition, and in opposition to it, paleo culture often seems to have adopted a similarly insular stance. It can be reinforced by influence and funding, but most often it is simply a way of not being challenged in our own beliefs.

In mainstream nutrition, the USDA and HHS write the Dietary Guidelines. They also finance the research and the experts that they later choose for their “evidence-based analysis” of these guidelines, so it’s no surprise that both the research and the experts support the status quo.

Paleo leaders also have a vested financial interest in being paleo leaders—books, speaking engagements, products, and other various funding streams—just as paleo followers have an interest in remaining comfortable in their chosen ideology. We support our leaders; they tell us what we want to hear.

This problem, also known as epistemic closure, echo chambers, or a circle jerk, is that these positive feedback loops end up welcoming only people that think exactly like the people already in the group. Sadly, the smarter you are, the better you are at confirming your own beliefs about things—and we have a lot of wicked smart people in the paleo community. Unfortunately, circle jerks quickly turn into cluster, let’s call them “efforts” – where the circle of closed thinking causes the very problems that the circle of closed thinking is unable to address exactly because of its closed nature. Which is sort of where we are now—both in mainstream nutrition and in paleo.

Much of mainstream nutrition has built-in alliances with academia, industry, advocacy groups, and policymakers. In order to make our voices heard, we will need to establish connections with other communities who will work with us on common issues. The general rule in building networks of alliances is that there are no permanent friends and no permanent enemies; everyone is a future ally. You work together on issues and projects as long as your goals align.

This may make for strange bedfellows at times, but if we want to be more than a passing fad, we are going to have to reach out of our comfort zone and connect with other communities with whom we may not feel an immediate kinship but with whom we share some core values.

For example, the Health at Every Size community. This community has a strong presence in academic circles that look at feminist and diversity issues. While an alliance based on paleo thinking might not make sense, the ancestral health framework would have much in common with these Health at Every Size principles.

The Invest in Healthy Food Project being promoted by the Union of Concerned Scientists uses MyPlate as its nutrition reference point. Icky, right? But a closer look shows a focus on policy change that is fully compatible with ancestral health principles. Specifically, it cites the need for changes to commodity crop policies and crop insurance that would benefit the local farmers that we support.

Other communities with whom we are likely to have some common objectives are: other alternative food movements–yes, including vegans; sustainable agriculture and permaculture communities; government accountability groups; and hunger groups. We don’t have to agree on everything, just our shared goals. We can learn from them and they can learn from us.

We can reach out to foundations, the media, professional organizations, and faith-based communities. And it doesn’t have to be on a national level. We can find influential allies in these groups in our own local communities.

And in fact, that’s where I would urge us to start. As a community, we exist both nowhere and everywhere—which can make us feel more at home at places like AHS than we do in our own towns. But, to quote Rick Ingrasci, if you want to create a new culture—throw a better party. One of the wonderful traditional things we do as humans is celebrate and build community with food—but it’s hard to celebrate if you are busy agonizing, analyzing, and criticizing your—or your neighbor’s—food. We have the opportunity to NOT be those nutrition reform people.

I’m going to end with a story about last year’s Food Day in Durham, NC. This is sponsored by the Center for Science in the Public Interest, which operates from a plants-are-better, saturated-fat-kills perspective. At an organizational meeting last year, a room full of young white women—and one white male—were busy wringing their hands over the lack of diversity at last year’s Food Day events. Now Durham is a very diverse little city. In Durham, we talk more about race than NASCAR fans talk about racing. But Food Day tends to be an almost all-white event involving mostly college kids from Duke rather than people from the community. Why oh why is that? these ladies (and one gentleman) wanted to know. I suggested that maybe it’s because no Food Day events serve meat—and there are lots of local meat, egg, and cheese producers that we could support by promoting their foods. These women looked at me as if I had just created a loud, legume-based bodily emission—and the topic was never mentioned again.

Well, we can throw a better party. We can appeal to a wider, more diverse, and inclusive community. It will mean growing up, digging in, and reaching out. But there are plenty of people out there who are hungry for a sense of identity, for connection, and for change. Ancestral health as a social movement can serve that purpose, as well as serve our communities—and we can serve it with a side of bacon.

The “thank you” slide is my shout-out to those who have helped me think about the issues I’ve raised.

Laura Schoenfeld @ Ancestralize Me!

Beth Mazur @ Weight Maven

Melissa McEwan @ Hunt Gather Love

Robert Patterson @ Michael Rose’s 55

Chris Masterjohn @ The Daily Lipid

Doug Imig – The Urban Child Institute

Andrew Abrahams – Long Dream Farm

Michael Ostrolenk – The Transpartisan Center

Postscript: At some point during the AHS 2013 weekend, I pulled Aaron Blaisdell aside and asked him what the deal was with paleo and AHS. Here’s his response as I remember it (and I hope he will correct me if I misrepresent him). He said something to the effect of: AHS is about bringing an evolutionary perspective to health, including but not limited to matters relating to diet and nutrition. Darwin’s evolutionary perspective has been an incredibly powerful tool in other areas of biology for understanding why things are the way they are and for formulating hypotheses and testing them out, but it is often neglected when it comes to health, particularly in matters of food and diet. AHS is about promoting that perspective, not about promoting a particular diet. [See Aaron's comments below for an expansion on this.  Note to self:  Drink that glass of wine after you ask Aaron Blaisdell questions like that.]

I heaved a big sigh of relief. “Paleo” I can do without–just as I can do without all of those other conveniently-labeled approaches to diet and health with massive cognitive bias blind spots: vegan, vegetarian, low-carb, low-fat, “eating the food,” whatever, whatever (although I’m happy for the people who find that being part of those communities gets them on a path to health that works for them). So I guess this is my massive cognitive bias blind spot. I still love those AHS folks.

Make me some science I can’t refuse

In case you missed it, in a recent article published in the American Journal of Preventive Medicine entitled Overstatement of Results in the Nutrition and Obesity Peer-Reviewed Literature (not making this up), the authors found that a lot of papers published in the field of obesity and nutrition have, shall we say, issues.

Well–as they say down South– I never!

The authors looked at over 900 scientific articles on nutrition or obesity published either in 2001 or 2011 in leading journals. They found that about 1 in 11 include “overreaching statements of results.” 

Here’s how the authors described statements that would be coded as “overreaching”:

  • reporting an associative relationship as causal
  • making policy recommendations based on observational data that show associations only (e.g., not cause and effect)
  • inappropriately generalizing to a population not represented by the sample studied
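To make the first of those criteria concrete, here is a toy sketch of my own (not the study authors’ actual coding protocol) that flags causal wording in the conclusions of observational studies; the two example sentences are paraphrases of the Willett/Manson and Flegal conclusions discussed below.

```python
# Toy illustration only, not the coding protocol used in the AJPM paper:
# flag an observational study's conclusion if it uses causal language.

import re

CAUSAL_PATTERNS = [
    r"\bcauses?\b",          # matches "causes" and "is a major cause of"
    r"\bleads? to\b",
    r"\bresults? in\b",
]

def flags_overreach(conclusion: str, study_design: str) -> bool:
    """Return True when causal wording appears in an observational study's conclusion."""
    if study_design != "observational":
        return False
    return any(re.search(pattern, conclusion.lower()) for pattern in CAUSAL_PATTERNS)

# Paraphrased examples (see the Willett/Manson and Flegal discussion below).
examples = [
    "Obesity is a major cause of excess morbidity and mortality.",
    "Overweight was associated with lower all-cause mortality.",
]
for sentence in examples:
    print(flags_overreach(sentence, "observational"), "-", sentence)
```

A real content analysis is obviously more careful than a keyword grep; the point is just the causal-versus-associational distinction that the rest of this post turns on.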

Frankly, I am totally offended. Someone needs to let these folks know that, in nutrition epidemiology, correlation actually does equal causation.

What’s more, nutrition policy recommendations are supposed to be based on observational data. Hello? Dietary Guidelines? (Seriously. You don’t expect public health nutrition people to do actual experiments now, do you? I mean, unless you are talking about our population-wide, no-control-group, 35-year experiment with low-fat diet recommendations, but that’s different.)

And we don’t mind generalizing conclusions to Everyone in the Whole Wide World based on data from a bunch of white health care professionals born before the atom bomb because, honestly, those are the only data we really care about.

Equating correlation and causation, over-generalizing observations, and then using these results as the basis of policy is the bread (whole wheat) and butter (substitute) of nutrition epidemiology of chronic disease (aka NECD – pronounced Southern-style as “nekked”). NECD has a long proud tradition of misinterpreting results this way, and dammit, nobody is going to take that away from us.

Early NECD researchers tried to tentatively misinterpret results by obliquely implying that observed nutritional patterns might perhaps have resulted in the disease under investigation. Wusses.

In 1990, Walter Willett and JoAnn Manson came along to show us how the pros do it. These mavericks were the ones who made bold inroads into the kind of overreaching conclusions that made NECD great. Their data come from an observational study of female registered nurses from 11 states in the US, born between 1921 and 1946, who were asked to remember and report what they ate 4 whole times between 1976 and 1984, plus remember and report what they weighed when they were 18 years old. From this dataset, which is clearly comprehensive, and this population, which is practically every female in the US, Willett, Manson and company naturally conclude that “obesity is a major cause of excess morbidity and mortality from coronary heart disease among women in the United States” (emphasis mine). None of this wimpy “associated with increased risk of” bullshooey, obesity CAUSES heart disease, they tell us, CAUSES IT!!!! BWHAAAHAAAAA!!!!!!!

It is on this foundation of intrepid willingness to misinterpret data that the science of NECD was built. This is why Walter Willett is the Big Kahuna at the Harvard School of Public Health. He has demonstrated the courage to misinterpret data in innovative and comprehensive ways, publishing articles throughout his career that indicate that even small increases in BMI—including BMI levels that are currently considered “normal”–cause chronic disease.

In 1999, in what is considered a landmark article in overstatement, one with which all NECD acolytes should familiarize themselves, he states unequivocally, in a review of observational data:

“Excess body fat is a cause of cardiovascular diseases, several important cancers, and numerous other medical conditions . . . “ (my emphasis). Hmmmm. Observed associations reported as causal? Ding!

The rest of that sentence reads: ” . . . and is a growing problem in many countries.” His data is once again gathered mostly from American white health care professionals born before the atom bomb. Generalization from specific populations to the rest of the world? Ding ding.

And what should we do with this conclusion, according to Willett? “Preventing weight gain and overweight among persons with healthy weights and avoiding further weight gain among those already overweight are important public health goals.” Using observed associations to make policy recommendations? Ding ding ding. In one fell swoop, Willett dexterously manages to use all three designated methods of overstatement and misinterpretation in the NECD toolbox, demonstrating why he is considered by most researchers to be “the ‘father’ of nutrition epidemiology.” This man overstates and misinterprets in ways that the rest of us can only dream of doing.

Sadly, some epidemiologists have failed to follow in Willett’s footsteps. In January 2013, Katherine Flegal, an epidemiologist at the Centers for Disease Control and Prevention and the woman who first noted the remarkably rapid rise in obesity that began in the decade following the release of the 1977 Dietary Goals for Americans, published results that concluded that being overweight (or even mildly obese) is associated with a lower risk of death. At no point in her article does she suggest that overweight or obesity results in increased lifespan.

The response from Harvard? Walter Willett calls Flegal’s article “a pile of rubbish” and insists that “no one should waste their time reading it” and rightly so. Why would anyone want to hear about “associations”? What kind of nonsense is that? Obviously Flegal lacks the professionalism it takes to make the leap from observation to causation.

But that’s okay. Willett and the Harvard Family know how to deal with this sort of thing.

“Someday, and that day appears to have come, I will call upon you to ignore the work of other scientists when their results contradict my own.”

Let’s face it, in the world of NECD, you can’t just have people like Flegal refusing to infer causation from observed results, just because they don’t want to. When that sort of thing happens, well, let’s just say, if she won’t do it, the Harvard Family will have to do it for her. And so they did.

In February 2013, Willett and company convened a Harvard Family gathering to, in their words, “elucidate inaccuracies in a recent high-profile JAMA article [i.e. Flegal's] which claimed that being overweight leads to reduced mortality” (emphasis mine). Which it didn’t–except now, voila, it does. It’s not personal, Dr. Flegal. It’s strictly science.

The Family get-together was held at the Harvard School of Public Health, a “neutral convening space” that is also ground zero for the Nurses’ Health Study I and II, the Physicians’ Health Study I and II, and the Health Professionals Follow-Up Study, three datasets that have generated many NECD articles that, unlike Flegal’s article, brilliantly illustrate the powers of misinterpreting observational data. That Flegal herself was invited but “could not attend” tells us just how ashamed she must be of her inability to make over-reaching conclusions–or perhaps she was temporarily “incapacitated” if you know what I mean.

The webcast from the meeting shows us how NECD should be done, with dazzling examples of overstatement and marvelous feats of misinterpretation.

In the world of NECD, PowerPoint arrows are a scientifically-acceptable method of establishing causation.

In her shining moment, Dr. JoAnn Manson, demonstrating that she has learned well from Willett, points to the slide above and asks: “How is it possible that overweight and obesity would cause all of these life-threatening conditions, increase their incidence, and then reduce mortality?” How indeed???

The panelists highlighted the importance of maintaining clear standards of overstatement and expressed concern that Flegal’s research could undermine future attempts of more credible researchers to misinterpret data as needed to protect the health of the public.

Because that’s what it’s all about, folks: protection. Someone needs to protect the science from renegades like Flegal, and someone needs to protect the public from science.

We should be thankful that we have Willett and the Harvard Family there. They know that data like Flegal’s can only confuse the poor widdle brains of Americans. Allowing us to be exposed to such “rubbish” might lead us to the risky conclusion that perhaps overweight and mild obesity won’t cause all of us to die badly, or to the even more dangerous notion that observational data should remark only upon association, not causation. And we sure don’t want that to happen.

As Don Dr. Willett says, “It is important for people to have correct information about the relationship between health and body weight.” And when he wants us to have the correct information about the relationship between health and body weight, he’ll misinterpret it for us.

Take the science, leave the cannoli.

Never Too Early to Learn about Lowfat?

I am pleased to welcome Pam Schoenfeld MS RD as a guest blogger.  Pam has been an inspiration to me for many years; she was the person who convinced me I could go back to school and get a degree in nutrition.  She works with individuals and families in a clinical setting and has become increasingly concerned about the messages about “healthy food” that are being given to young children.  Although the science on the dangers of dietary saturated fat and cholesterol has been hotly contested for decades, mainstream nutrition is now targeting preschoolers with messages about the evils of eggs and whole milk.  Pam shares her experiences and insights on this issue.

As for me, I love my new PhD program in Communication, Rhetoric, and Digital Media, but it is kicking my butt.  Who knew critical cultural theory was harder than biochemistry?  I’m glad to have someone take over blogging duties for me as I am sleep-deprived, overworked, overwhelmed–and happier than a pig in slop.  Pam promises more to come, so stay tuned.

Samantha was in most ways a typical patient, slightly round in the middle but otherwise healthy. She knew a few things about healthy eating; she ate cereal with nonfat milk for breakfast, and made an effort to eat her fruits and vegetables. When I asked if she liked eggs, she said “yes, but only the whites.” When I asked why, she answered rather matter-of-factly, “the white is better for you than the yellow.” Her answer came as a surprise to me. Many of my patients are still concerned about eating egg yolks, but Samantha was only eight years old! Already she had somehow internalized the message that certain foods were best to avoid if you want to be healthy. She did not know that egg yolks were high in cholesterol; just that they were not good for her to eat. She did not know that nonfat milk was low in saturated fat; just that it was what her family always poured on their cereal and what she drank at school.

Despite the lack of science to support the claim, kids are increasingly being given the message that fat-free milk–even the kind with added sugar–is the healthiest choice.

The avoidance of egg yolks and the choice of low-fat or nonfat milk are so common among my patients that if she were just 5-10 years older, her answer would be completely expected. But she was so young, so eager to do the right thing, and yet so unaware that some of what she was being taught about nutrition was not evidence-based. So while I helped her plan meal choices that better met her needs, I gave her my best third-grader explanation about why whole eggs are actually one of the best foods she could eat. I left the subject of dairy fat for another day, as I suspected her mother and pediatrician wouldn’t agree with my view on this and I wanted to ensure that Sam would continue to see me.

I had assumed that these nutrition messages are so prevalent in our adult culture and media that young children simply absorb them by osmosis. But it turns out that more and more children as young as 3 are being targeted with nutrition information.

Nutrition information that adheres to controversial government dietary recommendations is targeting preschoolers with low-fat, grain-based dietary advice.

The latest issue of my professional dietetics journal arrived in the mail last month, and as usual, I skimmed the abstracts of poster presentations scheduled for the annual dietetics conference, aka “FNCE,” that will be held in Houston over the next 5 days. To my surprise, a few dozen of these abstracts described research on children’s diets, with eight reporting outcomes from programs targeted for preschool or grade-school age groups. (1) FNCE is by far the largest annual gathering of registered dietitians; it is at this venue that many RDs become informed on the research and practice recommendations in our field.

One group of researchers stated that because 75% of children are in organized childcare, it is the ideal setting for promoting healthy behaviors; a second group agreed that childcare settings are a prime environment for early intervention. I was reminded of my own dietetic internship, where I had to sing to Head Start pupils about the merits of low-fat milk while entertaining them with a cow puppet. Nutrition and health lessons directed to preschoolers are commonly delivered in the form of games and songs, but researchers are now studying the effectiveness of other methods.

I guess this should not surprise me, considering a 2011 Institute of Medicine (IOM) Report entitled “Early Childhood Obesity Prevention Policies.” The expert committee authoring the report stated “there is a growing awareness that efforts to prevent childhood obesity must begin before children even enter the school system.” Their “hope” is that this report will find its way to government policy makers who work in areas that impact young children in infancy and early childhood. In this report, nutritious and healthy foods for ages 2 and older are defined to be consistent with the Dietary Guidelines, which specify lean protein foods and low-fat or nonfat dairy products. (2)

Traditionally, the family has been the key environment where young children learn to develop eating habits and food preferences. But once children start school, teachers and peers gradually become the greatest influence. (3) Most people would undoubtedly be supportive of any initiative to educate young children about nutrition. After all, there has been an almost 2.5-fold increase in obesity in children ages 2-5 from 1980 to 2010, from 5% to 12.1% of this age group, and similar increases in older children. (4) So it would appear necessary to begin preventative measures at an early age.

Despite considerable evidence that shows that saturated fats are not linked to heart disease, the American Heart Association uses cartoon figures to teach kids that both whole (saturated) fats from animal products and transfats from processed oils are “the Bad Fats Brothers.”

It also appears that initiatives directed at young children are effective. In a recent study of 4-year olds given structured nutrition lessons in preschool, the children were able to correctly answer that “high-fat foods are bad for you and make you fat” even 5 months after the lessons ended. These lessons were only 10-15 minutes long, and the information wasn’t reviewed during the 5-month period, so the children’s ability to retain that lesson long-term indicates their receptivity to simple nutrition messages. (5) While it is unknown if the children consistently acted on this knowledge, it is clear they can and do retain simple “food rules,” even at the tender age of 4. The IOM committee would likely agree: “During infancy and early childhood, lifestyle behaviors that promote obesity are just being learned, and it is easier to establish new behaviors than to change existing ones.” (2)

So if childhood obesity is a huge problem and these early nutrition programs are effective in teaching children, why am I concerned? One reason is that the very same saturated fat- and cholesterol-containing foods that are negatively targeted in these nutrition lessons actually contain critical nutrients for growing children. Another reason is that if done improperly, nutrition lessons directed at children could easily pave the way for unhealthy relationships with food and issues with body image, among other unintended effects.

I will discuss these possibilities in more depth in upcoming posts, with further discussion on some disturbing recommendations from the IOM Early Childhood Obesity Prevention report.

References:

1. Journal of the Academy of Nutrition and Dietetics. 2013;113(9):A1-A120, suppl.

2. Institute of Medicine (IOM). 2011. Early Childhood Obesity Prevention Policies. Washington, DC: The National Academies Press.

3. Perez-Rodrigo C, Aranceta J. Public Health Nutrition. 2001;4(1A):131-139.

4. Ogden CL, Carroll MD, Kit BK, Flegal KM. JAMA. 2012;307(5):483-490.

5. Nguyen SP, McCullough MB, Noble A. J Educ Psychol. 2011;103(3):594-606.

As the Calories Churn (Episode 3): The Blame Game

In the previous episode of As the Calories Churn, we explored the differences in food supply/consumption between America in 1970 and America in 2010.

We learned that there were some significant changes in those 40 years. We saw dramatic increases in vegetable oils, grain products, and poultry—the things that the 1977 Dietary Goals and the 1980 Dietary Guidelines told us to increase. We saw decreases in red meat, eggs, butter, and full-fat milk—things that our national dietary recommendations told us to decrease. Mysteriously, what didn’t seem to increase much—or at all—were SoFAS (meaning “Solid Fats and Added Sugars”) which, as far as the 2010 Dietary Guidelines for Americans are concerned, are the primary culprits behind our current health crisis. (“Solid Fats” is a linguistic sleight-of-hand that lumps saturated fat from natural animal sources in with processed partially-hydrogenated vegetable oils and margarines that contain transfats; SoFAS takes the trick a step further, not only by being a dreadful acronym that implies poor health is caused by sitting on our “sofas,” but by creating an umbrella term for foods that have little in common in terms of structure, biological function, or nutrition.)

Around the late 70s or early 80s, there were sudden and rapid changes in America’s food supply and food choices and similar sudden and rapid changes in our health. How these two phenomena are related remains a matter of debate. It doesn’t matter if you’re Marion Nestle and you think the problem is calories or if you’re Gary Taubes and you think the problem is carbohydrate—both of those things increased in our food supply. (Whether or not the problem is fat is an open debate; food availability data points to an increase in added fats and oils, the majority of which are, ironically enough, the “healthy” monounsaturated kind; consumption data points to a leveling off of overall fat intake and a decrease in saturated fat—not a discrepancy I can solve here.) What seems to continue to mystify people is why this change occurred so rapidly at this specific point in our food and health history.

Personally responsible or helplessly victimized?

At one time, it was commonly thought that obesity was a matter of personal responsibility and that our collective sense of willpower took a nosedive in the 80s, but nobody could ever explain quite why. (Perhaps a giant funk swept over the nation after The Muppet Show got cancelled, and we all collectively decided to console ourselves with Little Debbie Snack Cakes and Nickelodeon?) But because this approach is essentially industry-friendly (Hey, says Big Food, we just make the stuff!) and because no one has any explanation for why nearly three-quarters of our population decided to become fat lazy gluttons all at once (my Muppet Show theory notwithstanding) or for the increase of obesity among preschool children (clearly not affected by the Muppet Show’s cancellation), public health pundits and media-appointed experts have decided that obesity is no longer a matter of personal responsibility. Instead the problem is our “obesogenic environment,” created by the Big Bad Fast Processed Fatty Salty Sugary Food Industry.

Even though it is usually understood that a balance between supply and demand creates what happens in the marketplace, Michael Pollan has argued that it is the food industry’s creation of cheap, highly-processed, nutritionally-bogus food that has caused the rapid rise in obesity. If you are a fan of Pollanomics, it seems obvious that the food industry—on a whim?—made a bunch of cheap tasty food, laden with fatsugarsalt, hoping that Americans would come along and eat it. And whaddaya know? They did! Sort of like a Field of Dreams only with Taco-flavored Doritos.

As a result, obesity has become a major public health problem.

Just like it was in 1952.

Helen Lee, in her thought-provoking article The Making of the Obesity Epidemic (it is even longer than one of my blog posts, but well worth the time), describes how our obesity problem looked then:

“It is clear that weight control is a major public health problem,” Dr. Lester Breslow, a leading researcher, warned at the annual meeting of the western branch of the American Public Health Association (APHA).
 At the national meeting of the APHA later that year, experts called obesity “America’s No. 1 health problem.”

The year was 1952. There was exactly one McDonald’s in all of America, an entire six-pack of Coca-Cola contained fewer ounces of soda than a single Super Big Gulp today, and less than 10 percent of the population was obese.

In the three decades that followed, the number of McDonald’s restaurants would rise to nearly 8,000 in 32 countries around the world, sales of soda pop and junk food would explode — and yet, against the fears and predictions of public health experts, obesity in the United States hardly budged. The adult obesity rate was 13.4 percent in 1960. In 1980, it was 15 percent. If fast food was making us fatter, it wasn’t by very much.

Then, somewhat inexplicably, obesity took off.”

It is this “somewhat inexplicably” that has me awake at night gnashing my teeth.

And what is Government going to do about it?

I wonder how “inexplicable” it would be to Ms. Lee had she put these two things together:

(In case certain people have trouble with this concept, I’ll type this very slowly and loudly: I’m not implying that the Dietary Guidelines “caused” the rise in obesity; I am merely illustrating a temporal relationship of interest to me, and perhaps to a few billion other folks. I am also not implying that a particular change in diet “caused” the rise in obesity. My focus is on the widespread and encompassing effects that may have resulted from creating one official definition of “healthy food choices to prevent chronic disease” for the entire population.)

Right now we are hearing calls from every corner for the government to create or reform policies that will rein in industry and “slim down the nation.” Because we’d never tried that before, right?

When smoking was seen as a threat to the health of Americans, the government issued a definitive report outlining the science that found a connection between smoking and risk of chronic disease. Although there are still conspiracy theorists that believe that this has all been a Big Plot to foil the poor widdle tobacco companies, in general, the science was fairly straightforward. Cigarette smoking—amount and duration—is relatively easy to measure, and the associations between smoking and both disease and increased mortality were compelling and large enough that it was difficult to attribute them to methodological flaws.

Notice that Americans didn’t wait around for the tobacco industry to get slapped upside the head by the FDA’s David Kessler in the 1990s. Tobacco use plateaued in the 1950s as scientists began to publicize reports linking smoking and cancer. The decline in smoking in America began in earnest with the release of Smoking and Health: Report of the Advisory Committee to the Surgeon General in 1964. A public health campaign followed that shifted social norms away from considering smoking as an acceptable behavior, and smoking saw its biggest declines before litigation and sanctions against Big Tobacco  happened in the 1990s.

Been there, done that, failed miserably.

In a similar fashion, the 1977 Dietary Goals were the culmination of concerns about obesity that had begun decades before, joined by concerns about heart disease voiced by a vocal minority of scientists led by Ancel Keys. Declines in red meat, butter, whole milk and egg consumption had already begun in response to fears about cholesterol and saturated fat that originated with Keys and the American Heart Association—which used fear of fat and the heart attacks it supposedly caused as a fundraising tactic, especially among businessmen and health professionals, whom they portrayed as particularly susceptible to this disease of “successful civilization and high living.” The escalation of these fears—and declines in intake of animal foods portrayed as especially dangerous—picked up momentum when Senator George McGovern and his Select Senate Committee created the 1977 Dietary Goals for Americans. It was thought that, just as we had “tackled” smoking, we could create a document advising Americans on healthy food choices and compliance would follow. But this issue was a lot less straightforward.

To begin with, when smoking was at its peak, only around 40% of the population smoked. On the other hand, we expect that approximately 100% of the population eats.

In addition, the anti-smoking campaigns of the 1960s and 1970s built on a long tradition of public health messages—originating with the Temperance movement—that associated smoking with dirty habits, loose living, and moral decay. It was going to be much harder to fully convince Americans that traditional foods typically associated with robust good health, foods that the US government thought were so nutritionally important that in the recent past they had been “saved” for the troops, were now suspect and to be avoided.

Where the American public had once been told to save “wheat, meat, and fats” for the soldiers, they now had to be convinced to separate the “wheat” from the “meat and fats” and believe that one was okay and the others were not.

To do this, public health leaders and policy makers turned to science, hoping to use it just as it had been used in anti-smoking arguments. Frankly, however, nutrition science just wasn’t up to the task. Linking nutrition to chronic disease was a field of study that would be in its infancy after it grew up a bit; in 1977, it was barely embryonic. There was little definitive data to support the notion that saturated fat from whole animal foods was actually a health risk; even experts who thought that the theory that saturated fat might be linked to heart disease had merit didn’t think there was enough evidence to call for dramatic changes in American’s eating habits.

The scientists who were intent on waving the “fear of fat” flag had to rely on observational studies of populations (considered then and now to be the weakest form of evidence), in order to attempt to prove that heart disease was related to intake of saturated fat (upon closer examination, these studies did not even do that).

Nutrition epidemiology is a soft science, so soft that it is not difficult to shape it into whatever conclusions the Consistent Public Health Message requires. In large-scale observational studies, dietary habits are difficult to measure and the results of Food Frequency Questionnaires are often more a product of wishful thinking than of reality. Furthermore, the size of associations in nutrition epidemiological studies is typically small—an order of magnitude smaller than those found for smoking and risk of chronic disease.

But nutrition epidemiology had proved its utility in convincing the public of the benefits of dietary change in the 70s and since then has become the primary tool—and the biggest funding stream (this is hardly coincidental)—for cementing in place the Consistent Public Health Message to reduce saturated fat and increase grains and cereals.

There is no doubt that the dramatic dietary change that the federal government was recommending was going to require some changes from the food industry, and they appear to have responded to the increased demands for low-fat, whole-grain products with enthusiasm. Public health recommendations and the food fears they engendered are (as my friend James Woodward puts it) “a mechanism for encouraging consumers to make healthy eating decisions, with the ultimate goal of improving health outcomes.” Experts like Kelly Brownell and Marion Nestle decry the tactics used by the food industry of taking food components thought to be “bad” out of products while adding in components thought to be “good,” but it was federal dietary recommendations focusing above all else on avoiding saturated fat, cholesterol, and salt that led the way for such products to be marketed as “healthy” and to become acceptable to a confused, busy, and anxious public. The result was a decrease in demand for red meat, butter, whole milk, and eggs, and an increase in demand for low-saturated fat, low-cholesterol, and “whole” grain products. Minimally-processed animal-based products were replaced by cheaply-made, highly-processed plant-based products, which food manufacturers could market as healthy because, according to our USDA/HHS Dietary Guidelines, they were healthy.

The problem lies in the fact that—although these products contained less of the “unhealthy” stuff Americans were supposed to avoid—they also contained less of our most important nutrients, especially protein and fat-soluble vitamins. We were less likely to feel full and satisfied eating these products, and we were more likely to snack or binge—behaviors that were also fully endorsed by the food industry.

Between food industry marketing and the steady drumbeat of media messages explaining just how deadly red meat and eggs are (courtesy of population studies from Harvard, see above), Americans got the message. About 36% of the population believe that UFOs are real; only 25% believe that there’s no link between saturated fat and heart disease. We are more willing to believe that we’ve been visited by creatures from outer space than we are to believe that foods that humans have been eating ever since they became human have no harmful effects on health. But while industry has certainly taken advantage of our gullibility, they weren’t the ones who started those rumors, and they should not be shouldering all of the blame for the consequences.

Fixing it until it broke

Back in 1977, we were given a cure that didn’t work for diseases that we didn’t have. Then we spent billions in research dollars trying to get the glass slipper to fit the ugly stepsister’s foot. In the meantime, the food industry has done just what we would expect it to do, provide us with the foods that we think we should eat to be healthy and—when we feel deprived (because we are deprived)—with the foods we are hungry for.

We can blame industry, but as long as food manufacturers can take any mixture of vegetable oils and grain/cereals and tweak it with added fiber, vitamins, minerals, a little soy protein or maybe some chicken parts, some artificial sweeteners and salt substitutes, plus whatever other colors/preservatives/stabilizers/flavorizers they can get away with and still be able to get the right profile on the nutrition facts panel (which people do read), consumers–confused, busy, hungry–are going to be duped into believing what they are purchasing is “healthy” because–in fact–the government has deemed it so. And when these consumers are hungry later—which they are very likely to be—and they exercise their rights as consumers rather than their willpower, who should we blame then?

There is no way around it. Our dietary recommendations are at the heart of the problem they were created to try to reverse. Unlike the public health approach to smoking, we “fixed” obesity until it broke for real.

As the Calories Churn (Episode 2): Honey, It’s Not the Sugar

In the previous episode of As the Calories Churn, we looked at why it doesn’t really make sense to compare the carbohydrate intake of Americans in 1909 to the carbohydrate intake of Americans in 1997. [The folks who read my blog, who always seem to be a lot smarter than me, have pointed out that, besides not being able to determine differing levels of waste and major environmental impacts such as a pre- or early-industrial labor force and transportation, there would also be significant differences in: distribution and availability; what was acquired from hunted/home-grown foods; what came through the markets and ended up as animal rather than human feed; what other ingredients these carbohydrates would be packaged and processed with; and many other issues. So in other words, we are not comparing apples and oranges; we are comparing apples and Apple Jacks (TM).]

America in 1909 was very different from America in 1997, but America in 1970 was not so much, certainly with regard to some of the issues above that readers have raised.  By 1970, we had begun to settle into post-industrial America, with TVs in most homes and cars in most driveways.  We had a wide variety of highly-processed foods that were distributed through a massive transportation infrastructure throughout the country.

Beginning in the mid-1960s, availability of calories in the food supply, specifically from carbohydrates and fats, had begun to creep up. So did obesity. It makes sense that this would be cause for concern from public health professionals and policymakers, who saw a looming health crisis ahead if measures weren’t taken–although others contended that our food supply was safer and more nutritious than it had ever been and that public health efforts should be focused on reducing smoking and environmental pollutants.

What emerged from the political and scientific tug-of-war that ensued (a story for another blog post) were the 1977 Dietary Goals for Americans.  These goals told us to eat more grains, cereals and vegetable oils and less fat, especially saturated fat.

Then, around 1977 – 1980, in other words around the time of the creation of the USDA’s recommendations to increase our intake of grains and cereals (both carbohydrate foods) and to decrease our intake of fatty foods, we saw the slope of availability of carbohydrate calories increase dramatically, while the slope of fat calories flattened–at least until the end of the 1990s (another story for another blog post).

[From food availability data, not adjusted for losses.]

The question is:  How did the changes in our food supply relate to the national dietary recommendations we were given in 1977?  Let’s take a closer look at the data that we have to work with on this question.

Dear astute and intelligent readers: From this point on, I am primarily using loss-adjusted food availability data rather than food availability data. Why? Because it is there, and it is a better estimate of actual consumption than unadjusted food availability data. It only goes back to around 1970, so you can’t use it for century-spanning comparisons, but if you are trying to do that, you’ve probably got another agenda besides improving estimation anyway. [If the following information makes you want to go back and make fun of my use of unadjusted food availability data in the previous post, go right ahead. In case you didn't catch it, I think it is problematic to the point of absurdity to compare food availability data from the early 1900s to our current food system—too many changes and too many unknowns (see above).  On the other hand, while there are some differences, I think there are enough similarities in lifestyle and environment (apart from food) between 1970 and 2010 to make a better case for changes in diet and health being related to things apart from those influences.]

Here are the differences in types of food availability data: 

Food availability data: Food availability data measure the use of basic commodities, such as wheat, beef, and shell eggs for food products at the farm level or an early stage of processing. They do not measure the use of highly processed foods in their finished form. Highly processed foods–such as bakery products, frozen dinners, and soups–are not measured directly, but the data include their less processed ingredients, such as sugar, flour, fresh vegetables, and fresh meat.

Loss-Adjusted Food Availability: Because food availability data do not account for all spoilage and waste that accumulates in the marketing system and is discarded in the home, the data typically overstate actual consumption. Food availability is adjusted for food loss, including spoilage, inedible components (such as bones in meat and pits in fruit), plate waste, and use as pet food.
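In code terms, the adjustment is just a chain of multiplications. Here is a minimal sketch of the idea, with made-up loss factors; the function name and the percentages are mine for illustration only, while the USDA applies its own commodity-specific factors at (roughly speaking) the primary, retail, and consumer levels.

```python
# Conceptual sketch of loss adjustment. The loss percentages below are
# hypothetical, for illustration only -- not the USDA's actual factors.

def loss_adjusted(availability_kcal_per_day, primary_loss, retail_loss, consumer_loss):
    """Reduce per-capita availability by successive loss fractions."""
    remaining = availability_kcal_per_day
    for loss in (primary_loss, retail_loss, consumer_loss):
        remaining *= (1 - loss)
    return remaining

# Example: 3900 kcal/day available at the commodity level, with hypothetical
# losses of 5% (primary), 10% (retail), and 20% (consumer and plate waste).
print(round(loss_adjusted(3900, 0.05, 0.10, 0.20)))  # 2668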

The USDA likes to use unadjusted food availability data and call it “consumption” because, well: They CAN and who is going to stop them?

The USDA—and some bloggers too, I think—prefer unadjusted food availability data.  I guess they have decided that if American food manufacturers make it, then Americans MUST be eating it, loss-adjustments be damned. Our gluttony must somehow overcome our laziness, at least temporarily, as we dig the rejects and discards out of the landfills and pet dishes—how else could we get so darn fat?

I do understand the reluctance to use dietary intake data collected by NHANES, as all dietary intake data can be unreliable and problematic (and not just the kind collected from fat people). But I guess maybe if you’ve decided that Americans are being “highly inaccurate” about what they eat, then you figure it is okay to be “highly inaccurate” right back at Americans about what you’ve decided to tell them about what they eat. Because using food availability data and calling it “consumption” is, to put it mildly, highly inaccurate, by a current difference of over 1000 calories.

On the other hand, it does sound waaaaaay more dramatic to say that Americans consumed 152 POUNDS (if only I could capitalize numbers!) per person of added sweeteners in 2000 (as it does here), than it does to say that we consumed 88 pounds per person that year (which is the loss-adjusted amount). Especially if you are intent on blaming the obesity crisis on sugar.

Which is kinda hard to do looking at the chart below.
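Before the chart, here is what those two numbers look like on a per-day footing. This is a back-of-the-envelope sketch; the 4 kcal per gram figure is my own simplifying assumption, and real caloric sweeteners vary.

```python
# Convert pounds of added caloric sweeteners per person per year into
# approximate calories per person per day. The 4 kcal/g figure is a rough
# average assumed here for illustration, not a USDA number.
GRAMS_PER_POUND = 453.592
KCAL_PER_GRAM = 4  # assumption

def lbs_per_year_to_kcal_per_day(lbs_per_year):
    grams_per_day = lbs_per_year * GRAMS_PER_POUND / 365
    return grams_per_day * KCAL_PER_GRAM

print(round(lbs_per_year_to_kcal_per_day(152)))  # ~756 kcal/day (unadjusted availability)
print(round(lbs_per_year_to_kcal_per_day(88)))   # ~437 kcal/day (loss-adjusted)
```

Same sweeteners, same year, and a difference of over 300 calories a day depending on which data series you wave around.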

Loss-adjusted food availability:

Calories per day               1970    2010    Change
Total                          2076    2534    +458
Added fats and oils             338     562    +224
Flour and cereal products       429     596    +167
Poultry                          75     158     +83
Added sugars and sweeteners     333     367     +34
Fruit                            65      82     +17
Fish                             12      14      +2
Butter                           29      26      -3
Veggies                         131     126      -5
Eggs                             43      34      -9
Dairy                           245     232     -13
Red meat*                       349     267     -82
Plain whole milk                112      24     -88

*Red meat: beef, veal, pork, lamb
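If you want to see where that 458-calorie increase came from, here is a quick sketch that just crunches the deltas transcribed from the table above. (The increasing categories sum to more than 458 because the declining categories offset part of the gains.)

```python
# Where did the +458 kcal/day (1970 -> 2010, loss-adjusted) come from?
# Deltas transcribed from the table above, calories per person per day.
increases = {
    "Added fats and oils": 224,
    "Flour and cereal products": 167,
    "Poultry": 83,
    "Added sugars and sweeteners": 34,
    "Fruit": 17,
    "Fish": 2,
}
net_increase = 458  # total change from the table

for category, delta in increases.items():
    share = delta / net_increase
    print(f"{category}: +{delta} kcal/day ({share:.0%} of the net increase)")

# Note: these shares total more than 100% because declines in red meat,
# whole milk, eggs, dairy, butter, and veggies offset part of the gains.

# Grains and cereals vs. added sweeteners:
print(increases["Flour and cereal products"] / increases["Added sugars and sweeteners"])  # ~4.9
```

Which is why, a few paragraphs down, I say that grains and cereals added nearly 5 times more calories than sweeteners did.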

Anybody who thinks we did not change our diet dramatically between 1970 and the present either can’t read a dataset or is living in a special room with very soft bouncy walls. Why we changed our diet is still a matter of debate. Now, it is my working theory that the changes that you see above were precipitated, at least in part, by the advice given in the 1977 Dietary Goals for Americans, which was later institutionalized, despite all kinds of science and arguments to the contrary, as the first Dietary Guidelines for Americans in 1980.

Let’s see if my theory makes sense in light of the loss-adjusted food availability data above (which I will loosely refer to as “consumption”).  The 1977 [2nd Edition] Dietary Goals for Americans say this:

#1 – Did we increase our consumption of grains? Yes. Whole? Maybe not so much, but our consumption of fiber went from 19 g per day in 1970 to 25 g per day in 2006, which is not much less than the 29 grams of fiber per day that we were consuming back in 1909 (this is from food availability data, not adjusted for loss, because it’s the only data that goes back to 1909).

The fruits and veggies question is a little more complicated. Availability data (adjusted for losses) suggests that veggie consumption went up about 12 pounds per person per year (sounds good, but that’s a little more than a whopping half an ounce a day), but that calories from veggies went down. Howzat? Apparently Americans were choosing less caloric veggies, and since reducing calories was part of the basic idea for insisting that we eat more of them, hooray on us. Our fruit intake went up by about an ounce a day; calories from fruit reflect that. So, while we didn’t increase our vegetable and fruit intake much, we did increase it. And just FYI, that minuscule improvement in veggie consumption didn’t come from potatoes. Combining fresh and frozen potato availability (adjusted for losses), our potato consumption declined ever so slightly.

#2 – Did we decrease our consumption of refined sweeteners? No. But we did not increase our consumption as much as some folks would like you to think. Teaspoons of added (caloric) sweeteners per person in our food supply (adjusted for waste) went from 21 in 1970 to 23 in 2010.  It is very possible that some people were consuming more sweeteners than other people since those numbers are population averages, but the math doesn’t work out so well if we are trying to blame added sweeteners for 2/3 of the population gaining weight.  It doesn’t matter how much you squint at the data to make it go all fuzzy, the numbers pretty much say that the amount of sweeteners in our food supply has not dramatically increased.
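As a sanity check on those teaspoon figures against the calorie table above, here is a rough sketch; the ~16 kcal per teaspoon of sugar is my own approximation, not a USDA conversion factor.

```python
# Teaspoons of added caloric sweeteners per person per day, converted to
# approximate calories. Assumes ~16 kcal per teaspoon (about 4 g of sugar
# at ~4 kcal/g) -- an approximation for illustration only.
KCAL_PER_TEASPOON = 16  # assumption

for year, teaspoons in {1970: 21, 2010: 23}.items():
    print(year, teaspoons * KCAL_PER_TEASPOON, "kcal/day")

# 1970: 336 kcal/day; 2010: 368 kcal/day -- in the same ballpark as the 333
# and 367 kcal/day in the loss-adjusted table, and a rise of only ~32 kcal/day.
```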

#3 – Did we decrease our consumption of total fat? Maybe, maybe not—depends on who you want to believe. According to dietary intake data (from our national food monitoring data, NHANES), in aggregate, we increased calories overall, specifically from carbohydrate food, and decreased calories from fat and protein. That’s not what our food supply data indicate above, but there you go.

Change in amount and type of calories consumed from 1971 to 2008, according to dietary intake data

There is general agreement, however, from both food availability data and from intake data, that we decreased our consumption of the saturated fats that naturally occur with red meat, eggs, butter, and full-fat milk (see below), and we increased our consumption of “added fats and oils,” a category that consists almost exclusively of vegetable oils, which are predominantly polyunsaturated and which were added to foods–hence the category title–such as those inexpensive staples, grains and cereals, during processing.

#4 – Did we decrease our consumption of animal fat, and choose “meat, poultry, and fish which will reduce saturated fat intake”? Why yes, yes we did. Calories from red meat—the bearer of the dreaded saturated fat and all the curses that accompany it—declined in our food system, while poultry calories went up.

(So, I have just one itty-bitty request: Can we stop blaming the rise in obesity rates on burgers? Chicken nuggets, yes. KFC, yes. The buns the burgers come on, maybe. The fries, quite possibly. But not the burgers, because burgers are “red meat” and there was less red meat—specifically less beef—in our food supply to eat.)

Michael Pollan–ever the investigative journalist–insists that after 1977, “Meat consumption actually climbed” and that “We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.”   In the face of such a concrete and well-proven assumption, why bother even  looking at food supply data, which indicate that our protein from meat, poultry, fish, and eggs  “climbed” by just half an ounce?

In fact, there’s a fairly convenient balance between the calories from red meat that left the supply chain and the calories of chicken that replaced them. It seems we tried to get our animal protein from the sources that the Dietary Goals said were “healthier” for us.

#5 – Did we reduce our consumption of full-fat milk? Yes. And for those folks who contend this means we just started eating more cheese, well, it seems that’s pretty much what we did. However, overall decreases in milk consumption meant that overall calories from dairy fat went down.

#6 – Did we reduce our consumption of foods high in cholesterol? Yes, we did that too. Egg consumption had been declining since the relative affluence of post-war America made meat more affordable and as cholesterol fears began percolating through the scientific and medical community, but it continued to decline after the 1977 Goals.

#7 – Salt? No, we really haven’t changed our salt consumption much and perhaps that’s a good thing. But the connections between salt, calorie intake, and obesity are speculative at best and I’m not going to get into them here (although I do kinda get into them over here).

[Food supply and Dietary Goals]

What I see when I look at the data is a good faith effort on the part of the American people to try to consume more of the foods they were told were “healthy,” such as grains and cereals, lean meat, and vegetable oils. We also tried to avoid the foods that we were told contained saturated fat—red meat, eggs, butter, full-fat milk—as these foods had been designated as particularly “unhealthy.” No, we didn’t reduce our sweetener consumption, but grains and cereals have added nearly 5 times more calories than sweeteners have to our food supply/intake.

Although the America of 1970 is more like the America of today than the America of 1909, some things have changed. Probably the most dramatic change between the America of the 1970s and the America of today is our food-health system. Women in the workplace, more suburban sprawl, changing demographics, increases in TV and other screen time—those were all changes that had been in the works for a long time before the 1977 Dietary Goals came along. But the idea that meat and eggs were “bad” for you? That was revolutionary.

And the rapid rises in obesity and chronic diseases that accompanied these changes? Those were pretty revolutionary as well.

Among my favorite things to luck upon on a Saturday morning in the 70s—aside from the Bugs Bunny-does-Wagner cartoon, “What’s Opera, Doc?”—were the public service announcements featuring Timer, an amorphous yellow blob with some sing-along information about nutrition:

You are what you eat

From your head down to your feet

Things like meat and eggs and fish you

Need to build up muscle tissue

Hello appetite control?

More protein!

Meat and eggs weren’t bad for you. They didn’t cause heart disease. You needed them to build up muscle tissue and to keep you from being hungry!

But in 1984, when this showed up on the cover of Time magazine (no relation to Timer the amorphous blob), I—along with a lot of other Americans—was forced to reconsider what I’d learned on those Saturday mornings not that long ago:

My all-time favorite Timer PSA was this one:

When my get up and go has got up and went,

I hanker for a hunk of cheese.

When I’m dancing a hoedown

And my boots kinda slow down,

Or any time I’m weak in the knees . . .

I hanker for a hunk of

A slab or slice or chunk of–

A snack that is a winner

And yet won’t spoil my dinner–

I hanker for hunk of CHEESE!

In the 80s, when I took up my low-fat, vegetarian ways, I would still hanker for a hunk of cheese, but now I would look for low-fat, skim, or fat-free versions—or feel guilty about indulging in the full-fat versions that I still loved.

I’m no apologist for the food industry; such a dramatic change in our notions about “healthy food” clearly required some help from them, and they appear to have provided it in abundance. And I’m not a fan of sugar-sweetened beverages or added sweeteners in general, but dumping the blame for our current health crisis primarily on caloric sweeteners is not only not supported by the data at hand, it frames the conversation in a way that works to the advantage of the food industry and gives our public health officials a “get out of jail free card” for providing 35 years’ worth of lousy dietary guidance.

Next time on As the Calories Churn, we’ll explore some of the interaction between consumers, industry, and public health nutrition recommendations. Stay tuned for the next episode, when you’ll get to hear Adele say: “Pollanomics: An approach to food economics that is sort of like the Field of Dreams—only with taco-flavored Doritos.”

As the Calories Churn (Episode 1): Nooooo, not the carbs!!!

Oh the drama!  Some of the current hyperventilating in the alternative nutrition community–sugar is toxic, insulin is evil, vegetable oils give you cancer, and running will kill you–has, much to my dismay, made the alternative nutrition community sound as shrill and crazed as the mainstream nutrition one.

When you have self-appointed nutrition experts (er, food writers) like Mark Bittman agreeing feverishly with a pediatric endocrinologist with years of clinical experience like Robert Lustig, we’ve crossed over into some weird nutrition Twilight Zone where fact, fantasy, and hype all swirl together in one giant twitter feed of incoherence meant, I think, to send us into a dark corner where we can do nothing but nibble on organic kale, mumble incoherently about inflammation and phytates, and await the zombie apocalypse.

No, carbohydrates are not evil—that’s right, not even sugar. If sugar were rat poison, one trip to the county fair in 4th grade would have killed me with a cotton candy overdose. Neither is insulin, now characterized as the serial killer of hormones (try explaining that to a person with type 1 diabetes).

But that doesn’t mean that 35 years of dietary advice to increase our grain and cereal consumption, while decreasing our fat and saturated fat consumption, has been a good idea.

I have gotten rather tired of seeing this graph used as a central rationale for arguing that the changes in total carbohydrate intake over the past 30 years have not contributed to the rising rates of obesity.


The argument takes shape on two fronts:

1) We ate 500 grams of carbohydrate per day in 1909 and 500 grams in 1997 and WE WEREN’T FAT IN 1909!

2) The other part of the argument is that the TYPE of carbohydrate has shifted over time. In 1909, we ate healthy, fiber-filled unrefined and unprocessed types of carbohydrates. Not like now.

Okay, let’s take a closer look at that paper, shall we? And then let’s look at what really matters: the context.

The data used to make this graph are not consumption data, but food availability data. This is problematic in that it tells us how much of a nutrient was available in the food supply in any given year, but does not account for food waste, spoilage, and other losses. And in America, we currently waste a lot of food. 

According to the USDA, we currently lose over 1000 calories in our food supply–calories that don’t make it into our mouths. Did we waste the same percentage of our food supply across the entire century? Truth is, we don’t know and we are not likely to find out—but I seriously doubt it. My mother and both my grandmothers—with memories of war and rationing fresh in their minds—would be no more likely to throw out anything remotely edible than they would be to do the Macarena. My mother has been known to put random bits of leftover food in soups, sloppy joes, and—famously—pancake batter. To this day, should your hand begin to move toward the compost bucket with a tablespoon of mashed potatoes scraped from the plate of a grandchild shedding cold virus like it was last week’s fashion, she will throw herself in front of the bucket and shriek, “NOOOOOO! Don’t throw that OUT! I’ll have that for lunch tomorrow.”

You know what this means folks: in 1909, we were likely eating MORE carbohydrate than we are today. (Or maybe in 1909, all those steelworkers pulling 12 hour days 7 days a week, just tossed out their sandwich crusts rather than eat them. It could happen.)

BUT–as with butts all over America including mine, it’s a really Big BUT: How do I explain the fact that Americans were eating GIANT STEAMING HEAPS OF CARBOHYDRATES back in 1909—and yet, and yet—they were NOT FAT!!??!!

Okay. Y’know. I’m up for this one. Not only is it problematic to the point of absurdity to compare food availability data from the early 1900s to our current food system; life in general was a little different back then. At the turn of the century,

  • average life expectancy was around 50
  • the nation had 8,000 cars
  • and about 10 miles of paved roads.

In 1909, neither assembly lines nor the Titanic had happened yet.

The labor force looked a little different too:

[Labor force, 1900 - 2000]

Primary occupations made up the largest percentage of male workers (42%)—farmers, fishermen, miners, etc.—what we would now call manual laborers. Another 21% held “blue collar” jobs: craftsmen, machine operators, and laborers whose activities in those early days of the Industrial Revolution, before many things became mechanized, must have required a considerable amount of energy. And not only was the work hard, there was a lot of it. At the turn of the century, the average workweek was 59 hours, or close to six 10-hour days. And it wasn’t just men working. As our country shifted from a rural agrarian economy to a more urban industrialized one, women and children worked both on the farms and in the factories.

This is what is called “context.”

In the past, nutrition epidemiologists have always considered caloric intake to be a surrogate marker for activity level. To quote Walter Willett himself:

“Indeed, in most instances total energy intake can be interpreted as a crude measure of physical activity . . . ” (in: Willett, Walter. Nutritional Epidemiology. Oxford University Press, 1998, p. 276).

It makes perfect sense that Americans would have a lot of carbohydrate and calories in their food supply in 1909. Carbohydrates have been—and still are—a cheap source of energy to fuel the working masses. But it makes little sense to compare the carbohydrate intake of the labor force of 1909 to the labor force of 1997, as in the graph at the beginning of this post (remember the beginning of this post?).

After decades of decline, carbohydrate availability experienced a little upturn from the mid 1960s to the late 1970s, when it began to climb rapidly. But generally speaking, carbohydrate intake was lower during that time than at any point previously.

I’m not crazy about food availability data, but to be consistent with the graph at the top of the page, here it is.

Data based on per capita quantities of food available for consumption:

                                        1909    1975    Change
Total calories                          3500    3100    -400
Carbohydrate calories                   2008    1592    -416
Protein calories                         404     372     -32
Total fat calories                      1098    1260    +162
Saturated fat (grams)                     52      47      -5
Mono- and polyunsaturated fat (grams)    540     738    +198
Fiber (grams)                             29      20      -9
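To make the "context" point concrete, here is a short sketch that turns the table above into macronutrient shares of total calories; the inputs are transcribed from the table, and the percentages are just arithmetic.

```python
# Macronutrient shares of total calories, computed from the table above
# (calories per person per day, unadjusted food availability data).
years = {
    1909: {"total": 3500, "carbohydrate": 2008, "protein": 404, "fat": 1098},
    1975: {"total": 3100, "carbohydrate": 1592, "protein": 372, "fat": 1260},
}

for year, cals in years.items():
    shares = {name: value / cals["total"]
              for name, value in cals.items() if name != "total"}
    print(year, {name: f"{share:.0%}" for name, share in shares.items()})

# 1909: carbohydrate ~57%, protein ~12%, fat ~31%
# 1975: carbohydrate ~51%, protein ~12%, fat ~41%
```

Fewer total calories, a smaller share of them from carbohydrate, and a larger share from fat: exactly what the next paragraph describes.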

To me, it looks pretty much like it should with regard to context.  As our country went from pre- and early industrialized conditions to a fully-industrialized country of suburbs and station wagons, we were less active in 1970 than we were in 1909, so we consumed fewer calories. The calories we gave up were ones from the cheap sources of energy—carbohydrates—that would have been most readily available in the economy of a still-developing nation. Instead, we ate more fat.

We can’t separate out “added fats” from “naturally-present fats” from this data, but if we use saturated fat vs. mono- and polyunsaturated fats as proxies for animal fats vs. vegetable oils (yes, I know that animal fats have lots of mono- and polyunsaturated fats, but alas, such are the limitations of the dataset), then it looks like Americans were making use of the soybean oil that was beginning to be manufactured in abundance during the 1950s and 1960s and was making its way into our food supply.  (During this time, heart disease mortality was decreasing, an effect likely due more to warnings about the hazards of smoking, which began in earnest in 1964, than to dietary changes; although availability of unsaturated fats went up, that of saturated fats did not really go down.)

As for all those “healthy” carbohydrates that we were eating before we started getting fat? Using fiber as a proxy for level of “refinement” (as in the graph at the beginning of this post—remember the beginning of this post?), we seemed to be eating more refined carbohydrates in 1975 than in 1909—and yet the obesity crisis was still just a gleam in Walter Willett’s eyes.

While our lives in 1909 differed greatly from our current environment, our lives in the 1970s were not all that much different than they are now. I remember. As much as it pains me to confess this, I was there. I wore bell bottoms. I had a bike with a banana seat (used primarily for trips to the candy store to buy Pixie Straws). I did macramé. My parents had desk jobs, as did most adults I knew. No adult I knew “exercised” until we got new neighbors next door. I remember the first time our new next-door neighbor jogged around the block. My brothers and sister and I plastered our faces to the picture window in the living room to scream with excitement every time she ran by; it was no less bizarre than watching a bear ride a unicycle.

In 1970, more men had white-collar than blue-collar jobs; jobs that primarily consisted of manual labor had reached their nadir. Children were largely excluded from the labor force, and women, like men, had moved from farm and factory jobs to more white (or pink) collar work. The data on this is not great (in the 1970s, we hadn’t gotten that excited about exercise yet) but our best approximation is that about 35% of adults–one of whom was my neighbor–exercised regularly, with “regularly” defined as “20 minutes at least 3 days a week” of moderately intense exercise.  (Compare this definition, a total of 60 minutes a week, to the current recommendation, more than double that amount, of 150 minutes a week.)

Not too long ago, the 2000 Dietary Guidelines Advisory Committee (DGAC) recognized that environmental context—such as the difference between America in 1909 and America in 1970—might lead to or warrant dietary differences:

“There has been a long-standing belief among experts in nutrition that low-fat diets are most conducive to overall health. This belief is based on epidemiological evidence that countries in which very low fat diets are consumed have a relatively low prevalence of coronary heart disease, obesity, and some forms of cancer. For example, low rates of coronary heart disease have been observed in parts of the Far East where intakes of fat traditionally have been very low. However, populations in these countries tend to be rural, consume a limited variety of food, and have a high energy expenditure from manual labor. Therefore, the specific contribution of low-fat diets to low rates of chronic disease remains uncertain. Particularly germane is the question of whether a low-fat diet would benefit the American population, which is largely urban and sedentary and has a wide choice of foods.” [emphasis mine – although whether our population in 2000 was largely "sedentary" is arguable]

The 2000 DGAC goes on to say:

“The metabolic changes that accompany a marked reduction in fat intake could predispose to coronary heart disease and type 2 diabetes mellitus. For example, reducing the percentage of dietary fat to 20 percent of calories can induce a serum lipoprotein pattern called atherogenic dyslipidemia, which is characterized by elevated triglycerides, small-dense LDL, and low high-density lipoproteins (HDL). This lipoprotein pattern apparently predisposes to coronary heart disease. This blood lipid response to a high-carbohydrate diet was observed earlier and has been confirmed repeatedly. Consumption of high-carbohydrate diets also can produce an enhanced post-prandial response in glucose and insulin concentrations. In persons with insulin resistance, this response could predispose to type 2 diabetes mellitus.

The committee further held the concern that the previous priority given to a “low-fat intake” may lead people to believe that, as long as fat intake is low, the diet will be entirely healthful. This belief could engender an overconsumption of total calories in the form of carbohydrate, resulting in the adverse metabolic consequences of high carbohydrate diets. Further, the possibility that overconsumption of carbohydrate may contribute to obesity cannot be ignored. The committee noted reports that an increasing prevalence of obesity in the United States has corresponded roughly with an absolute increase in carbohydrate consumption.” [emphasis mine]

Hmmmm. Okay, folks, that was in 2000—THIRTEEN years ago. If the DGAC was concerned about increases in carbohydrate intake—absolute carbohydrate intake, not just sugars, but sugars and starches—13 years ago, how come nothing has changed in our federal nutrition policy since then?

I’m not going to blame you if your eyes glaze over during this next part, as I get down and geeky on you with some Dietary Guidelines backstory:

As with all versions of the Dietary Guidelines after 1980, the 2000 edition was based on a report submitted by the DGAC which indicated what changes should be made from the previous version of the Guidelines. And, as with all previous versions after 1980, the changes in the 2000 Dietary Guidelines were taken almost word-for-word from the suggestions given by the scientists on the DGAC, with few changes made by USDA or HHS staff. Although HHS and USDA took turns administrating the creation of the Guidelines, in 2000, no staff members from either agency were indicated as contributing to the writing of the final Guidelines.

But after those comments in 2000 about carbohydrates, things changed.

Beginning with the 2005 Dietary Guidelines, HHS and USDA staff members are in charge of writing the Guidelines, which are no longer considered to be a scientific document whose audience is the American public, but a policy document whose audience is nutrition educators, health professionals, and policymakers. Why and under whose direction this change took place is unknown.

The Dietary Guidelines process doesn’t have a lot of law holding it up. Most of what happens in regard to the Guidelines is a matter of bureaucracy, decision-making that takes place within USDA and HHS that is not handled by elected representatives but by government employees.

However, there is one mandate of importance: the National Nutrition Monitoring and Related Research Act of 1990, Public Law 445, 101st Cong., 2nd sess. (October 22, 1990), section 301. (P.L. 101-445) requires that “The information and guidelines contained in each report required under paragraph shall be based on the preponderance of the scientific and medical knowledge which is current at the time the report is prepared.”

The 2000 Dietary Guidelines were (at least theoretically) scientifically accurate because scientists were writing them. But beginning in 2005, the Dietary Guidelines document recognizes the contributions of an “Independent Scientific Review Panel who peer reviewed the recommendations of the document to ensure they were based on a preponderance of scientific evidence.” [To read the whole sordid story of the "Independent Scientific Review Panel," which appears to neither be "independent" nor to "peer-review" the Guidelines, check out Healthy Nation Coalition's Freedom of Information Act results.]  Long story short:  we don’t know who–if anyone–is making sure the Guidelines are based on a complete and current review of the science.

Did HHS and USDA not like the direction that it looked like the Guidelines were going to take–with all that crazy talk about too many carbohydrates – and therefore made sure the scientists on the DGAC were farther removed from the process of creating them?

Hmmmmm again.

Dr. Janet King, chairwoman of the 2005 DGAC had this to say, after her tenure creating the Guidelines was over: “Evidence has begun to accumulate suggesting that a lower intake of carbohydrate may be better for cardiovascular health.”

Dr. Joanne Slavin, a member of the 2010 DGAC had this to say, after her tenure creating the Guidelines was over: “I believe fat needs to go higher and carbs need to go down,” and “It is overall carbohydrate, not just sugar. Just to take sugar out is not going to have any impact on public health.”

It looks like, at least in 2005 and 2010, some well-respected scientists (respected well enough to make it onto the DGAC) thought that—in the context of our current environment—maybe our continuing advice to Americans to eat more carbohydrate and less fat wasn’t such a good idea.

I think it is at about this point that I begin to hear the wailing and gnashing of teeth of those who don’t think Americans ever followed this advice to begin with, because—goodness knows—if we had, we wouldn’t be so darn FAT!

So did Americans follow the advice handed out in those early dietary recommendations? Or did Solid Fats and Added Sugars (SoFAS—as the USDA/HHS like to call them—as in “get up offa yur SoFAS and work your fatty acids off”) make us the giant tubs of lard that we are, just as the USDA/HHS says they did?

Stay tuned for the next episode of As the Calories Churn, when I attempt to settle those questions once and for all.  And you’ll hear a big yellow blob with stick legs named Timer say, “I hanker for a hunk of–a slab or slice or chunk of–I hanker for a hunk of cheese!”

Processed Meats Declared Too Dangerous For Human Consumption

Processed meats have been declared too dangerous for human consumption by pseudo-experts who are unable to differentiate between observational studies and clinical trials, thus posing tremendous risks to the collective IQ of the interwebz reading public [1].

The World Cancer Research Fund recently completed a detailed review of 7,000 studies covering links between diet and cancer. A grand total of 11 of these were actual clinical trials that tested two different dietary approaches or supplementation on cancer outcomes. Two of these 11 trials tested a dietary intervention, both using a low-fat diet versus a usual diet control. Researchers found that, “The low fat dietary pattern intervention did not reduce the risk of invasive colorectal cancer in any of its subsites” [2]. In other words, avoiding fat in foods like bacon, sausage, pork chops, and pepperoni will not reduce your risk of colon cancer; however, it may reduce your enjoyment of life considerably, and that, in itself, is a pain in the butt.

Upon conclusion, it is evident that reading research summaries written by people who don’t know the difference between an observational study and a clinical trial is dangerous for human intellect and the acquisition of accurate information. Consumers should stop reading processed articles full of information pollution and should instead watch re-runs of Gilligan’s Island. 

What are processed meats?
Processed meats include bacon, sausage, hot dogs, sandwich meat, packaged ham, pepperoni, salami and nearly all meat found in prepared frozen meals. Processed meats are usually manufactured with an ingredient known as sodium nitrate, which is often linked to cancer by pseudo-experts who don’t know how to look up stuff in PubMed. Sodium nitrate is primarily used as a colour fixer by meat companies to make the packaged meats look bright red and fresh. Monosodium glutamate is also added on a regular basis to enhance the savoury flavour. An extra letter “u” added to words can also enhance colour and savoury flavour.

Sodium Nitrate has been strongly linked to the formation of cancer-causing nitrasamines [sic] in the human body, leading to a sharp increase in the risk of cancer for those consuming them. This is especially frightening, since as far as actual science goes, there is no such thing as a nitrasamine. Scientists are very concerned, however, about nitrosamines, which do, in fact, actually exist. Their concern reflects a growing body of evidence that people writing about nutrition on the internet actually have no idea about which they are ostensibly talking:

“There has been widespread discussion about health risks related to the amount of nitrate in our diet. When dietary nitrate enters saliva it is rapidly reduced to nitrite in the mouth by mechanisms discussed above. Saliva containing large amounts of nitrite is acidified in the normal stomach to enhance generation of N-nitrosamines, which are powerful carcinogens in the experimental setting. More recently, it has been suggested that nitric oxide in the stomach could also be carcinogenic. A great number of studies have been performed examining the relationship between nitrate intake and gastric cancer in humans and animals. In general it has been found that there is either no relationship or an inverse relationship, such that a high nitrate intake is associated with a lower rate of cancer. Recently, studies have been performed suggesting that not only is nitrate harmless but in fact it may even be beneficial. Indeed, acidified nitrite may be an important part of gastric host defense against swallowed pathogens. The results presented here further support the interpretation that dietary nitrate is gastroprotective. They also suggest that the oral microflora, instead of being potentially harmful, is living in a true symbiotic relationship with its host. The host provides nitrate, which is an important nutrient for many anaerobic bacteria. In return, the bacteria help the host by generating the substrate (nitrite) necessary for generation of nitric oxide in the stomach” [3].

A 2005 Hawaii University study found that reading articles about processed meats written by ninnies who can’t spell “nitrosamine” increased the risk of a 5-point IQ reduction by 67%, whilst another study found that it increased the risk of twerking by 50%. These are scary numbers for those consuming articles about processed meats on a regular basis.

Monosodium glutamate (MSG) is a second dangerous-sounding chemical found in virtually all processed meat products. MSG is thought by people who are unable to navigate PubMed to be a dangerous excitotoxin linked to neurological disorders such as migraine headaches, Alzheimer’s disease, loss of appetite control, obesity and unrestrained blogging. Nutrition bloggers use MSG to add a deceptively scientifical-sounding level of paranoia to their articles about the addictive savory flavor of dead-tasting processed meat products. This will deflect unwary readers’ attention away from inane and poorly-worded concepts such as “addictive savory flavor of dead-tasting processed meat products.” On the other hand, the Joint FAO/WHO Expert Committee on Food Additives, the Scientific Committee for Food of the European Commission, the Federation of American Societies for Experimental Biology, and the Food and Drug Administration all concluded that, although there may be a subpopulation of people sensitive to its effects, no health risks have been found to be associated with MSG [4]. But what do they know?

Food items to check carefully for aliveness before piling them into your cart:

  • Beef jerky
  • Bacon
  • Sausage
  • Pepperoni
  • Hot dogs
  • Sandwich meat
  • Deli slices
  • Ham

…and many more meat products

If it’s so dangerous to consume such stupidity, why are they allowed to write it?

Unfortunately nowadays, access to operational brain cells is not a prerequisite for access to a keyboard and a WordPress account. That and First Amendment concerns have allowed unsuspecting readers curious about the real health effects of some food components to be misled, confused, and frightened by the insidious repetition of poorly-researched half-truths written by bloggers with a frail grasp on reality and an affinity for really big words that they don’t quite know the meaning of, like nitrso , um, nitarsa, um, nirstirammidngieaygyieg.

Unfortunately, these bloggers seem to hold tremendous influence over the blogosphere, and as a result consumers have little protection from dangerous propaganda intentionally added to the internet, even in places that aren’t Reddit.

To avoid the dangers of idiot bloggers writing about processed meats:

  • Always read primary sources for yourself. If there are no primary sources, leave a pleasantly snarky comment to that effect on the blog site and never go there again.
  • Don’t read any articles about sodium nitrate or MSG from bloggers who don’t know how to spell “nitrosamine.”
  • Avoid eating red meats served by restaurants, schools, hospitals, hotels or other institutions without asking for it to be served thick and juicy, just the way you like it. This will give you the courage and moral fortitude to look up stuff yourself on PubMed, without having to rely on bloggers who don’t know how to spell “nitrosamine.”
  • If you are fixated on fresh something, be fixated on Fresh Prince.
  • Avoid processed blog material as much as possible
  • Spread the word and tell others about the dangers of reading idiot blogs about the dangers of sodium nitrate and MSG

Vitamin C naturally found in lime juice that has been gently squeezed into a tumbler of tequila has been shown to help prevent the formation of permanent facepalms after accidentally ingesting an idiot nutrition blog and can help protect you from the devastating IQ-lowering effects of blobbers who cant spll. The best defense, of course, is to avoid the interwebz altogether and go dancewalking.


Sources:

  1. http://hollyleehealth.com/2013/04/02/processed-meats-declared-too-dangerous-for-human-consumption/
  2. http://www.wcrf.org/PDFs/Colorectal-cancer-CUP-report-2010.pdf
  3. http://www.jci.org/articles/view/19019