Nutrition and Society: How Scientific Illiteracy is Affecting a Nation’s Choices


Information is far more accessible today than it was even ten or twenty years ago. Anyone who can read has a world of material at their fingertips, from medical journals to popular science pieces to celebrity news, so you’d think our society would be well armed with the facts needed to make healthy choices. Unfortunately, for many people science seems completely inaccessible; even though they could technically find anything they’re looking for, many lack the ability to understand the material or to sort out what is true and what is not.


The US Office of Disease Prevention and Health Promotion defines health literacy as “the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions.”

According to the ODPHP, only around 12% of Americans are considered proficient in health literacy. That means the other 88% of the nation can be given all of the information they need to make healthy choices and avoid many preventable illnesses, yet they may not fully understand what that information means for them or how to apply it. This creates an enormous barrier for healthcare professionals, particularly those of us working in public health.


US Adults’ Health Literacy Level: 2003



Pseudoscience is a term that describes beliefs or practices mistakenly believed to be based on the scientific method. Unfortunately, the internet is inundated with information that is widely treated as factual evidence supporting people’s beliefs when, in reality, it is nothing more than pseudoscience. This has posed a problem for the scientific community and has contributed to enormous public health problems such as the anti-vaccination movement.


But is it gluten-free?

The field of nutrition has not escaped the waves of popular pseudoscience; many people have read testimonials about adding MCT oil to their coffee to lose weight and “stay energized, without the midday crash!” and become immediately convinced that they need to drink their coffee with a stick of butter. The gluten-free diet is another of these phenomena, one that has converted huge populations of people to the gluten-free lifestyle. But, as Jimmy Kimmel found out, some people who follow a gluten-free diet don’t even know what gluten is, let alone whether or not they should actually be eating it.


To further confuse the issue, the difference between a nutritionist and a dietitian is not well understood among the general population. In the U.S., the title “nutritionist” is not federally regulated; it’s up to each state to decide whether to regulate it. “Registered dietitian,” on the other hand, is a protected credential: earning the RD title requires a B.S. from an accredited institution, completion of an accredited internship, and passing the RD exam. So in some states, anyone can legally call themselves a nutritionist. This further complicates the plethora of information people have to sort through to get actual, science-based health information.


“I’m totally a nutritionist..”


Unfortunately for uninformed consumers, misinformation can be a big selling point. People can become convinced that they need to buy all kinds of things to live the healthiest possible life. All-natural foods! Dietary supplements! DHA added! Many industries take advantage of people who don’t have a scientific or healthcare background. If you’re at all unsure of what you should or should not be adding to or removing from your diet, try to arm yourself with the facts (actual facts, not alternative ones).

Huge corporations have a history of hijacking popular beliefs and mob mentality: some of you may be old enough to remember when it was still legal to smoke in restaurants, public buildings, even planes. Tobacco corporations marketed aggressively to the general public, who were led to believe that smoking was harmless. Thanks to the valiant efforts of our nation’s public health heroes, smoking among adults fell from 42.4% in 1965 to 24.7% in 1997. In my first blog post, I discussed the incredible difference public health can make when teamed with other entities for a common goal, and this considerable decrease in tobacco use during the late 20th century is no exception.

Today, many giants in the food and nutritional supplement industry are hopping on that train once again: prebiotics can be found in granola bars and juice, and the words “all natural” are plastered on everything from shampoo to laundry detergent, all in the hope that people who think it sounds good will buy it!


Until we make some serious changes in public health and our educational system, scientific and health illiteracy will continue to plague our society. Many determined public healthcare professionals are crusading to help people become advocates for their own health, and although this is an uphill battle, it can be done.

The simplest advice I can give to those who are uncertain about their healthcare or nutritional choices is this:

  1. Find a doctor for general health issues, and a registered dietitian that you trust to go to with your nutrition concerns. Don’t be afraid to get a second opinion!
  2. If you’re into doing your own research, skip the documentaries and find peer-reviewed nutrition or medical journals.
  3. Always check an article’s sources; Wikipedia is not a source.


While the medical field is always expanding and recommendations can change, it’s important to stay informed in this age of information. Relying on reputable sources can keep you from being confused about which information is real and which isn’t. Remember that anyone with a computer can write an article and make it seem trustworthy, so don’t believe everything you see on the internet.




The Effects of Our Cultural Food Appropriations


Food fads are a very interesting subject that I addressed in my post about the Bacon Renaissance. Some food fads die out over time, but some are here for good. One interesting pattern we can follow through history is the phenomenon of one culture’s foods being appropriated by another.

In examining these food-based epidemics of history, we can see that they’re not without their benefits, and many have proven entirely sustainable. Tea is an excellent example of a product that once took the world by storm, enrapturing entire nations and integrating permanently into thousands of societies. Today, tea is appreciated worldwide and supplied almost entirely by just a few regions of the planet.

Tea is a success story, but not all foods achieve global fame and acceptance in such a sustainable manner. Today’s topics are quinoa, a South American “superfood,” and Bluefin Tuna, a large, predatory fish native to several of the world’s oceans.

Bolivia and The Great Quinoa Boom

Quinoa is a nutrient-dense pseudocereal (actually a seed) that has been cultivated in the Andean regions of South America for thousands of years. Riding the great wave of health foods through popular culture, it has made its way onto grocery store shelves across the nation. It’s hip, it’s versatile, but most of all, it’s packed with nutrients. Quinoa is a complete protein, meaning it contains all nine essential amino acids, and it is rich in potassium, fiber, magnesium, and vitamin B6, with little fat and no cholesterol. But there’s a catch.

As previously mentioned, quinoa has been grown in the Andes for centuries. Peru and Bolivia have produced around 92% of the world’s quinoa, much of it coming from the Altiplano region of Bolivia.

While this boom in demand seems like a blessing to the poor Bolivian farmers, certain socioeconomic and agricultural concerns have come to the attention of researchers, scientists and of course, Bolivians.

The Bolivian altiplano is made up primarily of small farms called minifundios. Most of these farms have served the families and communities living on them for generations and are not set up for large-scale production. New technologies and methods must be introduced, which can disrupt the social organization and delicate ecosystem of the region.

Many of the farmers have also long practiced traditional methods, using manual plowing, animal husbandry, crop rotation, and fallow periods to keep soil nutrients in balance. Naturally, potatoes and llamas are being edged out in favor of the more profitable quinoa, now exported in bulk to nations like the US and Canada.

The quinoa boom is not all bad, don’t get me wrong. Quinoa is being investigated for its potential to address global food security under climate change, and Bolivia is a poor nation that can use the economic influx. But it’s important to know that many producers are not raising the crop sustainably. This is true of many agricultural endeavors across the world, yet many people don’t realize how much their purchasing power is really worth.

Overfishing for Tuna

Another current trend in food appropriation is the popularity of Bluefin Tuna among the world’s sushi eaters. As the number of sushi restaurants increases, the stocks of Bluefin Tuna seem to decrease. Sushi lovers cherish this fish for its fatty underbelly, or toro. In 2014, the Scientific Committee for Tuna and Tuna-like Species in the North Pacific Ocean estimated that Bluefin Tuna stocks had declined 96% from unfished levels.

So what’s the point I’m trying to make? 

The socioeconomic and environmental impact of globalization is huge. As we continue to thrive as a species and find new ways to use technology and our finite and renewable natural resources, the need for adaptation is critical. If we are to preserve the earth’s assets, establishing a symbiotic relationship with our planet and its other inhabitants will be the responsibility of current and future generations.

If the past is any indicator, we can say with confidence that we humans require a great deal of persuading when it comes to weighing capital gains against environmental conservation. Both the sushi industry and the health food industry, with quinoa’s success at the forefront, are enormously profitable enterprises, so it will probably take some convincing for the world to realize what our indiscriminate importation and consumption of foods we cannot locally produce is doing.

So then, the responsibility starts with us, the consumers. Consumerism drives our world today. If people aren’t buying it, no one will sell it. That means if we’re not supporting companies that are environmentally irresponsible, there is no place for them in our economy. Personally, I’ve stopped buying quinoa. When I first learned about it, I bought a giant bag at Costco every month, but after learning about its procurement, I stopped. There are a lot more local, sustainable things I can have as staples in my diet that come at a much lower global cost. Like lentils! Eat lentils, guys. But that’s a story for another post…

Addiction and Epidemiology




Consider the epidemic. Most of us think of SARS, smallpox, or whooping cough: things that are completely out of our control and capable of debilitating or destroying entire populations. But what about alcoholism, obesity, or eating disorders? These issues are often behavior-related, and many people will argue that they are entirely a matter of personal choice. For the purposes of this post, I will focus on epidemiology in terms of society and one of the behavior-based plagues of our time: addiction.

In light of the recent death of the well-known and respected actor Philip Seymour Hoffman, I think it’s safe to say that addiction is an issue that does not discriminate. It does not select individuals solely from poverty-stricken homes or broken families. Addicts and alcoholics are found across the social spectrum, from all walks of life.

Although there has been much debate about whether addiction can be considered a disease, it is without a doubt an enormous public health concern. According to recent studies by the National Institute on Alcohol Abuse and Alcoholism, around 85,000 alcohol-implicated deaths occur every year; that’s more deaths than diabetes causes. Even more astonishing, it is estimated that Americans consume 80% of the world’s prescription painkillers. Although some of these drugs are taken under the direction of a doctor, many are not, and they are contributing to the millions of Americans suffering from addiction. These numbers are quite clearly of epidemic proportions.


You may wonder what this topic has to do with nutrition. While addiction and substance abuse prevention is a battle on an entirely different front, nutritionists have been faced with a different stage of this issue: recovery.

More and more, doctors and scientists are finding beneficial effects from introducing proper nutrition as a component of addiction treatment. For the fortunate addict or alcoholic who makes it out of the throes of this affliction, access to an educated nutrition professional may prove invaluable on the road to recovery.

When someone has been repeatedly exposing their body to excessive toxins, a period of healing naturally follows, one that can sometimes take up to two years. Cessation and recovery from addiction involve many bodily upsets, including changes in metabolism, organ function, and mental health. Addiction and alcoholism have also been shown to result in hypoglycemia with surprising prevalence.

Picture in your mind an active alcoholic or drug addict. Most of us think of someone emaciated and quite unhealthy-looking. The stages of this illness range in severity, but most people addicted to substances don’t put nutrition at the top of their list of priorities. Many patients checking into rehab centers have gone months or years without proper nutrition and are quite out of touch with the optimal fuels for their bodies.


Many treatment centers for addiction are now incorporating a nutritional facet into their programs, some with in-house dietitians or community nutritionists to plan meal programs or work with patients directly. It has become abundantly clear that while the “Just Say No” campaign did little to combat the actual use and distribution of drugs in our society, there is much to be done on the recovery front. Members of the health, scientific, and mental health communities will all likely play a role, and nutrition professionals are already playing a vital one in the treatment of this modern epidemic.

Tomatoes and the Adventures of Congressional Vegetable Rulings


What’s in a Name?

What is a vegetable, really? The true meaning of the word has been argued for some time now, and culinary professionals define their foods a little differently than botanists or even customs officials, but we all know the tomato has long been up for semantic debate.

In 1893, the U.S. Supreme Court ruled in a customs case that the tomato is a vegetable, despite the knowledgeable conclusion of numerous botanists that a tomato (like corn, cucumber, and bell pepper) is actually a fruit. This all stems (get it?!) from the scientific taxonomy of biological classification, but my point is this: the USDA has definitively decided how many servings of vegetables Americans should be getting a day, and what counts as a vegetable is also, apparently, up to the United States Government.

But First, a Little History

The National School Lunch Program was signed into law by President Truman in 1946. The program came about because it had become evident that malnourished children grew up to be very poor soldiers; the federal meal program was instituted as a “matter of national security.”


In 1966, the Child Nutrition Act took the program even further, providing breakfast, milk, and special equipment for school kitchens. The success of the National School Lunch Program inspired officials as they began to recognize the role of proper nutrition in the brain development of our nation’s children.

However, this crusade lost momentum in the early 1980s, when the Federal School Lunch Program withstood a staggering 25% budget cut. This led to a number of decreases in quality and quantity, including substitutions and portion reductions. The USDA’s Food and Nutrition Service under the Reagan Administration actually encouraged states to explore options as audacious as counting pickle relish as a vegetable substitute. Not surprisingly, this campaign, and Reagan himself, were publicly smeared with the critique that condiments are not vegetables.


So… Can Pizza Actually Be a Vegetable?

This brings me to more current regulations. Many people probably remember the public outrage in 2011 when Congress passed an agricultural spending bill that blocked attempts to enhance the nutritional content of school lunches. The proposed changes included limiting sodium and potato use on the lunch lines and no longer counting 1/8 cup of tomato paste as 1/2 cup of vegetables. In other words, to get their full cup of veggies, kids would have to consume however much cheese and bread comes along with that tomato paste in the sauce. This led to a slew of headlines claiming that “Congress Declared Pizza a Vegetable.”

Perhaps unfortunately for the general public, policies made in the name of nutrition are not always made solely with the objective consideration and input of trained professionals. Things like cost and subsidies come into account, too. The dairy, meat, and salt industries have a say in how much of these things we “should” actually eat. For instance, knowledge of the dangers of excessive sodium consumption has been around for 30 years or more, but the reductions many health professionals have asked for have been difficult to achieve; salt is an efficient and cost-effective way to make food taste good, so the salt industry holds powerful sway.


Back to the Tomato and Our School Lunches

Fresh fruits and veggies are easily recognized by most anyone over the age of four. However, some regulations and public policies make it more ambiguous how to count your servings. In light of these facts, a position in public policy can seem daunting and disheartening for someone passionately interested in good food.

While there are community nutrition positions in government, fierce politicians rarely graduate from the nation’s dietetics and nutrition programs to go on to fight for nutritional legislation. Many big industries and companies seek out professionals who will lobby for their product without taking an objective look at its nutritional value. Without the advocacy of educated, unbiased professionals, our children would be very much at the mercy of budget cuts and the food industries, who, quite frankly, don’t always have the nutritional content of our school lunches in mind. Without them, the future’s children may be scraping their “vegetables” from the inside of a salty can…

My gratitude goes out to those who fight for our nutritional standards, with our most healthy best interests at heart.

The Bacon Renaissance: You Are What Your Friends Eat



Bacon Makes Everything Better..?

Over the past few years, bacon has made its way off the breakfast table and into some completely unrelated areas of our lives. You’ll find it on T-shirts, bumper stickers, iPhone cases, and various other inedible novelty items. It has also appeared across the spectrum of our meals: coffee shops boast bacon-flavored lattes, menus of upscale restaurants feature chocolate-covered bacon, and I think we all know of the famed Maple Bacon Doughnut from Voodoo Doughnuts, which even spawned a beer of the same flavor. So, is bacon the fad anti-diet?


Wikipedia defines diet faddism as “idiosyncratic diets and eating patterns that promote short-term weight loss, usually with no concern for long-term weight maintenance, and enjoy temporary popularity.” Every generation sees some new fad diets and some recycled ones. The Paleo Diet, the Atkins Diet, and the Blood Type Diet have been around for generations, but how much do they actually affect us?

Take the gluten-free craze. Although most people now know these products began as options for those with Celiac Disease or wheat/gluten sensitivities, the diet took the nation by storm. Hundreds of well-known brands now produce gluten-free items for the estimated 1% of the American population afflicted with Celiac Disease. This is just one of many health trends adopted by millions of Americans who want to live more healthful lives. It can be pretty easy to jump on the bandwagon… everyone is doing it.

You Are What Your Friends Eat

I moved to Bozeman three years ago, and although I’ve always been interested in nutritional health, I found myself among far more seasoned connoisseurs of healthful eating. This is a mecca for vegan, granola, and local-foods-only consumers; it’s trendy to be healthy. I found myself eating a lot more foods from the organic section and dining at the Co-Op with my friends twice a week. When I leave town, I sometimes find myself indignant at the lack of variety in restaurants and grocery stores. Whether or not I like to admit it, I’ve become one of them.

Most food trends have proven to be no different than most fashion trends in their transitory natures. They sweep through the population, appearing especially in certain demographics and peer groups. Celebrities are talking about them, restaurants are featuring them, and eventually, most will go the way of the mullet. The great fact of all these trends is this: they affect the way we eat. If you’re one of the folks that concedes to the notion that bacon makes everything better, then you’re likely eating it with more than just your eggs.

As a target population, Americans have added two years to their life expectancy but have not improved their standing in the weight bracket over the last ten years; in fact, the problem of obesity has only gotten worse. Interestingly, there has been some speculation that weight gain can be “contagious,” making obesity, in a sense, socially transmittable. In other words, if your friends are overweight, you’re more likely to fall into this category too.

Who Controls the Food Trends?

While doing my research, I came across countless recipes, blogs, and products catering to bacon lovers, from edible dishes made of bacon to a Wikipedia page entitled “Bacon Mania.” I had dinner at a downtown restaurant last week where the cornbread came with a whipped butter peppered with candied bacon. I also came across some interesting pages about annual food trends. Once again, I was reminded of the fashion industry and the people who decide what’s going to be in style this year; they’re probably not the people who care about your wallet or self-image. Similarly, the yearly food trends don’t likely conform to the views of the people looking out for the health of our nation. After all, we are living under the watchful eye of an industry that invites us all to choose the country’s new favorite flavor of potato chip.

Ultimately, these kinds of unhealthy trends are a nightmare for the community nutritionist. Bacon is certainly a good source of flavor, but it’s also a great source of saturated fat, preservatives, and sodium. With heart disease being the leading cause of death in the United States, can our society really afford to endorse such foods?


Don’t get me wrong, this is certainly not an anti-bacon campaign; I have a fondness for this porky treat myself. My point in all this bacon and fad-diet talk is not just that social intake norms affect actual individual intake, but that many people make their choices without real knowledge, following a sort of mob mentality, simply because it’s socially acceptable.

The community nutritionist’s job doesn’t stop at forming policies or improving school lunches; they also educate their constituents about the food choices they’re making. Given that public health professionals aim to create an environment where people can thrive in a healthful manner, I think the experts should consider dismantling certain misconceptions about fad diets and trending foods so that we can all make educated choices about how we eat. Because ultimately, it’s my choice whether to put bacon bits on my frozen yogurt, not my community nutritionist’s.

A Brief History of Salt and Public Health



As budding health professionals, many of us know that iodine is a trace mineral essential for proper thyroid function. This fact, however, was not public knowledge until fairly recently.

Iodine deficiency can result in cretinism, mental retardation, and goiter. In the early 1900s, populations living inland were especially susceptible to goiter, as they had little access to iodine-rich foods from the oceans. These parts of the United States, spanning most of the northern and central states, became known as the Goiter Belt. As goiter became more prevalent, scientists began linking iodine to its treatment, and soon its prevention.

Armed with this new knowledge, doctors and scientists set about finding a vehicle for efficiently delivering this necessary element to the masses. In cooperation with public health services, salt companies began adding iodine to their product in the 1920s, offering it to Americans at the same price as regular salt. This availability led to a steep decline in cases of endemic goiter.

This innovative collaboration of public health, medicine, science, and industry was an excellent example of the positive power of public health. Not only did goiter begin to diminish, but Americans also saw an increase in IQ and a decrease in the number of babies born with mental handicaps. The target populations were given an accessible preventive treatment for some very undesirable conditions at no extra cost. The entire operation of salt iodization enjoyed worldwide success in areas with these health disparities, provided the funding and infrastructure were available.

While all of these things were counted as successes by many national health systems in the 20th century, drastic changes began to appear in the way Americans ate. The arrival of processed and convenience foods was a revolutionary breakthrough in the preservation and availability of goods, but many of these foods are processed with sodium. As more knowledge about the health ramifications of excess sodium intake surfaced, Americans were encouraged to cut back, and since many of the foods they were already eating contained sodium, many people simply cut back on iodized table salt. People have also begun to embrace a culture of artisan health: many folks prefer sea salt or any of its various fancy cousins, e.g. Himalayan pink salt and fleur de sel. This all sounds like fair compensation, but the problem is this: the sodium used in processed foods does not contain iodine, and while sea salts contain some trace minerals, including iodine, it is not nearly enough to fulfill daily requirements.


Fortunately, with all of the advancements in food systems today, iodized salt is not the only resource for people living inland who hope to maintain healthy thyroid function. Educating people about proper iodine intake, however, may become even more essential in coming years. Successful health systems must stay current with population trends in nutrition and ensure that people know what they may be lacking. Many Americans have relied on the safety net of fortified foods to maintain proper nutrient levels, and food trends that stray from tradition are often completely unpredictable. This is where a knowledge of generational diversity matters: as younger generations make different food choices, some of these fortifications may become obsolete.

While I have faith that we will likely avoid the appearance of a modern Goiter Belt, I am interested to see what lies on the nutritional horizon for public health and food trends.