Salt’s Fat Chance

Salt, a precious commodity since antiquity and the most abundant non-metallic mineral in the world, has recently come under attack from public-health practitioners. iStockphoto.

Americans have become familiar with a recurring script: A killer lurks in our food supply, frightening in its ubiquity and toxicity. Experts agree that it is scientifically proven to be dangerous. If we rout this scourge, thousands of lives will be saved!

Over the years various substances have played the villain in this script, including saturated fats, high-fructose corn syrup, Alar, red food dye, and trans fats. The newest member of the club is salt. Common table salt—sodium chloride—is facing fresh scrutiny from public-health practitioners, policy makers, and activist groups for its possible contribution to high blood pressure and stroke. In 2008 New York City’s then–Health Commissioner Thomas Frieden met with food manufacturers to request voluntary reductions in the salt content of foods. Frieden, now head of the Centers for Disease Control and Prevention, told the New York Times in 2009, “If there’s not progress in a few years, we’ll have to consider other options, like legislation.” Across the pond the British Food Standards Agency set salt-reduction targets for 2012 that its spokesperson admits are “challenging.” Companies like Unilever and Campbell Soup Company have announced comprehensive salt-reduction plans, while firms like Nabisco and PepsiCo have strategically reduced salt in some brands.

A population-wide reduction in salt intake could turn out to be a positive development for public health. But the intertwined stories of saturated fat and trans fat offer a warning for the future of salt: dangerous “improvements” can turn out to be the most insidious nutritional villains.

Previous generations of nutritional scientists and chefs from around the world celebrated fatty foods for their ample calories, vitamins, micronutrients, and essential fatty acids. But after World War II, with fewer people suffering from nutritional deficiencies or dying of infectious diseases, American scientists and nutrition reformers began to study the role of dietary fats in heart disease and cancer. The lipid hypothesis, which holds that saturated fat in the diet raises serum cholesterol in the blood, which in turn leads to heart disease, was the subject of considerable controversy in the 1960s, 1970s, and 1980s. For example, the 1977 report Dietary Goals for the United States, prepared by the Senate Select Committee on Nutrition and Human Needs, was arguably the first non-wartime government document to recommend Americans eat less of anything—namely cholesterol and saturated fat. Meat, egg, and dairy producers were not pleased.

Soon thereafter, three National Academy of Sciences commissions struggled with the uncertain relationship between dietary fat and disease when preparing reports on diet and health in the early 1980s. By 1984 an American Heart Association commission concluded, “Although it is not definitively established that the advocated alterations in diet will actually reduce the incidence of [coronary heart disease] . . . it is imprudent to wait indefinitely for proofs of efficacy in the face of the high incidence of coronary heart disease.”