Weirdest Food Panics In History

These days when you talk about a "food panic," also known as a "food scare," it's likely that people will presume you're referring to the public's response to an actual or feared food shortage in connection with the COVID-19 pandemic (via Dayton Daily News). Historically, however, these terms have more often referred to "a threat to food safety" that has escalated to the point where consumers alter their behavior in response (via a 2017 paper published in the British Food Journal). These sorts of food panics have become increasingly frequent over the last century and a half. 

One reason is the increasingly complicated, multi-party food supply chain: there are simply more opportunities for legitimate dangers to be introduced, even as it becomes more difficult to trace a given danger to its source, according to the paper's authors. Another reason is that the concept of "food panics" includes not only scenarios where the danger is real and confirmed, but also scenarios where the danger is entirely fabricated; members of the public can be powerless to discern which is which (via Time). 

Read on for the weirdest food panics in history. But do bear in mind that while each of the food panics we discuss here actually happened — which is to say that from a historical perspective, consumers believed there was legitimate cause for concern — not all were based on an actual danger. In some cases, in fact, the only thing to fear was fear itself. 

The swill milk scandal

Prior to the 1850s, when you drank a glass of milk, there was a good chance you knew which farm it came from, if not the name of the cow who produced it (via Smithsonian Magazine). As the Industrial Age dawned, that began to change, with dairy products being among the first mass-produced, mass-distributed consumables (via Homestead.org). And there you have the perfect storm for the "Swill Milk" scandal of the mid-nineteenth century. 

Just as consumers were learning to place their trust in commercial dairy operations, commercial dairy operators were likewise learning just how much they could get away with in the name of maximizing profits. First order of business? Swap out the more costly but healthy grain the cows had been eating for bottom-of-the-barrel "swill," which is the technical name for grain that has already been used as distillery mash. Although the cows eventually perished from their toxic diet, while they were alive, they produced "sickly, bluish milk," which would then be adulterated with things like "chalk, eggs, flour, water, [and] molasses" to make it appear legit, according to Smithsonian.

It was most certainly not legit, however. Swill milk sickened consumers and may have been responsible for killing as many as 8,000 children in New York, and it was being produced and distributed in other states as well, per Smithsonian. Eventually, newspapers caught on, and so did the public, leading to the passage of federal anti-swill milk legislation in 1862. 

Terror in a can

Canned food had been around for more than a century before Americans started bringing it into their homes around the time of the Civil War, according to Smithsonian Magazine. Even then, they were skeptical both of what it might offer in terms of flavor and texture and of what it might inflict upon those who partook. By the turn of the twentieth century, Americans had begun to welcome, albeit still somewhat uneasily, canned food into their kitchens. Then in 1919 and early 1920, it became apparent that their concerns were justified, when canned black olives from California gave botulism to people throughout Ohio, Michigan, and New York, killing 18. 

The mainstream media caught wind of this development and publicized it widely, creating a panic among both consumers and canning industry stakeholders, who feared the episode would damage the nation's already shaky goodwill toward canned foods. However, both groups ultimately benefited, because the industry's resulting PR campaign centered on research, education, and activism to nip botulism in the bud going forward. Today, botulism from commercially canned food is virtually unheard of. Perhaps ironically, what is more common is botulism from home-canned food (via food safety specialist Dr. Ben Chapman). 

The corned beef scare of 1964

Typhoid fever, a life-threatening but treatable (with antibiotics) bacterial infection, is often spread via contaminated food, according to History of Vaccines. In the spring of 1964, an outbreak of typhoid fever hit the city of Aberdeen, Scotland. Before it was done, over 500 residents had been forced into quarantine, and three were dead (via The National). Within a week after the outbreak began, Scotland's Medical Officer of Health announced the outbreak had been traced to a single commercial-sized tin of corned beef that had been used by a local shop selling sandwiches and sliced meat, according to the authors of a 2007 paper published in the Journal of Epidemiology and Public Health.

Although it turned out the bacteria had spread, via the shop's meat-slicing machine, to people who hadn't eaten corned beef at all, corned beef alone became the focus of this food panic. At first, there were rumors that the corned beef had come from a government stockpile long past its expiration date. As it turned out, the contaminated tin had come from an Argentinian meat exporter (via The National), which called into question the hygiene practices of South American meat exporters generally. However, government officials were reluctant to pressure South American meat suppliers to step up their hygiene, for fear of trade and political consequences. And that couldn't have done corned beef's reputation any favors. Bottom line? Throughout 1964, corned beef consumption declined markedly (it eventually recovered).

The panic over mad cow disease

On March 24, 1996, UPI reported that fast food giant McDonald's had banned British beef in its burgers "amid growing panic about a possible link between [British] beef and a human strain of so-called mad cow disease." A quarter-century later, it is now well-understood that humans cannot catch mad cow disease. However, people who eat the flesh of infected cows can develop what's known as an "acquired" form of Creutzfeldt-Jakob disease (CJD) (via FDA). CJD is an incurable neurodegenerative illness that destroys parts of the brain and nervous system, resulting in dementia, seizures, and eventually death (via Time). 

CJD, which was first identified in 1920, typically affects people over the age of 50 (via the Centers for Disease Control and Prevention (CDC)). However, starting in the late 1980s, the U.K. began seeing much younger people diagnosed with CJD, and scientists familiar with mad cow disease suspected these cases represented the transmission of mad cow disease to humans (via Center for Food Safety). It turned out they were right, and a panic quickly spread throughout the world. The silver lining is that members of the red meat supply chain around the world responded by developing practices to reduce the risk of infection in cattle and to prevent sick cows from infecting others in their herd (via Time). As a result, while mad cow disease continues to crop up now and then (via CDC), transmission of CJD to humans is extremely infrequent.

The unjust war against red M&Ms

If you were a kid growing up in the 1970s, you probably remember the first time you tore open a pack of M&Ms only to discover, to your horror, that the red M&Ms were gone. Just like that. Nowhere to be found. So what led the Mars company to forsake this popular M&M color? Why, a food panic, of course. In 1971, a study out of the Soviet Union linked the food coloring "Red Dye Number 2," or "Red No. 2," to cancer, according to Live Science. Red No. 2 had been in legal use in the U.S. since 1906, when the U.S. government first began regulating food dyes, and up until 1971 it had been used in lots of foods, including hot dogs and ice cream. But you know what it wasn't used in? M&Ms. 

Nevertheless, "public outcry in the U.S. against the dye quickly gained such fervor that the Mars candy company temporarily stopped producing red M&Ms despite the fact that they had never contained Red No. 2 in the first place" (via Live Science). Although red M&Ms made a brief return, they were taken out of production once again in 1976, after the FDA ruled that "in high doses, Red No. 2 could cause cancer in female rats."

Red M&Ms have been back since 1987, still free of Red No. 2, just like they always were.

This gum was so YUM, it caused a panic

Before Bubble Yum made its debut, there was no such thing as soft bubble gum. Clearly, however, there was a market for it, because when it launched, it was such a huge success that its manufacturer, Life Savers, was forced to cut back on marketing "to allow production to keep pace with demand" (via Snopes). Then something went terribly wrong, at least in the New York area: rumors began to circulate that Bubble Yum contained spider eggs. Or spider legs. Or spider webs. It's not entirely clear how these rumors got started, but they weren't entirely unpredictable, according to Snopes, which noted that "any confection that revolutionary is going to spawn speculation among the younger set." And so compelling were these rumors, not to mention terrifying to children and perhaps even some gullible parents, that sales of Bubble Yum proceeded to tank. 

It got so bad, according to Snopes, that Life Savers spent over $100,000 on a damage control-focused marketing campaign. "Last week, the manufacturer, Life Savers, Inc., took out full‐page ads in 30 area newspapers to combat the rumors," the New York Times reported on March 29, 1977. Ultimately, Life Savers' efforts to quell the noise proved successful. Either that or kids liked Bubble Yum so much, they decided it was worth the risk of growing baby spiders in their tummies. 

The fears surrounding that fizzy candy

"Pop Rocks are small pieces of hard candy that have been gasified with carbon dioxide under superatmospheric pressure," according to the Pop Rocks website. "When these gasified sugar granules come in contact with moisture, in someone's mouth or in water, milk, soft drinks, etc., the candy dissolves and the gas retained inside the carbon dioxide bubbles is released, causing characteristic crackling and fizzing sounds." Pop Rocks literally exploded into the American candy market in 1975, creating so much buzz that, well, naturally, rumors and backlash were inevitable, according to Snopes. One of those rumors was that if you washed down your Pop Rocks with soda, your stomach would explode. Another was that this actually happened to a child actor, John Gilchrist ("Little Mikey" from the Life Cereal commercials), and that he was dead as a result.

Pop Rocks was battling "exploded kid" rumors as early as 1979, according to Snopes, taking out full-page ads in 45 publications, writing 50,000 letters to school principals, and conducting a national tour to explain that "Pop Rocks generate less gas than half a can of soda and ingesting them could induce nothing worse in the human body than a hearty, non-life-threatening belch." So panicked were American parents that the FDA felt compelled to set up a hotline to reassure them. 

Pop Rocks are still sold to this day, and no one has ever exploded from eating them, including John Gilchrist, who is now in his 50s (via Thrillist).

Prohibition-related panic, compliments of the U.S. government

During Prohibition, which began in 1920, the U.S. government decided to literally manufacture its own food panic in the hopes of discouraging illegal drinking. It did so by ordering the poisoning of all industrial alcohol manufactured in the U.S., according to Deborah Blum, author of The Poisoner's Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York (via Slate). That said, poisoning non-potable alcohol was not a new thing at the time. As early as 1906, well before Prohibition, American manufacturers of industrial alcohol were already engaged in the practice of "denaturing" alcohol intended for industrial use, that is, adding toxic chemicals to render it undrinkable (via Snopes). Nor had its purpose ever been to cause harm. Rather, the denaturing of industrial alcohol was intended to draw a clear distinction between drinkable and non-drinkable alcohols for tax and tariff purposes.

In 1926, after several years of battling bootleggers who had taken to stealing industrial alcohol to make moonshine, the U.S. government ordered the denaturing of all industrial alcohol made in the U.S. The hope was to deter bootleggers, terrify would-be moonshine purchasers, and make a dent in organized crime. And while some would-be drinkers certainly may have steered clear of moonshine as a result, at least 10,000 people died from drinking poisoned moonshine by the time Prohibition ended in 1933.

Let us now remember the lettuce panic of 2018

Leafy greens, such as spinach, are the second most common source of foodborne E. coli infections, according to a 2020 research paper published in the journal, Emerging Infectious Diseases, which counted at least 40 outbreaks, 1,212 illnesses, and eight deaths between 2009 and 2018 in the U.S. and Canada. But no leafy green has ever caused as much panic as romaine lettuce, which the authors of the 2020 research paper linked to 54 percent of all such outbreaks. And, apparently, no year has ever been worse for romaine lettuce than 2018, a year marred by two massive romaine lettuce-related E. coli outbreaks (via NBC News).

"No one should eat romaine lettuce — or any lettuce at all — unless they can be sure it's not from Arizona, federal health officials said Friday," NBC News wrote in April of 2018 in connection with an outbreak traced to Yuma, Arizona. "Throw out your lettuce from the Salinas Valley," the Tampa Bay Times wrote in November of that same year in connection with another outbreak. "Shoppers are probably going to avoid romaine lettuce and maybe other leafy greens like spinach just ahead of Thanksgiving due to fears over yet another expanding multistate outbreak of E. coli," Food Safety News wrote on November 23, 2019. Unfortunately, climate change may have other plans, potentially making such outbreaks even more common in the years ahead. 

The horse meat scandal of 2013

Remember those claims that McDonald's used "pink slime" to make its McNuggets? It wasn't true. Or those recent claims about Subway tuna salad containing no tuna? That one may or may not be true. But one food panic rooted in real, unconscionable food fraud was the discovery of horse meat in Irish beef products, according to a 2013 article published in QJM: An International Journal of Medicine. "The Food Safety Authority of Ireland tested a range of cheap frozen beefburgers and ready-made meals from supermarkets last November for the presence of DNA from other species which were undeclared. It found horse DNA in over one-third of the beefburger samples, and pig in 85% of them," the Guardian wrote at the time.

As the New York Times reported, news of this massive and cringeworthy fraud stirred "furor" in Ireland and the U.K., ultimately leading to the establishment of the National Food Crime Unit in the U.K., according to a 2017 paper published in the journal, Science of Food. Nor did Americans take the news well. "Every week, it seems, another restaurant, supermarket chain, or Swedish furniture maker announces that instead of feeding its customers beef, they — whoops! — accidentally served horse meat," wrote Time in 2013, noting that Americans "react with revulsion at the very thought." Unfortunately, horse meat has been known to find its way into U.S. ground beef as well, according to a 2016 study published in the journal, Food Control.

The unpasteurized juice panic

"Some outbreaks are wakeup calls for consumers," wrote Food Safety News, regarding the E. coli outbreak stemming from unpasteurized apple juice in 1996. After more than 65 people who drank Odwalla unpasteurized apple juice were sickened, and a sixteen-month-old child died, panic ensued. Where did the bacteria come from? The apples? Their processing? Who was to blame? In the end, the question of blame was settled in court: in 1998, Odwalla pleaded guilty to 16 federal criminal charges and paid a $1.5 million fine. Odwalla also took a reputational beating. Sales of Odwalla juices virtually disappeared, plummeting by 90 percent, according to an article by Axia Public Relations, which noted that Odwalla ultimately recovered thanks to the Herculean efforts of its public relations team (which focused the public on "the company's admirable efforts in providing the public with all the necessary facts" and "measures taken to prevent future outbreaks"). 

In that sense, the Odwalla panic became a win for the company and for public relations professionals in general. A much more important win, however, is that hippie-dippy Odwalla felt compelled to start pasteurizing its juices, according to Food Safety News. In addition, the U.S. government began requiring all juice manufacturers to affix a warning label to any unpasteurized juice containers. Odwalla was later acquired by Coca-Cola, which continued pasteurizing Odwalla juices until 2020, when Coca-Cola discontinued the line, according to CNN, whose coverage of the discontinuation did not mention the E. coli panic of 1996.