The Evolution of High Quality Drinking Water in the United States
Probably the most spectacular water event in 2014, a year of drought and controversy over fracking, was the leak of ten thousand gallons of 4-methylcyclohexanemethanol (MCHM), a chemical used to clean coal, into West Virginia’s Elk River.
This Charleston incident served as the starting point of an excellent article on “The Politics of Drinking Water” by Anya Groner. Groner’s article takes a look at the history of America’s drinking water laws and customs. We usually think of advances in drinking water purity as starting with chlorination. We forget about steps like the evolutionary jump from shared public drinking cups to the “bubbler,” and very successful strategies like moving a city’s water intake away from the human pollution near the lakeshore to a point far out in the lake to prevent water-borne diseases.
Here are some excerpts from Anya Groner’s article:
Most Americans take cheap, safe drinking water for granted. Globally, one out of 10 people can’t access clean water. Some 1,400 children die each day from water-related diseases. Unless there’s a spill or equipment failure, these numbers exclude U.S. residents. Across the 50 states, 155,000 public water systems treat, filter, and deliver 100 gallons per person per day, all for the low cost of less than 1 cent per gallon.
[Photo: a 1911 drinking fountain]
Contaminant-free drinking water hasn’t always been part of the American experience. Until the early 1900s, shared public cups accompanied most drinking fountains. Cholera, typhoid fever, dysentery, and food poisoning from coliform bacteria—all potentially fatal—spread from mouth to cup and back again. Diarrhea was rampant. Not until 1899, when Kohler Water Works invented the Bubbler, which pumped a continuous flow of water an inch into the air, did a spout replace the cup. To partake, drinkers stooped over the copper basin and slurped. What wasn’t sucked up dripped down the nozzle. Clean water mingled with saliva. Though the Bubbler was an improvement over the public cup, bacteria still flourished.
Humans weren’t the only creatures to suffer waterborne illness. In the late 19th century, 100,000 horses populated New York City’s streets, producing 26,000 gallons of urine daily. Concerned with dehydration, early chapters of the American Society for the Prevention of Cruelty to Animals advocated for the erection of “fountains for man and beast,” with large, street-side basins for horses, sidewalk basins for “the sons of men,” and low spouts for dogs. Glanders, an equine disease now eradicated in North America, proliferated. Lesions formed in the infected horses’ respiratory tracts, causing fevers; coughing; and, ultimately, septicemia (an inflammation of the blood). Within days of exposure, horses died. On occasion, the bacterium crossed species’ lines, taking the lives of cats, dogs, goats, and men.
Despite health hazards, drinking fountains became a fashionable social project. Prominent citizens appealed to city governments to build fountains “for the convenience of street passengers,” and the growing temperance movement boosted the cause. In 1859, a doctor named A. K. Gardner warned the Common Council of New York City that, “Men, and women, too… resort to drinking saloons and bar-rooms where they must ‘take a little something’ for the sake of a glass of water.” A New York Times editorial from the same year argued, “intemperance should be arrested… by putting fresh, good water freely within the reach of the wayfarer.” Water and sewerage boards, church temperance clubs, men’s associations, and tree planting societies took up the cause by writing letters, holding meetings, and raising money.
The ensuing fountains ranged from purely functional to “handsome bronze and marble affair[s]” designed more to flaunt wealth and memorialize family names than to quench public thirst. Rich patrons bequeathed fountains in their wills, and young people collected change to support upkeep. Newspapers supported this fetishization, printing the locales of new fountains alongside lists of prestigious attendees at inaugural festivities.
In 1892, when the Chicago World’s Fair coincided with a devastating typhoid outbreak, clean water became a matter of national safety. In the two years prior, Chicago suffered more typhoid-related deaths than any other city in the world. To protect the fair’s 27 million guests from infection, engineers designed plumbing that extended four miles into Lake Michigan where they hoped the water was contagion-free. Additional supplies were piped in from Waukesha, Wisconsin, and sold for a penny per glass. The innovations worked. When the fair opened to the public in 1893, infection rates dropped and the outbreak receded.
By 1900, germ theory—the belief that microscopic pathogens travel through air and water—took hold. New sanitation methods promised to eliminate these invisible threats. Redesigned Bubblers included arc projection, separating clean water from run-off, and the first disinfectant, a continuous dilute solution of chloride of lime, was added to the Boonton Reservoir in 1908, providing sterile, disease-free water to Jersey City. Nationwide, municipal treatment centers followed suit. Though gastroenteritis and norovirus infections occasionally broke out, germ-free water became the norm.
As tap water became safer, drinking fountains provided a staging ground for white Americans to act out fears of racial contamination. The rhetoric of sanitation—maintaining purity against an insidious threat—was used to justify Jim Crow laws. From 1876 to 1965, alongside hospitals, trains, lunch counters, voting booths, and highway passing lanes, drinking fountains became sites of Black exclusion. “White Only,” “Colored Only,” or simply “Colored” signs directed traffic. A 1963 pro-segregation speech titled “The Message from Mississippi” argued that separate fountains protected white citizens from “exposure” to bad morals, poor education, and improper hygiene: “There are many Negroes, of course, who have reached plateaus of citizenship. They are personally clean, have high morals and are educated. However, they are still in the minority.” In 1964, the Civil Rights Act mandated “equal enjoyment … of public accommodation,” ending segregated fountains and setting precedent for the 1990 Americans with Disabilities Act, which legislated spout height and knee clearance to enable wheelchair access.
Although public water fountains have become more inclusive, they’ve also grown less desirable. Bottled water, the fastest-growing drink product in the U.S., is now the preferred way to hydrate. The anthropologist Martha Kaplan suggests that this “bottlemania” reflects post-9/11 skepticism of federally-protected water supplies. Participants in her study of American water consumption cited unclean pipes, pollution, unsavory smells, bad tastes, and fluoridation as reasons for preferring the corporate-produced, single-serve water bottle. In the Great Recession, Kaplan notes, “Bottled water [was] the only luxury people [could] still afford.”
Besides portability, bottled water offers few advantages over the fountain. Many popular brands—including Aquafina and Dasani—simply fill bottles with tap water. The difference in taste, when there is a difference, is most often caused by the disinfection process. Public treatment plants use chlorine while bottled water companies tend to adopt more costly methods: ultraviolet light or ozonation. Not only is single-serve bottled water more expensive than gasoline—averaging $7.50 a gallon—the petroleum used to create the plastic of the bottle and the carbon released during its shipment incur environmental costs. Student organizations such as “Tap That” at Vassar College and “Take Back The Tap” at the University of Nevada attempt to reduce plastic bottle consumption. So far, over ninety colleges have restricted bottled water sales. Last March, San Francisco became the first city to create policy on the topic by banning distribution of single-serve, single-use bottled water on public properties.
Bottled water backlash has renewed enthusiasm for old-fashioned drinking fountains. Since 2013, the EPA has partnered with mayors to “reinvigorat[e] our nation’s supply” of these “iconic symbols of public health and welfare in our communities.” Companies have taken note. Both Elkay EZ and Halsey Taylor sell affordable retrofits: no-touch, sensor-activated spigots that turn neglected fountains into “HydroBoost” stations where passersby can top off reusable bottles. While consumers pause for their refill, electronic counters track how many plastic bottles they’ve diverted from landfills. Watching the display uptick feels good, akin to the sensation produced by a Facebook like or a favorited tweet.
Unlike oil, water is a renewable resource, replenished by rain and snowmelt. Even so, environmentalists warn that we’re tapping out our supply. Agriculture, industry, and household use deplete ecosystems faster than they can recover. Many of the world’s biggest rivers—including the Indus, the Ganges, and the Colorado—often dry to sand before reaching the ocean. The Baltic Sea, central Lake Erie, the lower Mississippi River, and portions of the Gulf of Mexico are so polluted by fertilizers and sewage that they’ve become oxygen-deprived and are unable to support life.
As we near peak water, hydroclimatologist Peter Gleick warns that skirmishes over resources will intensify. “Water can be—and often is—a source of cooperation rather than conflict,” Gleick notes, “but conflicts over water are real.” Already Gleick’s organization, the Pacific Institute, has created a 5000-year timeline of water-related conflict. Highlights include Assyrians poisoning enemy wells with rye ergot in the 6th century B.C., the World War II targeting and destruction of Soviet hydroelectric dams, the U.S. bombing of North Vietnamese irrigation canals in the 1960s, and riots in Cape Town, South Africa in 2012 sparked by insufficient water supplies. By 2025, scientists predict that one in five humans will live in regions suffering from water scarcity, areas with insufficient resources to meet water usage demands.
You can read Anya Groner’s full article in The Atlantic.