Previously: Solutionism, part 1. Part of The Techno-Humanist Manifesto.
Progress is messy. Its benefits come with inextricable costs and risks: pollution, accidents, radiation, carcinogens, rogue AI.
Solutionism acknowledges, even embraces, this fact. It means fully accepting these problems as reality, and enthusiastically stepping up to meet the challenge. It means investigating at the first hint of a problem, not waiting until it becomes too glaring to ignore. It means measuring, diagnosing, and monitoring problems; then preventing, mitigating, or curing them.
This isn’t a matter of slowing progress down, or making a tradeoff between health and safety on the one hand and progress on the other. There is no such tradeoff, because health and safety are a part of progress.
Solutionism is not techno-solutionism. Techno-solutionism is the belief that all problems have a technological solution. This is wrong: some solutions come from law, education, or culture, and often from a combination of all of the above. Road safety, for instance, has been improved not only through inventions such as seat belts, air bags, and anti-lock brakes, or even systems improvements such as traffic lights and divided highways, but also through driver’s education, legal licensing of drivers, and moral campaigns against drunk driving.
But in the history of progress as told today, the technical work of health and safety has gone mostly unsung. This sets up a misleading narrative, in which problems are created by technology and then solved by activism and law, and this in turn sets up a needless and counterproductive rivalry between technologists and social reformers. To counteract this, let’s highlight some stories of the heroic technical work that has gone into creating the relative health and safety of the modern world.
Underwriters Laboratories
Fire is a constant threat through all of recorded history. Almost every major city has suffered a “great fire” that raged out of control for days, burning large swaths of buildings to the ground. One reason is that until recently the world was full of open flames—candles, oil lamps, fireplaces, stoves—that could easily ignite a conflagration.
The advent of electricity was thus hailed as an improvement not only in convenience and cleanliness, but in safety. But paradoxically, electricity at first increased rather than decreased fire risk. At first, fire insurers weren’t sure what was causing the risk; they just knew they were suffering major losses in buildings thought safe. Some suggested a wave of arson. But by the early 1890s it was discovered that the blame lay with faulty electrical products and installations, such as exposed wiring and unsafe heating elements.1
To the fire insurance industry, this was “a dangerous flank attack from an unexpected quarter.”2 But one insurance executive saw the possibility of a solution. He pointed out that in the 1850s, when petroleum was new, that was the cause of some of the greatest fires. But:
gradually we got a little knowledge, until at last, we felt that we understood very well the properties of petroleum in its various forms and products, and could prescribe rules to make the use of it safe. ... I do not see why we should not do the same with our knowledge of electricity. Let us go as far as we know now. ... By and by, after we shall have had experience enough, and after the scientists shall have continued their experiments and research long enough, we shall be able to formulate a set of rules, a compliance with which makes the use of electricity as safe as the use of kerosene or petroleum in any form is to-day.3
He was right. The solution to electrical fire risk lay in standards, testing, inspection, and certification. And it was largely created by a technical organization, the Underwriters’ Laboratories (UL), sponsored by the fire insurance companies.4 UL became, and remains today, the standard for the safety of electrical appliances. Look around your home or office, and it won’t take long to find a product marked with the UL label, or with the CE mark used in Europe.
UL tested electrical devices for mechanical strength and durability, proper insulation, risk of overheating and electrical arcing, and overall suitability for use.5 They verified that automatic safety mechanisms worked as advertised, such as an electrical stove with a shut-off at a certain temperature.6 For wire, they measured the thickness of the copper, the rubber insulation, and the cotton covering; they bent the wire back and forth and stretched it to test for elasticity; they performed chemical tests on the rubber; they immersed the wire in water to test if the current leaked.7 They tested products not only for use but for abuse, as in the hands of “the careless or unskilled people who form so large an element of the public,” such as “the placing of small objects in the orifice of the machine by mischievous boys.”8
UL was a check against cheap manufacturers who cut corners in order to reduce price. One electrical toaster, for instance, was going to be sold for ten cents (a few dollars in today’s money). When UL examined it, they found “exposed and unprotected heating elements,” supply wires that could come in contact with the metal frame, and “insulation not designed for electric heaters”:
With the toaster placed on a plain uncovered pine board, with a square of sheet asbestos where a slice of bread would be placed, flames enveloped the toaster in six minutes.9
And UL took on the testing of many other products, from fire hose to sprinkler systems to building materials. Their most impressive piece of equipment was a “gigantic combination of furnace and hydraulic ram” to test the strength of columns in a fire, within which “there has been created a heat equal in intensity to the most terrific conflagration, and, simultaneously, the ram has exerted a downward thrust equal to the weight of many stories. … Some of the blackened and distorted columns that have been submitted to this ordeal are to be seen in the courtyard. Their appearance removes any doubt as to the thoroughness of the test.”10 “Formerly,” notes one history, “the conflagrations themselves were the sole laboratories for severe tests.”11
UL didn’t simply trust manufacturers to bring them samples: they also tested goods purchased on the open market.12 They inspected hundreds of factories, issuing grades and demerits.13 For these purposes, they had dozens of field offices throughout the US, Canada, and the UK.14 The UL label became trusted and valued enough that purchasers were willing to pay significantly more for it, which incentivized manufacturers to pay for the testing.15
By 1923, 30 years after its origin, UL had tested over 35,000 types of electrical appliances,16 and in that year their list of currently approved products included 5,465 types of electrical goods, 1,585 types of building materials, 793 types of fire-fighting equipment, and 2,410 “devices related to hazardous substances.”17 At that point UL was labeling over half a billion items a year.18
Today, our electronics and appliances are so safe that each causes fewer fires than arson does (and arson ranks only third, after cooking and heating equipment).19 The UL name is virtually unknown today, relegated to a tiny insignia on the underside of your desk lamp or coffee maker. But UL, and the researchers who developed its testing procedures and standards, deserve a place in progress history.
Toxicology
Even technology to create health and safety carries its own risks.
Medicine before the 20th century, despite its best intentions and its Hippocratic Oath, was often more likely to harm than help. Some remedies were simply ineffective, such as literal snake oil. Others were outright harmful, such as bloodletting. Still others were slightly effective but with terrible side effects. Mercury, for instance, was used to treat syphilis; in the effort to kill the disease, it nearly killed the patient.20 The burgeoning market for “patent medicines” in the late 1800s was rife with shams and fraud, “cures” that were at best harmless and at worst poisonous.21
In the first half of the 20th century, genuinely effective drugs were developed, most notably antibiotics. But even if a drug is effective against germs, the active ingredient or its preparation can be toxic to humans. Gradually, painfully, we learned that new drugs needed to be tested.
But drug testing was in a primitive stage. Through much of the 1800s, there was more concern about whether food and drugs were adulterated with unwanted substances than about whether the ingredients themselves were hazardous. Some of the earliest toxicity testing, starting in 1902, was done on food additives. By modern standards, it was simultaneously reckless and inadequate. A group of twelve young male volunteers, nicknamed the “Poison Squad,” were fed diets containing high levels of chemical preservatives such as benzoic acid, borax, and formaldehyde.22 Their weight, temperature, and pulse were recorded; their blood and excreta were examined; and they were asked to report on symptoms; still, this method could only produce a rough range of the dose at which harmful effects might occur, and a vague description of symptoms.23 Other tests around the same time were similarly “quite primitive in both design and execution.”24 At the time, the need for vitamins had not yet been discovered, and researchers could not always distinguish between toxicity and nutritional deficiency in test subjects.
The 1938 amendments to the US Food and Drug Act required, for the first time, that pharmaceutical companies test their drugs for safety before selling them. But a law requiring tests does not provide the knowledge of how to do those tests and does not in itself ensure effective testing. There are dozens of methodological questions. How many test subjects should be used, and of what species? How long should the trials be run for? What effects should researchers look for? What aspects of the experiment must be carefully controlled?
The answers to these questions were worked out over the course of two decades by pharmacologists, chemists and pathologists at the FDA. By 1959, they had published a set of guidelines, officially titled “Appraisal of the Safety of Chemicals in Foods, Drugs and Cosmetics,”25 but soon informally known as the “FDA Bible.”26
In testing for toxicity, the guidelines advised as a standard of comparison the dose that affects 50% of a group of test animals, since this is the most reproducible dosage.27 When the effect in question is lethality, this is known as the LD50. The guidelines advised that at least three species should be tested, including at least one nonrodent, to measure species variability.28 Sometimes long-term studies failed from not using enough animals; “a minimum of 15 to 20 rats of each sex and no less than six dogs should be employed per dosage group.”29
How then to translate the LD50 in test animals to humans? If the LD50s in all test species are of the same order of magnitude (proportional to body weight), then a similar lethal dose can be assumed for humans. Otherwise, it can be estimated based on phylogenetic, biochemical or metabolic similarity between humans and the test species. And if we don’t even have that to go on, the safest thing is to assume the lowest LD50 across test species.30 The report advised that the route of administration, physical state of the drug, and solvents used can all affect the results and must be controlled.31 Detailed advice was given on how to do exploratory trials to find the right initial dose, and even how to plot a dose-response curve.32
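The fallback rule described above amounts to a one-line calculation. Here is a minimal sketch in Python; the species and LD50 values are hypothetical, chosen only to illustrate the rule, not taken from the guidelines:

```python
# Hypothetical LD50 measurements, in mg per kg of body weight.
# The species and values are illustrative, not from the FDA guidelines.
ld50_mg_per_kg = {"rat": 320.0, "mouse": 410.0, "dog": 95.0}

def conservative_lethal_dose(ld50_by_species, body_weight_kg=70.0):
    """Estimate a lethal dose for a human of the given weight using the
    guidelines' fallback rule: with no better basis for extrapolation,
    assume the lowest per-weight LD50 observed across test species."""
    lowest = min(ld50_by_species.values())
    return lowest * body_weight_kg

# The dog is the most sensitive species here, so its LD50 governs:
# 95 mg/kg x 70 kg = 6650 mg for a 70 kg adult.
print(conservative_lethal_dose(ld50_mg_per_kg))  # 6650.0
```

Note how conservative the rule is: the estimate is driven entirely by the most sensitive species, which is exactly the point when species variability is large and unexplained.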
But measuring the lethal dose is not enough: “it is essential to note the type, time of onset, severity and duration of all toxic signs or symptoms,” such as gasping, vomiting, or even “abnormal posture.”33 It is also essential to observe the animals for a long enough period: at least two weeks; if the onset of symptoms is delayed for several days, then at least four weeks. If a drug will be taken in multiple doses over time, up to a month, the study should be at least three months; drugs used on a regular basis should be studied for six months in dogs and a year in rats; over-the-counter drugs require a longer study period than prescription drugs.34 In cases where a new drug is chemically similar to one already on the market, researchers could be tempted to take a shortcut and assume their effects are similar too—but the guidelines warned that even minor changes to chemicals can produce major effects, so all new compounds should be treated as completely new drugs.35
These may seem like tedious formalities, but the details of study methodology are crucial. Nature is devious and subtle; to reveal her secrets, the key must fit precisely in the lock. Even minor flaws in procedure could cause critical effects to be missed or safe dose levels to be misestimated.
The researchers who did this work—one history calls out in particular Drs. Arnold J. Lehman, Arthur A. Nelson, and O. Garth Fitzhugh for leading the effort—deserve to not be forgotten. Lehman now has an annual toxicology award named after him;36 Fitzhugh rated an obituary in the Washington Post (which misspelled his name).37 This seems to me small tribute for establishing some of the foundations of food and drug safety.
The catalytic converter
Burning fuel has fouled the air since long before the Industrial Revolution. In England, there are complaints about coal smoke recorded as early as 1257, and royal attempts to ban coal burning starting in 1307.38 In the 1600s, John Evelyn wrote that London suffered from so much coal smoke that “if there be a resemblance of Hell upon Earth, it is in this volcano in a foggy day: This pestilent smoke, which corrodes the very iron, and spoils all the moveables, leaving a soot upon all things that it lights; and so fatally seizing on the lungs of the inhabitants, that the cough and the consumption spare no man.”39 When smoke mixed with fog, the pollution it created deserved a new word: “smog.”
By the 20th century, refined oil and natural gas overtook wood and coal as fuel sources.40 These fuels burn cleaner than coal, so they don’t create smoke, but they still pollute the air.
If oil and gas burned completely, and if they were all that burned, then they would produce only carbon dioxide and water vapor. But no chemical process is 100% efficient, and so the exhaust from oil and gas combustion contains some unburned or partially burned hydrocarbons, such as benzene, ethylene, and formaldehyde, some of which are known as volatile organic compounds (VOCs) because they evaporate easily.41 Also, the oxygen for combustion comes from air, but air has about four times as much nitrogen as oxygen, and so even though nitrogen gas is less reactive, some of it ends up burning as well, creating nitrogen oxides (NOx).42 Exhaust also contains soot, dust, and other particulates. Finally, when they reach the atmosphere, the VOCs and the NOx react under UV radiation from sunlight to create ground-level ozone, another pollutant.43 Many of these pollutants block or scatter light, and NO2 in particular has a reddish-brown color, so together they create a brownish haze. Although it contains neither smoke nor fog, we still call this “smog.”44 And it’s particularly bad in places that have lots of fuel emissions, lots of sunlight, and the right geography and weather patterns to trap the smog in a basin and keep it from dispersing. Places like, for instance, Los Angeles in the mid-1900s.
The smog problem, and awareness of it, grew over the decades. As early as 1923, an article in JAMA warned about the health hazards of auto exhaust.45 By the 1930s, smog was present in the Southern California basin.46 In 1943, the LA Times reported on a “daylight dimout” that caused weeping, sneezing, coughing, and smarting eyes. Residents began to complain to the government, calling the smog “intolerable,” “death-dealing poison,” a “daily horror,” and a “mild hell.”47 Later research would show that smog can impair lung development in children48 and also degrade pigments, rubber, and vegetation.49
Starting in the 1940s, the city and county of LA established multiple commissions, bureaus and departments to deal with air pollution. Before long, the problem was attracting serious attention from scientists and engineers.50 One of them gave a talk in 1954, “World’s Biggest Cleaning Job,”51 that warned about the danger of air pollution, and cited cars as the biggest offender.
The speaker, Eugene Houdry, was not an environmental radical (there were few of those in the 1950s). In fact, in his early career he had developed chemical processes for oil refining. He had helped create the very problem he was warning about. And he planned to clean up pollution from fuels using the very technology that he had previously used to make them: catalysis.
It had been observed in the early 1800s that certain chemicals would cause other chemicals to react without themselves being consumed in the reaction.52 These substances were called catalysts, and Houdry had used them to improve the cracking process in oil refining (in which large, heavy hydrocarbons are broken down into the smaller, more useful ones that make up fuels such as gasoline).53 Now he wanted to use them to break down the key pollutants in vehicle exhaust, as well as in industrial applications and in the home. In fact, his speech was just the introduction to a demonstration. He had started Oxy-Catalyst, Inc. to create air-purifying technology based on catalysts, and he showed off one of their products. It was a small box, no more than six inches in any dimension, containing dozens of ceramic rods between two end plates. The rods were coated with platinum and aluminum, the catalysts, which oxidized combustible gases into harmless CO2 and H2O. The tone Houdry struck was not apologetic, but solutionist: “as an engineer,” he said, “I am confident that we can get rid of this menace if we want to.”54
By the 1960s, we wanted to. Clean air laws passed both in California and at the federal level required drastic emission reductions, sending the auto companies into an intense program of research on the best ways to meet the requirements.55
The simplest measures, and the first implemented, were engine modifications. About 20% of hydrocarbon (HC) emissions leak out of the crankcase, so a system was designed to capture those and vent them back into the engine intake. Another 20% come from the fuel tank and carburetor; these were captured by a charcoal filter and also put back into the intake. But the majority of HC, and all the CO and NOx, came out the tailpipe. Spark timing was adjusted to reduce emissions (although with a penalty to fuel economy). Carburetor air was preheated to evaporate unburned fuel droplets during cold start. Exhaust gases were recirculated into the engine to lower peak combustion temperature, which lowered NOx. All of it helped, but it was not enough to meet the stringent requirements slated by law for the 1975 model year. Something more radical was needed.56
Much effort went into researching fundamentally different power trains, such as electric and steam. GM demonstrated these at a 1969 exhibit called the Progress of Power, including two steam models. Although the steam cars were cleaner and quieter, they were also large, heavy, and expensive. The two models demonstrated achieved fuel economy below about 10 mpg, and took minutes to warm up. They were retired after this demonstration.57
The catalytic converter was ultimately the solution to smog. It had no moving parts, required no power, and did not interfere with the engine.58 But developing it for practical use required solving several technical challenges.
The right catalysts had to be found to break down all of the key pollutants. Many materials were tested. Some were not active enough. Some lost their effectiveness rapidly with use. Some were vulnerable to exhaust contaminants like sulfur dioxide. Some became oxidized themselves and evaporated, or even formed carcinogens that had the potential to flake off. It had been hoped that some base metals such as copper, iron or nickel would work, but in the end only rare metals passed all the tests. Iridium was too scarce, but platinum, palladium, and rhodium were selected.59 In order to supply the rare metals, about a ton of rock had to be mined for each car made; 10,000 miners were added to the workforce of one South African mining company in order to satisfy the contract for GM alone.60
Since rare metals are expensive, they had to be used sparingly. That meant depositing a small amount of them on the surface of an underlying structure, in order to maximize their exposure to exhaust gases. One design used a bed of ceramic pellets, which could be tricky to work with. In one early test, the pellets got sucked back into the engine, ruining it.61 In another design, with a retaining grid, the pellets got stuck in the holes, blocking airflow.62 These problems were fixed and the pellet design was deployed in production vehicles, but ultimately it was replaced with a new invention: a monolithic design with a “honeycomb” pattern of small channels. This required an advanced manufacturing process, but was more compact and had better airflow.63
Converters also had to withstand the heat of the exhaust pipe and the vibration of the road over the decades and many thousands of miles of use; one retrospective remarks that “the automobile catalyst has resulted in the development of materials durable under extreme environments previously thought impossible in conventional catalytic processes.”64 Still, excessive heat would quickly degrade the performance of the converter, and engine operation had to be tuned to avoid this.65
By 1975, catalytic converters were standard equipment on new cars.66 But these early models only lowered emissions of HC and CO, not NOx. The challenge was that the chemical reactions required were fundamentally different: HC and CO levels are lowered by oxidizing reactions; lowering NOx needed a reducing reaction.67 The former works best in the high-oxygen environment provided by a lean air/fuel mixture; the latter in the low-oxygen environment provided by a rich mixture.
To achieve both kinds of reactions and remove all three key pollutants, the air/fuel ratio had to be maintained exactly in the sweet spot where both reducing and oxidizing catalysts operated at high efficiency. Achieving this required an electronic control system with an oxygen sensor in the exhaust stream that fed back into fuel metering. This was the first instance of computer control in automobiles, and helped drive the evolution from the mechanical carburetor to the computer-controlled fuel injector.68
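That feedback loop can be sketched in a few lines. The sketch below is a toy model, not any manufacturer's actual control algorithm; it assumes a narrow-band oxygen sensor, which behaves like a switch (reporting only rich or lean relative to the stoichiometric air/fuel ratio of about 14.7:1 for gasoline), so the controller simply nudges fuel metering back and forth across the sweet spot:

```python
# Stoichiometric air/fuel ratio for gasoline, by mass.
STOICH = 14.7

def o2_sensor(air_fuel_ratio):
    """Toy narrow-band sensor: it cannot read the exact ratio, only
    whether the exhaust has excess oxygen (lean) or excess fuel (rich)."""
    return "lean" if air_fuel_ratio > STOICH else "rich"

def run_loop(ratio, step=0.05, cycles=200):
    """Each cycle, adjust fuel delivery opposite to the sensor reading:
    richen when lean, lean out when rich (bang-bang feedback)."""
    for _ in range(cycles):
        if o2_sensor(ratio) == "lean":
            ratio -= step  # add fuel
        else:
            ratio += step  # cut fuel
    return ratio

# Starting well off the sweet spot, the loop converges and then
# oscillates tightly around stoichiometry -- the narrow window where
# both the oxidizing and reducing catalysts operate efficiently.
print(abs(run_loop(16.0) - STOICH) < 0.1)  # True
```

The constant dithering around the set point is not a defect: real three-way systems exploit it, since the converter stores a little oxygen during the lean half-cycles and releases it during the rich ones.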
These “three-way” converters were deployed starting in 1981. By the early 1980s, emissions of NOx had been reduced 76%, and HC and CO reduced 96%, from the levels of the 1960s.69 Ozone peaked in the late 1960s and has been declining ever since, even as population and vehicle usage continues to rise. One environmental researcher concludes, “this megacity has gone from being one of the most polluted in the world 50 years ago to presently one of the ‘least polluted’ cities of its size,” saving many thousands of lives.70
You could go your whole life without ever thinking of the catalytic converter, but we should be grateful for it with literally every breath we take.
Again, health and safety are a part of progress. Fewer fires, safer drugs, and smog-free skies are progress.
This is true even when, as in these cases, the problems were caused by progress in the first place. Inventing electrical power is progress, and solving the fire risk it creates is further progress. Inventing the automobile is progress, and solving the smog it creates is further progress. As David Deutsch has pointed out, all triumphs are temporary: all solutions create new problems, just as in science, all answers raise new questions. So the fact that a new technology creates a problem doesn’t negate the advance that technology represents.71 The questions are: is the new problem a better one to have? And can the new problem be solved in turn, creating the next, better problem, with its own solution, and so on? This isn’t a failure of progress; it is the nature of progress.
Advances in health and safety require an investment of time, money, and talent. Drugs would get to market faster if we didn’t bother to test them. So there is a tradeoff between the velocity of new drugs and their safety. But that doesn’t mean the development of drug testing protocols in the 1940s and ‘50s slowed progress down. From a techno-humanist perspective, the velocity of new drug introductions alone does not represent progress. We want new drugs, but we also want them to be safe, because both are in the service of healthy lives. We trade off velocity and safety in order to maximize progress in human health.
In the past, solutions to problems often came after the problems were glaring. People were burned by fires, poisoned by drugs, and choked by smog before we solved those problems. This was understandable when scientific knowledge and engineering processes were less advanced, and justifiable in a world where people were already dying at high rates from infectious disease, non-electrical fires, etc. But part of the grand story of progress is that the world overall gets safer over time, and continued progress ought to include continued advances in safety. So today, we are more proactive about safety. This means more effort to predict problems before we observe them, and more testing of new technologies earlier in their lifecycle, before wide deployment.
Consider the ongoing rollout of self-driving cars versus the original introduction of cars themselves. “Early cars had weak brakes, tires that blew out, headlights that glared, plate-glass windows that made them easy to flip … no seat belts and often soft roofs or no roofs at all … there were no drivers’ education requirements, no driving exams, no vision tests, no age limits, and in most places no speed limits.”72 Consequently, the fatality rate per motor vehicle was some twenty times higher in 1913 than today.73 Self-driving cars, in contrast, have received extensive safety testing. The leading self-driving car company, Waymo, has caused zero human fatalities, and already has a better safety record than human drivers: a recent study found a 92% reduction in injuries and 88% in property damage.74
Contrast the first drugs with the development of genetic engineering. As late as the 1930s, drugs were sometimes released without safety testing; in one famous disaster in 1937, over 100 people died from a toxic formulation before FDA field agents tracked down and confiscated the remaining supplies.75 In 1975, when recombinant DNA techniques were still a lab experiment, the researchers involved took a very different approach. Some of them realized early on that the technique could create dangerous pathogens, and they proactively called for a voluntary moratorium on certain experiments they considered hazardous, until they could convene a group of experts to discuss risks and mitigation. They did so at the historic 1975 Asilomar conference, where they worked out a set of risk levels and safety procedures appropriate to each level.76 These recommendations formed the basis for the official NIH guidelines, which remarked on the “exemplary responsibility of the scientific community in dealing publicly with the potential risks” and called the process “a most responsible and responsive one.”77 Similar convenings happened in subsequent years, such as a 2015 conference on the use of CRISPR.78
Artificial intelligence is also receiving a tremendous amount of attention for its safety issues, especially relative to its capabilities so far. New AI models at leading labs such as OpenAI and Anthropic receive extensive testing before their release, both internally and from third-party testing agencies (the Underwriters’ Laboratories of AI). OpenAI’s o1 model, for instance, was tested on whether it would give advice on how to commit crimes, whether it could perform cyberattacks, whether it could help people create chemical, biological, or nuclear threats, whether it could exfiltrate itself to another system, and whether it showed a propensity to “scheme” against the user to pursue a goal that conflicted with its given goals. It was also given to a “red team” of external experts for open-ended discovery of risks.79
Some argue that we’re still not doing enough for safety, that we are running dangerous risks from AI and biotech. Others argue that safety concerns have gone overboard and are now saddling us with needless overhead. Indeed, both criticisms could be true, if we are focused on minor risks while ignoring major ones, or if we are accruing “safety theater” that placates the public without actually improving safety. My point here is simply to draw the historical trend: we are far more proactive about safety now than in the past. I expect this trend to continue—and, if we can make wise tradeoffs between speed, cost, safety, and other aspects of progress, it will be a good thing.
And this is the other reason I have highlighted the role of science and engineering in creating health and safety. Activists and governments can help solve problems after they arise, as when the Clean Air Act required auto manufacturers to eliminate smog. But to prevent problems before they arise, we rely on the researchers who are actually on the frontiers of science and technology.
I’ll close with a message for them: You are the ones who can see both opportunities and problems earlier than anyone else. You are the ones with the deepest expertise to evaluate risks, and the best view of how to mitigate them. You are the ones who actually bring new technology into the world, and it is up to you to judge when it is ready. Safety is part of your sacred responsibility. This responsibility should not be paralyzing—it should activate the spirit of solutionism, in which we ruthlessly identify problems and then ambitiously seek solutions.
For more about The Techno-Humanist Manifesto, including the table of contents, see the announcement. For full citations, see the bibliography.
Harry Chase Brearley, The History of the National Board of Fire Underwriters, 80–81.
History of Fire Underwriters, 81.
Ibid, 82.
Harry Chase Brearley, A Symbol of Safety, 21.
Symbol of Safety, 99.
History of Fire Underwriters, 186.
Symbol of Safety, 105–6.
Ibid, 103.
Ibid, 111.
Ibid, 27–28.
History of Fire Underwriters, 189.
Ibid, 192.
Symbol of Safety, 95; History of Fire Underwriters, 194–5.
History of Fire Underwriters, 192.
Symbol of Safety, 31.
Ibid, 95.
Ibid, 235.
Ibid, 236.
Bruce Hensler, Crucible of Fire, 55.
O’Shea, Two Minutes with Venus, Two Years with Mercury; Tillis and Wallach, The Treatment of Syphilis with Mercury.
Adams, The Great American Fraud.
Oser, “Toxicology Then and Now.”
Oser, “Toxicology Then and Now.”
US Food and Drug Administration, Appraisal of the Safety of Chemicals in Foods, Drugs, and Cosmetics.
Hays, “Arnold J. Lehman.”
Safety of Chemicals, 17.
Ibid, 17.
Ibid, 63.
Ibid, 17–18.
Ibid, 18, 63.
Ibid, 20–22.
Ibid, 21.
Ibid, 62–63.
Ibid, 67.
Stirling, “Arnold J. Lehman.”
Washington Post, “Toxicologist at FDA Dies.”
Brimblecombe, “Attitudes and Responses Towards Air Pollution in Medieval England.”
Evelyn, “A character of England as it was lately presented in a letter to a noble man of France.” I have modernized spelling, capitalization, and punctuation, and removed gratuitous italics that were popular in the era. Note that Evelyn attributes this work not to himself but to a letter he came across; I am following contemporary scholar Gillian Darley in attributing it to Evelyn directly (Darley, “Britain’s First Environmentalist.”)
Cleaner Cars, 10–11, 24.
Ibid, 11.
Ibid, 9.
Ibid, 8.
Henderson and Haggard, “Health Hazard From Automobile Exhaust Gas in City Streets, Garages And Repair Shops.”
Cleaner Cars, xv.
Chowkwanyun, “Two Cheers for Air Pollution Control.”
Ross et al., “The Impact of the Clean Air Act.”
Mom, The Evolution of Automotive Technology, 209.
Cleaner Cars, 2–3.
Houdry, “World’s Biggest Cleaning Job.”
Wozniak, “The History of Catalysis.”
American Chemical Society, “Houdry Process for Catalytic Cracking.”
Houdry, “World’s Biggest Cleaning Job.”
Cleaner Cars, 44, 56–57; Lee et al, “Lessons from Emission Control and Safety Technologies.”
Cleaner Cars, 51–76.
Ibid, 79–83.
Ibid, 84.
Ibid, 96–97.
Ibid, 105. Based on these figures, I calculate: 300k troy ounces of platinum and 120k of palladium equals 420k troy ounces of rare metals; 12 tons of rock per troy ounce implies 5.04M tons of rock; 5 million vehicles implies one ton per vehicle.
Bauner, “Towards a Sustainable Automotive Industry,” 15 (footnote 4).
Mondt, Cleaner Cars, 103.
Ibid, 94–95, 102–107.
Farrauto and Heck, “Catalytic converters: state of the art and perspectives.”
Mondt, Cleaner Cars, 93.
Ibid, 115.
Ibid, 126–7.
Ibid, 129–35.
Ibid, 57.
Parrish et al, “Air Quality Improvement in Los Angeles.”
Deutsch, Beginning of Infinity, 436.
Gordon, Rise and Fall of American Growth, 239; quoting Crossen, Cynthia. (2008). “Unsafe at Any Speed, with Any Driver, on Any Kind of Road,” Wall Street Journal, March 3, p. B1.
National Safety Council, “Car Crash Deaths and Rates.”
National Institutes of Health, “Recombinant DNA Research Guidelines,” 27904.
Baltimore et al, “A prudent path forward for genomic engineering and germline gene modification.”
OpenAI, “OpenAI o1 System Card.”