For most of the nineteenth century, drugs were not highly effective, leading Oliver Wendell Holmes, Sr. to famously comment in 1842 that “if all medicines in the world were thrown into the sea, it would be all the better for mankind and all the worse for the fishes”.
During the First World War, Alexis Carrel and Henry Dakin developed the Carrel-Dakin method of treating wounds with irrigation using Dakin's solution, a germicide that helped prevent gangrene.
In the inter-war period, the first antibacterial agents, the sulfa drugs, were developed. The Second World War saw the introduction of widespread and effective antimicrobial therapy with the development and mass production of penicillin, made possible by the pressures of the war and the collaboration of British scientists with the American pharmaceutical industry.
Medicines commonly used by the late 1920s included aspirin, codeine, and morphine for pain; digitalis, nitroglycerin, and quinine for heart disorders; and insulin for diabetes. Other drugs included antitoxins, a few biological vaccines, and a few synthetic drugs. In the 1930s antibiotics emerged: first sulfa drugs, then penicillin and other antibiotics. Drugs increasingly became “the center of medical practice”. In the 1950s other drugs emerged, including corticosteroids for inflammation, rauwolfia alkaloids as tranquilizers and antihypertensives, antihistamines for nasal allergies, xanthines for asthma, and typical antipsychotics for psychosis. As of 2008, thousands of approved drugs had been developed. Increasingly, biotechnology is used to discover biopharmaceuticals.
In the 1950s new psychiatric drugs, notably the antipsychotic chlorpromazine, were designed in laboratories and slowly came into preferred use. Although in some ways accepted as an advance, these drugs also met opposition because of serious adverse effects such as tardive dyskinesia. Patients often opposed psychiatry and refused or stopped taking the drugs when not subject to psychiatric control.
Governments have been heavily involved in the development and sale of drugs. In the U.S., the 1937 Elixir Sulfanilamide disaster led to the 1938 Federal Food, Drug, and Cosmetic Act, which required manufacturers to file new drugs with the Food and Drug Administration. The 1951 Durham-Humphrey Amendment required certain drugs to be sold by prescription. In 1962, a subsequent amendment required that new drugs be tested for efficacy and safety in clinical trials.
Until the 1970s, drug prices were not a major concern for doctors and patients. As more drugs came to be prescribed for chronic illnesses, however, costs became burdensome, and by the 1970s nearly every U.S. state required or encouraged the substitution of generic drugs for higher-priced brand names. Cost concerns also led to Medicare Part D, which took effect in 2006 and provides Medicare coverage for prescription drugs.
As of 2008, the United States is the leader in medical research, including pharmaceutical development. U.S. drug prices are among the highest in the world, and drug innovation is correspondingly high. In 2000, U.S.-based firms developed 29 of the 75 top-selling drugs; firms from the second-largest market, Japan, developed 8, and the United Kingdom contributed 10. France, which imposes price controls, developed 3. Throughout the 1990s, outcomes were similar.