Major Milestones in Medicine, Drug Development in Recorded History
The past few centuries have witnessed several revolutionary milestones in modern medicine and drug development that have drastically improved human health.
Research into the mechanisms responsible for infectious diseases, together with the development of drugs and vaccines, has led to the eradication of smallpox and brought other once-devastating illnesses, such as polio, to the brink of elimination.
In this era of significant scientific innovation and the promise of improved human health, researchers’ understanding of chemistry and biology is ever-expanding, alongside the identification of novel therapeutic modalities and targets.
Smallpox Vaccine (1798)
While vaccines do not fall under the classification of drugs, they are a type of preventative medicine meant to boost immunity and improve human health.
The smallpox vaccine was the first vaccine developed against a contagious disease. In 1796, a British doctor, Edward Jenner, discovered that a mild cowpox infection provided immunity against the smallpox virus in humans. Until the emergence of the modern smallpox vaccine in the 20th century, vaccination with cowpox material served as the preferred method.
From 1958 to 1977, the World Health Organization (WHO) conducted a program focused on eradicating smallpox globally. By the time the program intensified in 1967, smallpox had already been eliminated in North America (1952) and Europe (1953), but the disease remained widespread in South America, Africa, and Asia. The program ultimately achieved complete global eradication in 1977.
Although smallpox vaccinations are no longer available to the general public, the United States has stockpiled the vaccine in case a future outbreak occurs.
Morphine (1827)
In the early 1800s, morphine was isolated from opium by a German pharmacist’s assistant, Friedrich Wilhelm Adam Serturner. Merck began marketing morphine commercially almost two decades later, and the development of the hypodermic syringe in 1852 fueled an increase in the use of injectable morphine, which became the standard method of reducing pain during and after surgical procedures.
Although morphine is an addictive pain management option, some experts maintain that the benefits of opioids greatly outweigh the disadvantages.
The discovery and widespread use of morphine paved the way for the new generation of over-the-counter and prescription pain management drugs we use today.
Ether (1846)
In 1540, the compound diethyl ether was first synthesized by the German botanist and physician Valerius Cordus. Before its breakthrough use as an anesthetic in 1846, surgical operations were performed with little to no pain relief, causing patients severe emotional distress and suffering.
Vaporized ether suppresses brain activity and relaxes muscles so that invasive operations can take place without the patient experiencing pain. While safer and more effective anesthetics have been developed over the past several decades, ether paved the way for modern anesthetic development.
Aspirin (1899)
In 1897, acetylsalicylic acid was first synthesized in a pure, stable form by Felix Hoffmann, a German chemist at Friedrich Bayer and Co., who sought to alleviate his father’s rheumatism after reading about the medical benefits of salicin reported in The Lancet. Only two years later, Bayer began manufacturing and distributing acetylsalicylic acid in powder form under the brand name Aspirin.
Today, Aspirin’s use extends far beyond pain management; as a blood thinner, it helps prevent heart attacks and strokes. However, its popularity as a pain reliever has declined considerably since the development of other well-known pain relievers such as acetaminophen (1956) and ibuprofen (1962).
Insulin (1923)
People with advanced diabetes cannot produce sufficient amounts of insulin — a hormone necessary for converting sugar to energy. Before insulin was discovered in 1922, individuals dieted to near starvation to ward off symptoms.
Synthetic insulin was the result of the combined efforts of multiple scientists over several years. In 1889, two German researchers found that removing a dog’s pancreas caused the animal to develop diabetes symptoms and die soon after.
Then, in 1910, Sir Edward Albert Sharpey-Schafer, an English physiologist, theorized that a single chemical was missing from the pancreas of patients with diabetes, and named this chemical insulin.
In 1921, a Canadian surgeon named Frederick Grant Banting and his assistant Charles Best devised a method of extracting insulin from a dog’s pancreas. The extract provided an important proof of concept, keeping another dog with severe diabetes alive for 70 days; the animal died only after the insulin ran out.
Building on this success, the team, with the help of J.B. Collip and John Macleod, developed a method of refining insulin from cattle pancreases, winning Banting and Macleod a Nobel Prize in 1923.
That same year, Eli Lilly and Company became the first company to produce insulin commercially, and in the following decade Novo Nordisk Pharmaceuticals, Inc. became the first manufacturer of slower-acting insulin.
The first genetically engineered synthetic “human” insulin was produced using E. coli in 1978, and Eli Lilly and Company brought the first commercially available brand-name biosynthetic human insulin, Humulin, to market in 1982.
Today, insulin is offered in many forms, from regular human insulin identical to what the body produces to ultra-rapid and ultra-long-acting formulas.
Chemotherapy Drugs (1940s)
Chemotherapy grew out of an accidental discovery made in the first half of the 20th century, when mustard gas was used as a weapon in both World Wars. During World War II, researchers discovered that individuals exposed to nitrogen mustard had significantly reduced white blood cell counts, prompting them to wonder whether such agents could also halt the growth of rapidly dividing cancer cells.
In the 1940s, two Yale pharmacologists, Alfred Gilman and Louis Goodman, studied the effects of mustard agents on lymphoma and found that they significantly reduced tumor masses for a few weeks after treatment. Unfortunately, while mustard agents kill cancerous cells, they also substantially damage healthy ones.
As a less toxic alternative, methotrexate, a drug that also suppresses the immune system, became the first drug to cure a rare tumor called choriocarcinoma. Methotrexate is still used today, either alone or with other treatments, as an effective chemotherapy drug for several cancers, including lymphoma, leukemia, and osteosarcoma.
Over the past several decades, cancer research has made significant advances in chemotherapy drug development, which has led to improved patient survival and decreased mortality rates.
Penicillin (1942)
In 1928, Alexander Fleming discovered penicillin, the first antibiotic used in medicine, after realizing that a mold produced a self-defense chemical capable of killing bacteria.
This marked a turning point in history, making it possible to treat various bacterial diseases, such as pneumonia and scarlet fever, by interfering with bacterial cell walls. However, inappropriate use of penicillin has contributed to antibiotic resistance, in which bacteria evolve to evade antibiotic mechanisms.
Chlorpromazine (Thorazine, 1951)
In 1951, French scientist Paul Charpentier synthesized the first antipsychotic drug, chlorpromazine. Just 10 years later, chlorpromazine had been used by around 50 million people under the brand name Thorazine, produced by Smith-Kline & French. The widespread use of this drug enabled the later discovery of anxiety and depression medications.
This discovery represented a turning point in the practice of psychiatry, leading to a psychopharmacological revolution and a better understanding of how impulses are transmitted between neurons in the brain.
Polio Vaccine (1955)
Poliomyelitis, a contagious and life-threatening disease caused by poliovirus, which infects the throat and intestinal tract, was once a leading cause of disability. Since the widespread use of the vaccine created by American medical researcher Jonas Salk in 1953, polio has been eliminated in many parts of the world.
Although polio was eliminated in the US in 1979, the US continues to vaccinate children against the poliovirus since the disease is still prevalent in other countries.
HIV Protease Inhibitors (1990s)
In 1981, the first cases of what would become known as acquired immunodeficiency syndrome (AIDS) were documented in San Francisco and New York City. Four years later, the human immunodeficiency virus (HIV) was declared the causative agent of AIDS.
In 1987, highly selective antagonists of the HIV protease enzyme, known as protease inhibitors, were identified. Protease inhibitors were not the first type of HIV drug, but when combined with other kinds of HIV drugs, they keep virus levels low, preventing the development of AIDS.
Between 1989 and 1994, researchers at Hoffmann-La Roche Inc., Abbott Laboratories, and Merck & Co., Inc. developed several protease inhibitors used to decrease a patient’s viral load. Today, the FDA has approved 26 AIDS drugs, 10 of which are protease inhibitors.
COVID-19 Vaccine (2020)
When the SARS-CoV-2 virus emerged in Wuhan, China, in 2019, unprecedented global measures were necessary to contain the spread of the virus until a vaccine became widely available. On December 11, 2020, the Pfizer-BioNTech vaccine, Comirnaty, became the first COVID-19 vaccine to receive emergency use authorization from the FDA.
Today, billions of COVID-19 vaccine doses have been safely administered by healthcare workers, saving the lives of many.
In the last decade alone, a better understanding of fundamental mechanisms that cause disease has drastically improved the ability to treat, diagnose, and prevent common conditions.
Because millions of lives are at stake, drug and vaccine development remains an arduous process. However, many new technologies and medical interventions offer new options for care and continue to accelerate change across the healthcare industry.