Have you ever wondered what sex during the Black Plague was like? It sounds sick and twisted, but life goes on, as they say, even during an epidemic. People are still people, even when a huge share of them are swept away (at least 75 million died during the epidemic). Engaging in physical relationships during the Black Death (another common name for the plague) was in many ways a lot like it was during the rest of the Middle Ages, but the extreme conditions led to some extreme expressions of sexuality.
“Bedroom activity” during the Black Plague was in some ways pretty wild, with some “revelers” deciding to hump the rest of their seemingly short lives away. But doctors at the time also told people to avoid overexerting themselves in the bedroom because they thought the “bad air” would reach them easier if they did. Read on to learn more about what making love during the Black Plague was really like.
There Were “Gatherings” In Graveyards
The Black Plague was a stressful time to be alive, for obvious reasons. One way to cope, according to historian David Herlihy in The Black Death and the Transformation of the West, was by celebrating life in cemeteries. “Group activities” were one of the ways people did so. At Avignon’s Champfleur cemetery, for example, things got so bad that a papal official had to threaten the “fornicators and adulterers” with excommunication for committing “unseemly acts” on the graves.
Street walkers even took advantage of this desire by hanging out at cemeteries. It wasn’t all fornication: revelers also dared to dance, fight, throw dice, and play other games among the graves as well.
Medical Experts Advised Limited “Physical Activity”
Medical logic at the time said that too much “physical activity” “overheated the body,” according to Joseph Patrick Byrne’s The Black Death, and this allowed “bad air” to enter the body through one’s pores, increasing the chances of catching the plague. Heavy breathing during the act might also lead to inhaling too much of that same “bad air.” A German physician even advised that “all physical exertions and emotions of the mind,” including running, jumping, jealousy, and promiscuity, should be totally avoided at the risk of catching the dreaded Black Plague. What could people do? They could spend their downtime “relating tales and stories and with good music to delight their hearts.”
“Selling Yourself” Was Institutionalized
As the casualty toll of the plague increased, working girls benefited more and more, according to Jeffrey Richards. They began to enjoy a “seller’s market” due to a general lack of labor in the era, leading to “a general improvement of their conditions.”
Leah Lydia Otis wrote that as the Black Plague waned, there was a “quantum leap in the institutionalization of [working girls].” Municipally-owned “parlors” were built, complete with “royal safeguards.” Otis did note, however, that the demand for girls began to wane at that time, as well.
Some Thought Immorality Helped Cause The Plague
Joseph Patrick Byrne wrote that many lawmakers at the time adopted the “Christian belief that sin angered God, who expressed his divine wrath through plague,” and they turned those beliefs into legislation. Many older “moral laws” essentially became just plain laws. This meant sexual immorality was heavily legislated. This “sanitary” legislation targeted sodomy and selling one’s body in particular. In Florence, for example, working girls were “kicked out” of the city in the waning years of the Black Plague. When the industry reemerged in the decades that followed, they were still forbidden to work on the streets. Certain establishments, however, were still allowed to legally operate.
There Was Still An Active Gay Subculture
According to the Encyclopedia of Homosexuality, Volume 2, a “vital urban subculture” of homosexuals existed during the Black Plague. It wasn’t until the Renaissance that “more detailed records of the life and attitudes of homosexual men and women” emerged, but this vital subculture was alive, despite “only fleeting glimpses of it in the literature” of the period.
A few decades before the Black Plague, King Edward II of England was murdered, and centuries-old rumors say he was executed for homosexual activity. (Mel Gibson’s Braveheart received a lot of criticism for its negative portrayal of Edward.) The belief that “sexual immorality” such as “sodomy” helped cause the Black Plague surely was another factor in keeping the subculture hidden during the period.
‘Pseudo-Flagellants’ Performed Acts In Public
So-called “flagellants” during the Black Plague were, according to Professor Mark Damen of Utah State University, “professional self-torturers” who went around whipping themselves for a fee in order to “bring God’s favor upon a community hoping to avert the bubonic plague.” They were literal whipping boys that people employed to buy “remission from sin.” The Church, of course, outlawed this behavior, but that didn’t do much to stop the practice. There was also another group of lesser-known “pseudo-flagellants” that went from town to town performing “physical acts” in public for a fee. The Church outlawed them, as well.
Incidents Of Incest Increased
In Domestic Violence in Medieval Texts, Eve Salisbury, Georgiana Donavin, and Merrall Llewelyn Price wrote that incest in England actually increased during the Black Plague. Why? Simple arithmetic. The plague “destroyed between one-third and one-half” of the population, making exogamy (marrying outside one’s clan or community) “improbable.” The problem, Donavin wrote, wasn’t keeping cousins from marrying, but instead “finding living cousins with whom one might preserve the patrimony.” A lot of noble families died off during the plague years, meaning “intrafamilial marriages greatly increased.”
Fines For Fornication Increased
Richard M. Smith wrote in Land, Kinship and Life-Cycle that the severity of fines for fornication in England increased as the severity of other legal fines generally decreased in the middle of the Black Plague period (1349). Smith interpreted the high fines during this period as a punishment for acts that were seen as morally improper. The courts, essentially, decided to ramp up the punishment for immorality in response to the Black Plague. Blame the fornicators, basically. Smith did note, however, that attitudes about unseemly acts such as fornication, and thus the inclination to increase the fines for such acts, may have been changing even before the plague struck.
The Most Intense Symptoms Suffered By Victims Of 14th-Century Black Plague
Responsible for eliminating anywhere between 30 and 60% of Europe’s population between 1346 and 1353, the bubonic plague was a cause of terror. Gruesome symptoms led to mass panic and widespread fear. What happened to people with the black plague? From oozing boils to decaying skin, gross symptoms of the black plague were a common sight in Europe during the 1300s.
The disease occasionally crops up again today. While not always fatal, bubonic plague symptoms can have lasting consequences for sufferers. Early black plague signs include odd lumps and bumps, but also very common ailments. Many plague sufferers initially experience normal symptoms of a cold or flu, like a fever and chills, only to have their health start deteriorating rapidly. Learning about the black plague will leave you second-guessing waiting to see a doctor the next time you come down with a seemingly mild sickness.
Gangrene Is An Unpleasant Side Effect
Sufferers often end up with gangrene as a result of the infection, which is sometimes treated via drastic measures like amputation. Gangrene causes the skin to turn shades of blue, purple, green, red, or black. Swelling and blisters may also occur, emitting a foul-smelling pus. Skin may also become cold and tender.
One reason amputation is often necessary is that gangrene can lead to septic shock, an often fatal complication.
Bumps And Boils Eventually Start To Ooze
After their initial appearance, the egg-sized lumps found on plague sufferers get worse. The bumps and boils spread throughout the body. Over time, they begin to rupture and emit blood and pus.
One Complication Will Promptly Shut Down Bodily Functions
Disseminated intravascular coagulation is a medical condition sometimes caused by the plague. This is a serious and potentially fatal complication in which blood clots throughout the body and – as a result – internal organs begin to shut down. This was a death sentence in the early days of the plague, and is often still fatal today.
However, the condition is sometimes successfully treated via a medically induced coma.
Bumps The Size Of An Egg Present As The First Symptom
If you’ve contracted the plague, the first symptom is a little hard to miss. You develop what are called “buboes,” which generally appear a week after you’re exposed to the bacteria. These are large bumps, about the size of a chicken egg, found around the groin, armpit, or neck. In addition to being massive, they’re sensitive.
The bumps are also warm to the touch and tender.
Sufferers Engaged In Self-Flagellating Religious Rituals
When you’re suffering intensely for unknown reasons, it’s not uncommon to look to the skies for an answer. In the 1300s, some sufferers concluded the Black Death was a punishment brought on by an angry God for the impurities in their own souls. Their solution? Intense acts of self-flagellation.
Sufferers, especially those in the upper class, would march from town to town. In front of a public audience, they would beat one another and themselves with heavy straps of leather covered in shards of metal. This ritual was repeated three times a day for a 33-and-a-half-day period.
As more and more people began participating, the pope caught wind. Concerned the self-flagellants could usurp his power, he condemned the practice. It fizzled out shortly thereafter.
Mutant Bacteria Was Especially Harmful
Plague bacteria spread rapidly throughout the body, shutting down nearly every vital function. While complications like gangrene and dehydration often led to the end for sufferers, many people were more or less poisoned.
This is due to Yersinia pestis, the mutant bacterium that causes the plague. The bacterium is particularly virulent: although it is unable to survive outside a host, it can penetrate and hide in a host’s cells. In order to survive, it multiplies quickly and disables a sufferer’s immune system. Yersinia pestis bacteria then clot underneath the skin, in hopes of being picked up by a passing flea.
Even Survivors Have Lasting Side Effects From The Vomiting
Vomiting is par for the course for a wide variety of common illnesses, but this is no minor ailment when it’s related to the black plague. Depending on the duration of the infection, the consequences of months of acid reflux and vomiting can last for years.
Take the case of Katie Simon, a woman who caught the plague on a backpacking trip shortly after college in the early 2000s. Her stomach was so afflicted that she had to stick to a strict diet composed mostly of bland foods free of gluten, dairy, alcohol, caffeine, and processed sweeteners. Her upper digestive system was completely inflamed, and she had ulcers covering her stomach and esophagus. Recovery took two and a half years.
Sufferers Bleed Pretty Much Everywhere
In the disease’s later stages, bleeding is common. Septicemic plague occurs when plague bacteria begin multiplying in a sufferer’s body. They may bleed from the nose, mouth, rectum, or even under the skin.
Extremities Blacken As Bacteria Multiply
After an initial infection, bacteria begin to multiply in a sufferer’s bloodstream. This can cause a number of side effects associated with more common illnesses, such as fever, chills, and diarrhea. However, one symptom distinctive of the black plague is a change in body color.
Sufferers often experienced the blackening of their fingers, toes, and nose.
Overall Skin Color Sometimes Changes
Blackening of the extremities is a common side effect, but some sufferers experience complete changes in skin color.
Take the case of Paul Gaylord, an Oregon man who contracted the plague from his cat in 2012. After the initial fever, his skin began to turn grey throughout his body. This caused his wife to rush him to the hospital, where he luckily received life-saving treatment.
The Initial Symptoms Mimic Those Of Normal Colds And Flus
One of the scariest things about the black plague is that initial symptoms aren’t really that different from the run-of-the-mill flu or cold. You may experience fever, shaking, general weakness, and increased sweating.
Next time you experience these symptoms, especially if you’ve been near rats or fleas recently, you might want to see a doctor just in case.
How Humanity’s Response To Epidemic Disease Has Stayed The Same Throughout History
For millennia, societies have faced destructive epidemics that caused a major loss of life. Although the germs that cause epidemics change over time, and people may like to think humanity has changed too, responses to epidemics show startling similarities across centuries. During Justinian’s Plague, 40% of Constantinople’s population perished. When the plague of Athens struck, it took out up to 30% of the city. And when the Black Death swept Europe in the 14th century, it ended the lives of millions, with population losses of over 50% in some areas. Even epidemics with lower mortality rates cause major disruptions, like the Spanish flu of 1918. But regardless of era or disease, people have shown many of the same responses: flee from the source of the danger, look for a scapegoat, propagate pseudoscientific cures, and question scientific authority.
In a time of crisis all bets are off; we fall into the most primal fears and patterns. And though people will do what they can to save themselves, not all behaviors are bad. Some may even better prepare humanity for future epidemics and crises.
People Try To Find A Scapegoat, Leading To Instances Of Extreme Prejudice And Racism
When epidemics spread, communities often look for a scapegoat. During the Black Death, Europeans blamed Jewish communities for spreading the disease. One chronicle reported, “Death went from one end of the earth to the other, on that side and this side of the sea . . . In some lands everyone died so that no one was left.”
In Strasbourg, the plague took out thousands, and Christians blamed the city’s Jewish population. “On Saturday – that was St. Valentine’s Day – they burnt the Jews on a wooden platform,” the chronicle recorded. “There were about 2,000 people.” The same happened in other cities. “In some towns they burnt the Jews after a trial, in others, without a trial.”
Just as the Jews became scapegoats during the plague, racism against Chinese people increased during the 2003 SARS epidemic and the 2020 coronavirus epidemic because both diseases originated in China. In New York City during the 2020 epidemic, tourists even avoided Chinese restaurants for fear of catching the virus, despite there being no evidence that people of East Asian origin are more likely to spread the disease.
These are not the only cases in modern history, however, and people of East Asian origin are not the only targets of prejudice. During the Ebola outbreak of 2015, people from West Africa were targets of xenophobia, while the LGBTQ+ community was stigmatized during the AIDS crisis of the 1980s.
People Peddle Pseudoscientific Cures And Spread Disinformation
In 1900, the bubonic plague reached the shores of America. An outbreak in San Francisco threatened to spread the disease. Doctors warned that bacteria spread the plague, but government officials undermined their efforts by questioning the science.
In California, Governor Henry Gage was skeptical about germ theory. He couldn’t personally see the bacteria that caused plague, and thus denied its existence. As the plague swept through Chinatown, white San Franciscans claimed they were immune from the disease, blaming its spread on poor hygiene. But they, too, soon faced a plague epidemic, regardless of their scientific skepticism and denial.
Similarly, during Justinian’s Plague in the sixth century, people turned to cures with no basis in science. Some claimed taking cold baths protected people from plague, while others sold magic amulets.
Similar pseudoscientific claims continue into the 21st century. During the 2015 Zika virus epidemic, a rash of conspiracy theories made their way through social media platforms. One such claim blamed microcephaly, a condition linked to the virus in which babies’ heads and brains develop abnormally, on MMR and DTaP vaccines, supposedly part of a scheme by pharmaceutical companies to profit from Zika vaccines.
Pseudoscientific claims can harm both individual and public health. In the case of the Zika virus, doctors and scholars working to limit the disease’s reach warned that the disinformation threatened the legitimacy of healthcare institutions, potentially exposing more people to the disease as patients refused to trust healthcare professionals. In other cases, pseudoscientific claims and cures, like drinking bleach, have caused more direct health issues.
Due To Extreme Population Loss, Affected States Experience A Loss In Military And Political Strength
In the sixth century, the Byzantine Empire experienced a horrific plague that caused a massive drop in population. Under the Emperor Justinian, the Byzantines had expanded their borders and fought to regain parts of the western Roman Empire. But Justinian’s Plague threatened to destroy the empire.
Thanks to population loss, the Byzantines could no longer defend their overseas territories. In addition to the military losses, the Byzantines also endured economic and administrative problems that decreased the empire’s political power. Though the empire survived the plague, it never again achieved the reach it had under Justinian.
Justinian’s Plague was not the only ancient case. In the fifth century BCE, the Greek city-states of Athens and Sparta fought a nearly 30-year war. During the second year of the war, while Sparta laid siege to Athens, the defending city was swept by an unknown epidemic disease, which wiped out a third of its entire population, including the state’s leader, Pericles.
Some historians and scholars have attributed Athens’s ultimate defeat, and the eventual decline of Ancient Greece’s cultural output, to this unexpected loss of life. Others would not go that far, citing Athens’s eventual recovery and later victories, but they do agree the city-state lost prestige and power due to the political aftermath of the disease.
Similarly, historians have attributed Germany’s loss at the end of WWI to the emergence of the Spanish Flu epidemic in the summer of 1918. Beginning the year with a military advantage against the Triple Entente and looking to end the war before American soldiers could be deployed, the Germans launched an offensive in hopes of breaking through enemy lines and reaching Paris. They, however, lost about half a million men due to the virus, making it impossible for the army to make that final charge.
Because They Feel A Personal Responsibility To Help, Healthcare Personnel Experience The Worst Of The Disease
Doctors, nurses, and other healthcare workers often experience high mortality rates during epidemics. The famous plague doctor costume, developed in the 17th century as the bubonic plague continued to ravage Europe, attempted to protect doctors from miasmas, or disease-transmitting clouds. In modern epidemics, doctors wear personal protective equipment, including masks and gloves.
Due to the close proximity and extended time healthcare personnel spend around the infected, high mortality rates among healthcare workers remain the norm despite technological advances.
In the fifth century BCE, Athens experienced a horrific plague that took out one in three Athenians in a single summer. As the Peloponnesian War raged, Athenians battled an unknown enemy deadlier than combat. Doctors experienced an even higher mortality rate during the plague of Athens. That year, according to Thucydides, “Neither were the physicians at first of any service, ignorant as they were of the proper way to treat it, but they died themselves the most thickly, as they visited the sick most often.”
During the 2015 Ebola epidemic in West Africa, doctors and nurses also perished at a much greater rate than the rest of the population. The World Health Organization attributed the high mortality to problems common in epidemics, including shortages of both medical supplies and staff, improper use of equipment, and longer-than-recommended exposure to the disease, most often due to a sense of duty to help.
Societies Affected By Major Epidemics Have Resorted To Mass Burial
When the Black Death swept across Europe, millions perished in its wake. And the disease’s mortality rate left societies with a troubling logistical problem: How should they dispose of the bodies piling up in the streets?
In Lincolnshire, England, Black Death victims were buried in a mass grave. Discovered in 2013, the grave contains the bodies of nearly 50 people. The community apparently left the grave open and filled it as people perished. According to a journal article in Antiquity, the grave was “filled over the course of several days or weeks.”
Large cities experienced an even more critical problem. According to a 14th century Florentine chronicle, bodies were thrown into deep trenches every night. “The next morning, if there were many [bodies] in the trench, they covered them over with dirt. And then more bodies were put on top of them, with a little more dirt over those; they put layer on layer just like one puts layers of cheese in a lasagna.”
Similar action was taken during the 1918 Spanish Flu epidemic. In 2015, a Pennsylvania Department of Transportation contractor looking to widen Route 61 dug into a 2.25-acre field in Schuylkill County, 100 miles northwest of Philadelphia. Many of the region’s residents who perished from the Spanish Flu were found to be buried without caskets. Historians and archaeologists believe the mortality rate was so high that grave diggers and casket makers could not keep up with the demand, forcing them to bury the victims unsystematically in a large pit.
A Certain Contingent Of People Resort To Hedonism When Death Seems Inevitable
In an era before people understood bacteria and viruses, epidemics brought even more confusion. And when succumbing to the disease seemed inevitable, some people cast off social restrictions and turned to hedonism.
Boccaccio described the response to the Black Death in Florence. Some shunned the sick and avoided any contact with them. “There were those who thought that to live temperately and avoid all excess would count for much,” Boccaccio related. These people secluded themselves from society while “eating and drinking moderately.”
Others took the opposite tactic. They believed “that to drink freely, frequent places of public resort, and take their pleasure with song and revel, sparing to satisfy no appetite, and to laugh and mock at no event, was the sovereign remedy for so great an evil.”
These hedonists traveled from tavern to tavern, “drinking with an entire disregard of rule or measure.”
Boccaccio reported that many of the hedonists perished – but so did those who chose to live moderately.
Similar behavior transpired during the Plague of Athens in 430 BCE. According to Thucydides:
Men now coolly ventured on what they had formerly done in a corner, and not just as they pleased, seeing the rapid transitions produced by persons in prosperity suddenly dying and those who before had nothing succeeding to their property.
So they resolved to spend quickly and enjoy themselves, regarding their lives and riches as alike things of a day. Perseverance in what men called honor was popular with none, it was so uncertain whether they would be spared to attain the object; but it was settled that present enjoyment, and all that contributed to it, was both honorable and useful. Fear of gods or law of man there was none to restrain them.
In The Wake Of Epidemics, Artists Focus On Macabre Themes
Throughout history, epidemics shape culture as well as society. After living through an epidemic, artists in many eras turn to macabre themes.
During the Black Death, artists emphasized the end of life itself. Historian Frank M. Snowden says that the plague “had a transformative effect on the iconography of European art.”
Artists drew the “dance of death,” showing skeletons reveling, and emphasized mortality by including hourglasses in their work. Art reminded viewers that there was no escape from the inevitable.
People Change Their Personal Religious Beliefs And Practices
A dangerous epidemic can either shake people’s faith or reinforce it. During the Black Death, Europeans worried the disease had been sent by God as a punishment for their sins. Although many flocked to church for protection, faith didn’t protect Europeans from the plague.
In the 14th century, the plague swept through religious communities, wiping out entire monasteries and convents. The pope himself withdrew from society instead of stepping up as a religious leader. Zealots like the flagellants swept across the continent, atoning for sin by publicly whipping themselves.
During modern epidemics, religious communities, backed by knowledge of germ theory, modify traditions to stop the spread of disease. Other believers simply stop attending services to avoid exposure to large crowds.
Societies Isolate And Quarantine Those Infected
For centuries, people separated the sick from the healthy. The Old Testament even lists rules for isolating lepers.
But quarantines in the modern sense began in the 14th century during the bubonic plague. Venice, a trade hub in the Mediterranean, established the first quarantine by requiring arriving ships to wait 40 days before entering the city. Our word “quarantine” comes from the Italian word “quaranta,” which means 40.
The idea quickly spread. In the second major plague outbreak in 1374, Milan adopted a quasi-quarantine by sending plague victims to a field outside the city, where they remained until they recovered or perished. The coastal town of Ragusa on the Dalmatian coast created its own quarantine station, an idea that quickly caught on. Several islands in the Venetian lagoon were used as quarantine stations for centuries.
Quarantines continued into the 20th century. During the Spanish flu, one city protected itself from influenza by shutting down the roads and quarantining anyone who arrived by train. In that city, Gunnison, Colorado, no one perished from influenza.
People Practice Healthier Behaviors
Educating the public about the spread of disease has been an important tool in fighting epidemics. In the early 20th century, public health campaigns warned people not to carelessly spit, cough, or sneeze, since these helped spread infectious diseases like influenza.
In the 1980s, public education was an important tool in fighting the AIDS epidemic. To slow the spread of the disease, these campaigns focused on changing at-risk behavior, warning against sharing needles or having unprotected sex. After peaking in the early 1990s, the number of HIV cases in the US dropped, in large part thanks to public education.
People Migrate En Masse To Escape The Disease
During epidemics, large groups of people often migrate to avoid disease. Boccaccio’s Decameron tells the story of Italians who fled Florence during the plague.
In 14th-century Egypt, the bubonic plague destroyed villages and forced many from rural areas to migrate into Cairo. In one district in Upper Egypt, a survey counted 6,000 workers in the fields before the plague. Between deaths and migration, only 116 laborers remained after the plague.
Similarly, in January 2020, about 5 million people left the city of Wuhan, China, before the lockdown. According to Mayor Zhou Xianwang, millions fled days before the quarantine went into place. However, migrants fleeing an epidemic potentially spread the disease, undoing the impact of quarantines.
While Bad For The Economy In The Short Term, Epidemics Tend To Bring Positive Long-Term Economic Changes
In the short term, epidemics bring massive economic disruptions. During Justinian’s Plague, trade nearly came to a halt and agricultural prices soared thanks to fewer farmers. In Constantinople, where up to 40% of the population perished, the Byzantine Empire faced an economic crisis when tax revenues collapsed.
During the Black Death, the high mortality rate meant fewer laborers in the fields. In the short term, that meant higher prices. The bubonic plague, however, brought some unexpected long-term improvements to Europe’s economy. Thanks to a labor shortage, wages rose in the decades after the plague. In England, wages roughly doubled. In Suffolk, laborers made 67% more for reaping after the Black Death.
Increased wages for the laboring classes meant higher spending and a greater standard of living. Agricultural workers could suddenly afford “luxury” items like butter. In cities, people benefited from more disposable income. In effect, the Black Death redistributed wealth from the aristocrats to the peasants and urban workers, helping drive Europe’s economic engine for centuries.
The Bubonic Plague Ravaged San Francisco In The 1900s—And The Government Tried A Massive Cover Up
The San Francisco bubonic plague outbreak was one of the biggest health crises — and controversies — of the 20th-century United States. The plague was a disease most thought had disappeared with the medieval period, but when it resurfaced, it became one of the worst epidemics in US history. The state of California was forced to contend with an illness not yet entirely understood, as government officials and the press actively covered it up. Not only did they have to pioneer treatments for the bubonic plague, but the entire problem was also wrapped up in anti-immigrant and specifically anti-Chinese racism. Some people even referred to it as the San Francisco Chinatown plague, implying a problem specific to the Chinese residents of the city.
But that wasn’t the case. Blaming a select group of people meant that the cause itself — flea-infested rats carrying the same strain of plague causing deaths in China — went unaddressed. The San Francisco plague death toll rose, taking the lives of many of the city’s underprivileged residents. It wasn’t until a second plague swept the city, this time primarily affecting San Francisco’s white population and spreading to further areas of the US, that the root cause was identified and put to rest. While modern-day humanity still deals with tragedy incurred by the flu, the plague of the early 1900s incited racist tension and social horror simply because of the ignorance of its causes and the inexplicable death it left in its wake.
The San Francisco Plague Was The First Major Plague Outbreak In The Continental US
Though the San Francisco plague was hardly the first major illness that residents of the continental US had to address, it was the first major outbreak of the plague. The illness — named the Black Plague or the Black Death when it ravaged Europe in the 1300s and caused some 50 million deaths — made a comeback in the 1800s, seriously affecting China and much of East Asia.
Because of global trade and increasing numbers of people emigrating and immigrating worldwide, it was only a matter of time before epidemics began to spread. However, people did not yet understand exactly how the disease was transmitted — many believed it spread through open wounds, food, or the “miasma” theory, which claimed that diseases like the plague spread through “bad air.” Because of these conflicting, erroneous theories, the world wasn’t prepared to respond to a wide-scale epidemic.
The First San Francisco Plague Victim Was A Chinese Immigrant
The first person to die in the San Francisco plague was Wong Chut King, a lumber salesman and Chinese immigrant who was found unconscious in the flophouse where he lived after suffering an intense fever. In a morbidly preemptive move, he was brought to a nearby coffin shop where he died.
Examination of his body revealed swollen lymph nodes — called buboes, hence the disease’s name — consistent with a plague infection, as well as an insect bite. Flea transmission was not yet a popular theory to explain plague infection, so while it was noted, the report did nothing to quell the subsequently rampant xenophobic explanations for the disease. When microscopic investigation revealed plague bacteria in Wong Chut King’s blood, Chinatown was quarantined to stop the spread, though the fleas and rats that carried the disease were not hindered by arbitrary barriers.
The Outbreak Centered On San Francisco’s Chinatown District
San Francisco's plague outbreak was concentrated in the Chinatown district, just a few blocks from what is now the Port of San Francisco. Because Chinatown was overcrowded, poorly sanitized, and home to many people living in poverty, those conditions were blamed for the outbreak rather than the actual cause: flea-ridden rats brought from plague-stricken China on the ships that came to the harbor. Instead of treating the cause, the city quarantined its Chinese residents.
But quarantines don’t stop rats, and the disease continued to spread outside of the quarantined zones. Because conditions were poor, and racism was rampant, the quarantined residents didn’t get proper medical treatment. Thus, the concentration of infected people, fleas, and rats could grow, leading to even more infections.
Rats From China Probably Carried The Plague To America On Ships
In the late 1800s and early 1900s, scientists and medical professionals weren’t yet sure what caused the plague. The miasma theory, which suggested the disease was spread through “bad air,” was popular, as were suggestions that the plague might travel through contaminated food or open wounds. The bubonic plague is actually transmitted via flea bite, with the carrier fleas often living on rats.
China was dealing with a plague outbreak of its own beginning in the 1850s, one that eventually made its way to Hong Kong in the 1890s. Since Chinese immigrants and imports commonly made the journey to San Francisco, it was only a matter of time before the plague reached American soil.
Racism Was Undoubtedly A Factor In The Plague’s Outbreak
The plague itself was a problem, but anti-immigrant sentiment and racism against Chinese people exacerbated the issue. Around this same time, the Chinese Exclusion Act, which prevented Chinese people who were not merchants from immigrating to the US, was extended. Rampant, unfounded anti-Chinese sentiment promoted racist policy, and those state-sanctioned rules in turn validated racist viewpoints.
In the initial quarantine, white people in Chinatown were told to leave while Chinese residents were forced to stay. Miasma theory, which posited the disease was spread via contact with contaminated air, was blamed for the plague in Chinatown.
Rather than the poor sanitation being a symptom of the area’s poverty, it was instead said to be evidence that the Chinese people themselves were the problem. Some went so far as to claim that a rice-based diet made them susceptible to the plague.
California's Governor Denied The Outbreak Out Of Fear It Would Hurt The City's Reputation
One of the biggest stumbling blocks to stopping the plague's initial four-year hold on San Francisco was the government itself. California's governor, Henry Gage, actively denied the city had a problem, fearing it would hurt tourism and trade. Worse, he didn't just deny the problem; he actively thwarted efforts to stop it. Gage claimed funds were being diverted to stop a plague that didn't exist. He also suggested Dr. Joseph J. Kinyoun, who led the quarantine effort, had fabricated or caused the plague himself by injecting Chinese residents with the disease.
Though there was little help available for victims, Gage’s denial — as well as his active attempts to stoke tension between Kinyoun’s efforts and the Chinese people whose civil rights were being violated — ensured the plague continued to ravage the city’s ostracized citizens.
The City Ran A Defamation Campaign Against The Officer Who Discovered The Outbreak
Dr. Joseph James Kinyoun was the leader of the plague eradication movement, but, due to both concentrated misinformation efforts and a lack of scientific information, his plans to stop the plague were thwarted. Kinyoun was unsure how to curtail the plague's spread, and he ineffectively quarantined Chinatown in the hopes that it would keep the infection isolated. When test animals were infected with the disease and didn't immediately die, the quarantine was lifted, and the city government, particularly Governor Henry Gage, seized on it as proof that Kinyoun had no idea what he was doing. Kinyoun warned that shipments of goods could spread the illness out of San Francisco and around the country, leading other states to refuse goods from California.
Gage responded to the lost profits by claiming Kinyoun himself had created the plague by injecting plague bacteria into Chinese corpses. He also stoked tensions by encouraging Chinese residents, already mistreated, to fight back against the health effort. Combined with Kinyoun's reportedly uptight demeanor, which did not mesh well with the Chinese population he was meant to be helping, his efforts were undercut, and he was eventually transferred. The plague raged on until Dr. Rupert Blue replaced Kinyoun and shifted the focus from quarantine to pest eradication.
Governor Gage’s Denial Helped Seal His Loss In The Elections
Governor Gage's actions were not without consequences. Anti-Chinese sentiment, along with the tensions that arose from the quarantine and Gage's efforts to create a divide between health officials and the Chinese residents most threatened by the outbreak, meant that people hid the bodies of deceased plague victims, furthering the infectious spread. People continued to get sick and die, and not just in Chinatown.
Though the disease was concentrated in the slums, it wasn't thwarted by barbed-wire quarantines. As it became increasingly clear that the illness was real, the citizens of San Francisco realized that Gage's denial and his shifting of the blame to Dr. Joseph Kinyoun were actively hurting people. The state's Republican party refused to renominate him, and Gage left office in 1903. He continued to blame Kinyoun for the barring of California goods in other states and the subsequent economic hardship.
The next governor, George Pardee, shared Gage's reluctance to discuss the outbreak publicly but immediately took ownership of the situation, working quietly with health officials to fund medical attention, research, and eradication of the plague.
Some Of The City’s Treatments Actually Spread The Disease Further
Because the plague was not well understood, many of the treatments San Francisco health officials initially used to combat the illness actually made things worse. The quarantine, the first line of defense when the plague's initial victim was found, concentrated the Chinese-American population in an area with poor sanitation and did nothing about the actual cause of the plague: the rats that were drawn there precisely because of that poor sanitation. The city also used carbolic acid in an attempt to rid the air of the alleged miasma, which drove rats out of sewers and into the streets, carrying their plague-riddled fleas with them.
In addition, because relations between the Chinese people of San Francisco and the health officials were so poor, people in Chinatown started hiding the bodies of the deceased. Without proper disposal, the health crisis worsened. Nobody knew that the disease was mostly transmitted through flea bites, so nobody was doing anything to stop the actual vector. All of these factors combined to make the plague even worse.
Another Outbreak Occurred After The 1906 Earthquake
The first plague outbreak lasted from 1900 to 1904, but it wasn't the last to affect San Francisco in the early 1900s. A second outbreak arose in the aftermath of the enormous 1906 San Francisco earthquake, which killed some 3,000 people and displaced 250,000 more. But people weren't the only ones displaced.
As humans fled their damaged homes, so did infected rats. The chaos of the post-earthquake city and the concentration of people in refugee camps allowed the illness to spread once again. This outbreak was even stronger than the first, but it was contained more quickly thanks to scientific advancements made in the intervening years.
The Second Outbreak Was Dealt With Far More Efficiently
While the second outbreak of plague hit San Francisco only a few years after the first, it was handled much more quickly. Scientific advancements meant people better understood the plague's transmission, and instead of quarantining infected people, Dr. Joseph Kinyoun's replacement, Dr. Rupert Blue, targeted the rats. Financial bounties fueled great rat-catching efforts, which curbed the spread and allowed scientists to test and destroy infected rats.
Not only was the threat identified early the plague's second time around, but the city government also could no longer scapegoat the city's Chinese residents: nearly all the victims of the second outbreak were white. Though anti-Chinese sentiment was still prominent, the blame couldn't be shifted; it wasn't miasma, diet, or any other xenophobic explanation, which meant the outbreak had to be dealt with scientifically and medically.
119 People Died In The First Outbreak
119 people died during the first San Francisco plague outbreak, many of them in the concentrated, quarantined region of Chinatown. Without access to health professionals or pest control, infected victims spread the plague inadvertently but rapidly. Although the plague is certainly a deadly disease, the city's failure to isolate the actual cause and treat infected people ensured that it spread farther than it should have.
Though it's not the deadliest outbreak in US history (the smallpox introduced by European settlers, for example, devastated indigenous populations across what is now North America), the San Francisco plague is one whose harm could have been prevented or minimized if those in authority had taken action.
Instead, it was allowed to fester in the city’s most impoverished communities, sowing further anti-Chinese sentiment.
Honolulu’s Chinatown Burned To The Ground When Authorities Attempted To Eradicate The Plague
The plague resurfaced in China on the backs of rats in the 1850s and wound up in Hong Kong. Chinese immigrants and imports often passed this way en route to San Francisco via port stopovers in Hawaii. Following one ship's stop in Hawaii in 1899, a case of the plague broke out in Honolulu's Chinatown, spreading quickly to four more people. In an attempt to contain the plague and eradicate the conditions fostering its rapid spread, the Hawaii Board of Health isolated the victims and quarantined 14 blocks of the area, complete with military guards. According to historical record, "To clear contaminated areas, the Board set 41 controlled fires, cleaned and disinfected buildings, burned garbage, filled old cesspools and dug new ones."
However, when this failed to quell the problem and new cases of the plague arose, the Board set another fire, which quickly spun out of control and burned all of Honolulu's Chinatown to the ground. The problematic ships nonetheless continued to San Francisco where, because people aboard showed no plague symptoms, they were allowed to dock. Infected rats disembarked along with the passengers, carrying the disease straight into the port city.
The Mysterious ‘Sleeping Sickness’ That Plagued New York In The 1920s
Medical science has come a long way in the last hundred years, but that doesn't mean every medical mystery has been solved. The cause of the mysterious sleeping sickness that struck New York in the 1920s, Encephalitis lethargica, remains unidentified to this day.
Originally called “the sleeping sickness” because the first few cases involved active people spontaneously falling asleep, it had a wide variety of symptoms and presentations. Some neurologists and pathologists believed it was an unusual manifestation of a concurrent flu epidemic, while others believed it was completely unrelated. It was also theorized that the mysterious illness was related to the polio virus, but nothing conclusive has been proven. This is despite the fact that the disease still pops up in isolated cases around the world.
From its first reported cases in 1915 to its abatement in the 1930s, Encephalitis lethargica is estimated to have infected half a million people in Europe alone. Those who survived often had crippling side effects, with some remaining borderline catatonic for the rest of their lives.
The sleeping sickness of the 1920s was never solved, but it has drawn the attention of scientists for years, including Oliver Sacks, whose work on the disease was adapted into the film Awakenings starring Robert De Niro and Robin Williams.
During The 1920s And ’30s, The ‘Sleeping Sickness’ Perplexed Doctors Around The World
In 1917, as WWI brought mass destruction like the world had never seen, two epidemics began tearing through the shell-shocked world. The first, which would become known as the Spanish Flu of 1918, remains one of the worst pandemics in human history, wiping out an estimated 50 million people and affecting up to half a billion. While this crisis understandably took precedence, it was accompanied by a lesser-known but far more perplexing virus: the sleeping sickness.
The sleeping sickness is believed to have originated in Romania in 1915, but WWI disguised its true impact in Europe. It became more noticeable once it reached New York, and doctors across Europe and America scrambled to identify the disease. There was plenty of confusion, but no clear answers.
It Was First Described By The Prolific Austrian Neurologist Constantin von Economo
Due to the varied presentations of the disease, the overwhelming demands of WWI, and the appearance of other epidemics around the same time, many doctors treated individual cases of Encephalitis lethargica without realizing they were dealing with a wholly new illness. It took the work of Austrian neurologist Constantin von Economo to fully isolate and categorize the disease.
Economo was a wealthy aristocrat and a Renaissance man. He was the first Austrian man to hold the equivalent of a pilot’s license, and he was trained as an engineer before he moved on to psychiatry.
Economo gave the disease its name, based on what he determined to be its principal manifestation: lethargy and catatonia. In a series of monographs that are studied to this day, he argued his case on the nature of the disease from its categorization as encephalitis (an illness resulting from inflammation of the brain) to its varied pathologies.
Some Of The Afflicted Reported Feeling No Discomfort In Their Sleep
The sleeping sickness prompted, among other things, an examination of the nature of sleep and the difference between sleep and catatonia. However, there was a wide array of experiences reported, some of which were quite pleasant. Take this account, written by Eleanore Carey, who suffered from the illness in 1923:
After two months of illness I was in little pain, in fact… It was so heavenly just to be allowed to sleep, but these people around me seemed determined to prevent my being comfortable! When the idea finally crept through my sleeping brain that I must waken, it seemed to be a physical impossibility. I wanted to be obliging, but I just could not.
Other victims reported dreams and vivid hallucinations. Often, it was possible to wake the patient, but only for a few moments before they succumbed to sleep again.
Some Victims Perished Within Days, While Others Slowly Recovered
During the course of the epidemic, most doctors kept and compared rigorous notes. This allows us to examine a wide variety of case histories, although it is difficult to derive clarity from them because of the wide range of symptoms. However, one thing that does emerge is the unpredictable nature of the sleeping sickness.
One case study details a woman who suffered the sleeping sickness in 1917. She came to the clinic exhausted, and then slid further into somnolence. These symptoms were accompanied by a fever and the paralysis of her right arm. This woman seemed lucky, as her condition slowly improved, and two months later, she was discharged from the hospital with no signs of fever or paralysis. Unfortunately, she passed a month afterward due to pneumonia.
However, not everyone was lucky enough to stage a partial recovery. One young boy was brought to a clinic on April 20 already in a comatose state. He passed on April 28. Both of these cases were common, and there was no known indicator of who might survive and who might not.
Many Who ‘Recovered’ Developed Disabilities Later In Life
During its acute phase, the disease caused somnolence, lethargy, paralysis, and fever, and was sometimes fatal. Some patients, however, made a full recovery, often without any treatment. While this must have come as a relief, the disease was not quite finished with them.
After recovery, many of these patients developed a form of Parkinson's disease, a progressive nervous disorder that often causes the loss of various forms of muscular control. Parkinson's can include a wide variety of symptoms, and one of the most extreme forms was often seen in survivors of the sleeping sickness: akinesia.
Essentially, this means total body paralysis. Some of these patients remained in a paralytic coma for many years. Robert De Niro’s character in Awakenings, Leonard Lowe, is exactly this sort of patient.
Some Survivors Remained In A Sleep-Like State For Years
At various points during a case of Encephalitis lethargica, it was possible for the patient to fall into a deep, akinetic coma. Most often, this happened to people who thought they had completely survived the disease, only to develop a worsening case of Parkinson's years later that culminated in the coma.
Because the cause of the disease was unknown, these comas were thought to be irreversible, and those who suffered them were largely forgotten, as long-term coma patients often are. However, when Oliver Sacks began treating them with L-DOPA, some of them were able to interact with the world for the first time in 40 or more years.
‘Awakenings’ Showed What Happened To Those Who Temporarily Recovered
Overshadowed as it was by the Spanish Flu and WWI, the sleeping sickness didn't really come to the public eye until the renowned British neurologist Oliver Sacks wrote his groundbreaking Awakenings, a book that movingly detailed his interactions with patients who suffered from the disease. Sacks pioneered the use of L-DOPA, the medication that awakened many of these patients. In the book, he describes the challenges of bringing people from the 1920s into the 1960s.
The book was a massive success, and is still considered one of the best pieces of medical nonfiction ever written. It should come as little surprise, then, that it was adapted into a major Hollywood motion picture, starring Robin Williams as a fictionalized version of Sacks, and Robert De Niro as the man who wakes up in a new decade. After it was released, Roger Ebert wrote:
What both the movie and the book convey is the immense courage of the patients and the profound experience of their doctors, as in a small way they reexperienced what it means to be born, to open your eyes and discover to your astonishment that “you” are alive.
Nobody Knows What Caused It
Despite Oliver Sacks’s groundbreaking work with the treatment L-DOPA, there remains no complete cure for Encephalitis lethargica, as the medication often only provides temporary relief. It is perhaps unsurprising that there is no cure, because scientists don’t really understand what causes the sleeping sickness in the first place.
During the outbreak, there were many different theories as to the cause. When the disease first appeared in England, doctors believed that it was a form of botulism. However, botulism is the result of detectable bacteria, and the bacteria simply was not present. Many other theories were proposed regarding the mystifying disease.
At the end of the day, the truth is simply unknown. Papers are still written advancing various theories, but none have been proven or widely accepted.
Cases Continue To Pop Up Today
Because of the mysteries surrounding the illness, it is difficult to truly define when it began and ended. It is also difficult to identify whether there have been any new cases, although conventional wisdom says that at least a few more cases have popped up.
In 2015, a paper was published about a young boy who had contracted HIV. When he came to the hospital, he quickly developed symptoms similar to Encephalitis lethargica: lethargy, mutism, and weakness of the eye muscles. There are numerous examples of similar cases, but cautious doctors are reluctant to label them as sleeping sickness.
Some Believe Adolf Hitler’s Parkinson’s Disease Was An Aftereffect Of The Sleeping Sickness
Adolf Hitler may be one of the most over-diagnosed individuals in human history. Due to his morphine addiction, his illness-plagued youth, and his high profile, modern doctors love to retroactively diagnose him with everything from borderline personality disorder to irritable bowel syndrome. While many of these diagnoses are sensationalist and based on wild speculation, something of a consensus has emerged that the German dictator could well have suffered from Parkinson's disease.
If that’s true, it is likely that he also suffered from the sleeping sickness in his youth. Due to the relatively late onset of his Parkinson’s disease and his age at the time of the epidemic, the sleeping sickness makes a great deal of sense as the source of his Parkinson’s. A medical paper also makes the case based on his symptoms:
Hitler had oculogyric crises [deviation of the eyes], phenomena only associated with post-encephalitic parkinsonism. In addition, he had dystonic facial spasms, palilalia and a sleep disorder, phenomena more likely to be associated with post-encephalitic than idiopathic parkinsonism.
President Wilson May Have Contracted It In 1919
It is always dicey to retroactively diagnose historical figures, due to both the lack of contemporaneous material and our own modern biases. This is made doubly difficult in the case of someone like Woodrow Wilson, who had a battalion of health issues independent of whatever we may diagnose him with today.
However, that hasn’t stopped a number of modern doctors and pathologists from using historical sources and medical records to diagnose Wilson as a sufferer of Encephalitis lethargica. Edwin Weinstein, one of Wilson’s most prominent biographers, believes that a complicated series of medical disasters accounted for Wilson’s famously altered behavior during the peace negotiations at the end of WWI.
Weinstein believes that after a number of strokes throughout his life, Wilson contracted the flu, which left him vulnerable to the sleeping sickness. Regardless, Wilson didn’t have long to live after the 1919 Peace Conference; he passed five years later in 1924.
Corsets became popular in the 16th century, allegedly because Catherine de' Medici, wife of French King Henry II, banned women with thick waists from attending court. Whatever the truth of that story, the Italian-born queen helped set beauty standards that held up exceptionally well: corsets remained incredibly common from the Renaissance until the early 20th century. Some historians, however, have suggested the restrictive undergarments contributed to the patriarchal system of female oppression.
The inherent tightness of the shapewear appealed to the male-dictated understanding of femininity and attractiveness, so, unsurprisingly, women wore corsets to their own detriment, suffering to achieve a smaller, more socially acceptable waist. Many 21st-century beauty trends are just as focused on the hourglass shape, but there are still many things people don't know about corsets.
Napoleon Claimed Corsets Contributed To The Decline Of Humanity
Napoleon Bonaparte not only quested to rule Europe; he also campaigned to do away with corsets. He called the shaping undergarment "the implement of detestable coquetry which not only betrays a frivolous bent but forecasts the decline of humanity." And while Bonaparte's own lovers still wore corsets, medical professionals shared his suspicions, believing the clothing could cause infertility. In fact, C.J. Dickinson, professor emeritus at the Wolfson Institute of Preventive Medicine, has argued that extremely tight clothing may contribute to endometriosis, in which lesions of uterine-lining tissue form where they shouldn't; when constricted, these lesions can't shed during menstruation, so internal bleeding may occur and scar tissue may form.
Some Men Wore Corsets
Women were not the only aristocrats wearing corsets to create more socially acceptable figures. Specifically, in the latter part of the 18th century men wore form-fitting trousers and jackets. Corsets helped gentlemen achieve a smoother silhouette. However, French and Englishmen grew tired of the trend by the middle of the 19th century. Those who continued to wear the undergarments were teased.
Austrian men continued to wear corsets despite changing fashion standards in the rest of Europe, though. One English gentleman who attended an elite Austrian boarding school noted in the 1867 issue of The Englishwoman’s Domestic Magazine:
From personal experience, I beg to express a decided and unqualified approval of corsets. I was early sent to school in Austria where lacing was not considered ridiculous in a gentleman as in England, and I objected in the thoroughly English way. A sturdy [school attendant] was deaf to my remonstrance, and speedily laced me up tightly in a fashionable Viennese corset… It is from no feeling of vanity that I have ever since continued to wear them, for, not caring to incur ridicule, I take good care that my dress shall not betray me…
Some Corsets Contained Pieces Of Metal Or Bone
Corsets date back at least as early as the 16th century, when aristocratic women started to wear bodices reinforced with whale bone and ivory instead of the original ones made from cloth and silk. Eventually, pieces of wood and metal were added to the front of most corsets to create even more structure. Anne Marie Louise d'Orléans, Duchess of Montpensier, actually had a predominantly metal corset decorated with a crown and fleur-de-lis.
Some 19th-Century Medical Professionals Discouraged Tightly Laced Corsets
Not everyone condoned corsets or the tight lacing that became popular once metal eyelets were added to the undergarments. In fact, The Lancet, one of the oldest medical journals, published several articles about the dangers of corsets in the 1880s and '90s. Additionally, The Sacred Heart Review noted in 1890:
[Tight lacing] cannot be but hurtful… the veriest novice in anatomy understands how by this process almost every important organ is subjected to cramping pressure, its functions interfered with, and its relations to other structures so altered as to render it, even if it were itself competent, a positive source of danger to them.
Moreover, surgeon William Henry Flower wrote in his 1881 book, Fashion in Deformity, that tightly laced corsets were just as harmful as skull-shaping and foot-binding.
They Caused Breathing Problems
Women wore corsets for centuries, but when metal eyelets were added to the undergarments in the 1820s and 1830s, tight-lacing became incredibly popular. This trend involved threading the corset's laces through the eyelets and pulling them considerably tighter, locking women into shapes far from their natural forms. This tight-lacing allegedly caused young women to faint because it constricted their breathing; when ladies passed out from lack of oxygen, acquaintances loosened their laces or stays so air could flow more freely into the lungs. Some analysts also believe excessively tight clothing results in heartburn, distension, and varicose veins because of restricted blood flow.
Not All Corsets Were Used To Make The Waist Appear Slimmer
European corsets at the beginning of the 16th century created an exceedingly recognizable form. An aristocratic woman's bosom was pushed upward when she wore the shaping garment, making her upper half appear fuller, while an unyielding material running down the front of the corset shaped the torso like a cylinder. The shaping devices looked more like cones in the 17th century, though: two pieces of fabric with thick boning were combined to make the waist seem even more narrow.
From about 1800 to 1830, corsets were more forgiving. Women’s stomachs were left unbridled. The undergarments were smaller and more like 21st century bras.
Corsets Changed Shape As Different Monarchs Took The Crown
The Victorian era began when Queen Victoria took the crown in 1837. Corsets during her reign, once again, restricted the belly. Hourglass figures were incredibly popular, so even longer restrictive undergarments were necessary, extending past the natural waist, with steel boning to help create the shape. To make ladies appear more shapely, Victorian fashion called for tops with large shoulders and hoop skirts covering multiple layers of crinoline. Clothing designers also began to mass-produce corsets during the Industrial Revolution, so people were able to access them more easily.
When Queen Victoria died in 1901, the style changed again. Women of King Edward's court wore corsets with an "S-bend." These undergarments typically forced women to tilt forward, pushing the bosom out in front and the hips back, giving their backs an unnatural dip.
Women Who Wore Them Were More Susceptible To Tuberculosis & Pneumonia
Corsets didn't only cause fainting spells. They also tightly restricted female wearers' lungs, which couldn't fully expand, making breathing painful and deep breathing almost impossible. Lung conditions like tuberculosis and pneumonia could be exacerbated by the corset, and women were especially susceptible to these illnesses before vaccines arrived in the 20th century because their lower lungs were almost constantly bound.
They Caused Back Problems & Women Increasingly Relied On Them
Women who wore corsets for an extended period of time often experienced painful back problems. The boning in the undergarments was mostly immobile; posture remained straight as long as the corsets were laced. This rigidity sometimes led to back and pectoral muscle atrophy, though. The tissue just wasted away. As a result, some corset wearers were forced to rely on their corsets to stay upright.
Women May Have Started Wearing Them As Early As 2000 BCE
In the late 19th century, British archaeologist Sir Arthur Evans discovered a Cretan figure dating back to about 2000 BCE. The sculpture depicted a topless woman with an extremely small waist that looked to be cinched by a belt. Ancient Greeks also wrote about women’s undergarments which made waists tiny and perhaps flattened the bosom.
Corsets Fell Out Of Fashion In The 1920s
Fashion trends change frequently. And by the 1920s, nobody wanted to wear corsets anymore. Flapper dresses came into style. The more forgiving clothing gave wearers a more androgynous look. An hourglass figure was no longer the epitome of feminine beauty. The women who wore these flowing garments were usually young and single. Many of them held jobs during the day and partied at night. In addition to removing their corsets, flappers also chopped off the long locks that were characteristic of Victorian women.
In 1999, both “Blue” magazine (the “Bucking the Condomocracy” article, also reprinted in “Out” magazine, July 1999, Vol. 7, No. 12) and “HQ” magazine (“They Shoot Barebackers Don’t They?”) published articles on barebacking, the one in “HQ” being a reprint of an article from “Poz” magazine. The latter caused a bit of a furore in both “The Sydney Star Observer” and the “Sydney Morning Herald”…probably understandably. Read in the context of HIV education and safe sex messages at that time, they read almost as a promotion of barebacking.
I was writing regularly for “Talkabout” magazine at the time, and was on the magazine’s working group. When I read both articles, I thought they demanded a response, and started to put an article about it together. However, several things were going on at “Talkabout” at that time. Most notable was a new editor, and I was unsure of how liberal she was going to allow the writing to be. The second was an article I had written about the “Options” Employment Agency, which was operating on Oxford St at the time, supposedly to assist people with HIV/AIDS to return to work after surviving AIDS, or to re-educate them. I had written an expose revealing that they did little to actually assist people, and used said clients to do unpaid “work experience” in their offices. The editor, in all fairness, had sent the article to them…and their response was to threaten to sue the organisation (PLWHA NSW), the magazine, and myself. It was “Bring it on!” from my perspective, but obviously from the organisation’s…and funding…perspective, it wasn’t something they wanted. As it turned out, my accusations were accurate (I had been quite outspoken about what was going on there for some time, had the written testimony of a number of guys who had personally encountered the rort, and had even had the office manager of Options…whose name escapes me now…invite me into his office and make veiled threats about what I was saying), and the agency had its funding stopped and closed down shortly after. The article was published, but was so heavily edited that it lost all its clout. I was very disappointed.
However, this made me a bit dubious about publishing another controversial article, and being unsure of the editor’s response to this piece, and with time passing, I never completed the article. I have been republishing most of my “Talkabout” articles on my blog over the last couple of years…some re-edited, some not…and came across the original draft of this article. I couldn’t actually remember the content of the magazine articles, so I did a bit of googling and, thanking the gods of cyberspace that nothing ever disappears completely in the ether, found both original articles. I will now include them in my article, to have a permanent record of them. They both make interesting reading.
About 18 months or so further down the line, and with a different editor, I wrote yet another controversial piece, on bug chasing…heavily researched, so unbiased…that was totally pulled from publication by the then “Talkabout” working group. It was with great trepidation that Glenn, the then editor, rang to tell me the decision. He knew how much work had gone into it, and I cannot recollect an article written by an HIV+ man ever being pulled from publication in “Talkabout” before. The reasoning: it was a great article, but because “Talkabout” was funded by NSW Community Health, there was a perception that said organisation might see it as a “promotion of the act of bug chasing” rather than an expose. I was furious. Bug chasing was being talked about within the HIV community, the whole sex-dating mentality of “breed me” was a reality…it was happening! To my thinking, it was as if they were burying their heads in the sand and pretending this just wasn’t happening! The mentality baffled me!
Below is my original article, with the magazine articles now included. At the end are the letters published in response to the “Bucking the Condomocracy” article, and a more recent article on the same subject. My bug chasing article can be found on this blog simply by searching for “barebacking”.
My, hasn’t the HIV community been blessed this month, with both a quarterly and a bi-monthly magazine taking up the HIV cause. I wish I could think that the sort of hype they give HIV/AIDS is harmless, but unfortunately, after reading through both articles – twice – just to make sure I hadn’t missed a subtle point, my conclusion is that it is not.
The article in HQ magazine (They Shoot Barebackers, Don’t They?), which has also received publicity via both the Sydney Star Observer and the Sydney Morning Herald, is a reprint of an article from the American POZ magazine of February 1, 1999. When my partner (also HIV+) and I read the article earlier this year, we were both quite horrified. It described in considerable detail the so-called phenomenon of ‘barebacking’, a current catch-cry for unsafe sex, especially between HIV positive and HIV negative men. It is supposedly practised by people who are ‘over’ practising safe sex and using condoms, and desire the thrill of ‘skin-to-skin’ sex. The article reports on private parties in the USA for people who wish to indulge in this type of sex, and consider the risks of catching HIV minimal compared to the joy of unprotected sex. Needless to say, the people who run the parties make sure everyone present signs a disclaimer. Wouldn’t want to get sued by people becoming infected, would we! The phenomenon has reached as far as the Internet, where HIV negative people place advertisements seeking HIV positive people to supposedly ‘father’ their own HIV infection. The mere implications of this sort of mentality would be enough to frighten anybody. There are also porn sites promoting galleries of photos of guys barebacking. Make it erotic, and you make it right, or so it would seem.
Of course, the obvious question to ask is: why is this happening? Have we stretched the limits of the practice and promotion of safe sex as far as they can go? Have people become so accepting of HIV that it is no longer considered a dangerous disease? Is the fact that we now have an arsenal of drugs to control HIV infection reducing people’s fear of infection? Do younger people consider the entire AIDS issue a ‘generational’ thing? Is it just a millennium trend? Considering the current arguments going on around compliance and drug holidays, I don’t think it is feasible to even consider that HIV is either ended or under control. Ask anyone infected and on drug regimes what they think of this! Ask them how much they enjoy taking handfuls of pills every day, and how much they enjoy the side effects of same. Ask them how secure and comfortable they feel in the knowledge of a possible ten to twenty years with such regimes, always hoping the next generation of drugs is going to be easier on us. A vaccine is still a long way off.
Likewise, I also loved the article in “Blue” (“Bucking the Condomocracy”), which hit you in the face with the fabulous attention-grabbing statement (in bold font) ‘POST-AIDS’. Now this article isn’t quite as bad as I originally thought. In the context in which it is written, it is in many respects correct. However, it does overlook a major point. If we are living with a ‘Post-AIDS’ mentality, then why are so many people in their mid twenties seroconverting? The article tends to cover the promise given by new treatments, but not the fact that playing down HIV is a dangerous road to take. It is full of trendy language, and as someone who has lived with HIV day in and day out for the last seventeen years, I haven’t heard of any of the expressions mooted by the author. Terms such as a ‘Protease Moment’, ‘vaccine optimism’ and ‘vaccine positive’ (in respect of forthcoming language in the vaccine age) are all nice terms, and factually the article is right – there is more emphasis being placed on a preventative vaccine than a therapeutic one, but that is possibly still a decade away. The article is, I grant you, full of positive images, which perhaps isn’t so bad in a world where doom and gloom are never far from the headlines. But it does seem to have made it look as though HIV is no longer happening. By being so nicey-nicey about HIV, I feel it tends to play down the actual dangers inherent in contracting it. Again, ask anybody HIV positive if they would change sero status if possible, and you would get an almost one hundred percent resounding yes!
I felt, when originally reading the barebacking article earlier this year, that it demanded a response, but being in an American magazine, and describing a phenomenon that I had not heard of occurring here (not, of course, taking into account the many unsafe sex stories one hears from the saunas and backrooms), I decided to let it lie. The fact that HQ magazine has done a sideline on the Australian reaction to barebacking does not change the fact that announcing the subject on the front cover is irresponsible journalism in the extreme. The editor can defend it however she likes, but she is not working in mainstream HIV/AIDS, and obviously knows very little about the subject, or the implications of the article. Trying to make barebacking a mainstream and fashionable pastime is not funny! An article published by Capital Q the same week as the SSO had its piece on HQ showed the possible incidence of contracting HIV through unsafe sex. Odds of 120 to 1 (for unsafe anal sex) may sound good to many people, but considering the sex life of your average horny gay male, they make infection from unsafe practices highly likely very early in life.
I grant that freedom of the press is a much-nurtured principle, but it can go too far. The press often plays a major role in influencing people towards a course of action they may not otherwise take, and is often paramount in establishing new trends (desirable and undesirable). Journalists must stop looking at just headline stories to sell magazines, and consider the implications of what they are publishing.
LETTERS PUBLISHED IN “OUT” MAGAZINE SEPTEMBER 1999, VOL 8, NO. 3 IN RESPONSE TO “BUCKING THE CONDOMOCRACY”.
Barebacking is Dead. Long Live Barebacking!
Leave it to science and rational thinking to ruin a popular sexual taboo.
The “bareback” label for sex without a condom has faded in the age of pre-exposure prophylaxis (PrEP) and U=U. People not living with HIV who are taking PrEP are protecting themselves from transmission, while people living with HIV who have an undetectable viral load are unable to transmit the virus to their sex partners at all. As the very definition of HIV risk is being rearranged, the problematic term “barebacking” is finally being relegated to the dust bins of history.
We all know the nature of taboo. The naughty, furtive longing for something forbidden. As the AIDS pandemic lurched from the murderous 80s into the 90s, sexual behavior among gay men pivoted, from horror at the very thought of sex without a condom to, well, something we just might like to do. Real bad. “Barebacking” instantly became part of the lexicon, spurred by maverick porn producers who capitalized on our carnal desire to have sex without a barrier.
Sex without a barrier. Unprotected sex. Barebacking. Also known as having sex. Ask a straight person.
Gay men have always barebacked, of course (along with every other human being and their parents), certainly before HIV ever showed up and yes, even immediately after. If we all had stopped fucking without barriers we would have halted the HIV epidemic in its tracks. Instead, we kept behaving like human beings, making mistakes or getting horny or saying yes when we should have said no or getting drunk or falling in love or being young and stupid.
And sometimes, even in the darkest and deadliest years of the epidemic, to unload inside our partner was an enormous “fuck you” to AIDS. You might not understand the humanity of that choice, the triumph of it, or the search it represented for some kind of spiritual and physical release in the midst of relentless mortality. I guess you had to be there.
Not long after we emerged from the 1990s, shell shocked but ready to rumble openly again now that we were armed with effective medications, a renegade porn star bottom named Dawson collected orgasms in the double digits on video and his flick was so polarizing that it was banned in gay video stores. Today, his exploits seem positively quaint, and those same video stores and the countless internet sites that followed transformed themselves from featuring a barebacking category to dropping the category and lumping everything together. Sex without condoms in porn is now customary. Condoms are the outlier.
The actual term has lost its wicked luster. These days, you rarely hear your sex partner say, “oh yeah, fuck me bareback, man.” I mean, sure I will, dude. Yawn.
And gone, too, hopefully, is the judgment of those who labeled barebacking a deviant, destructive pathology. This may be the most painful aspect of our prevention legacy; the rush to demonize those who admitted to having sex without condoms before it became agreeable again, not to mention the furor over those of us who have spoken empathetically about sex without a barrier.
Activist and writer Tony Valenzuela became a community pariah when he wrote a piece in 1995 about being a young man living with HIV who had condomless sex with his boyfriend. He thumbed his nose at his detractors when he appeared naked on a horse for an infamous 1999 POZ Magazine cover (“They Shoot Barebackers, Don’t They?”) in which he discussed how the controversy angered and confused him. Valenzuela’s personal character was questioned and his professional life was derailed for years.
The late social anthropologist and author Eric Rofes (Reviving the Tribe) nearly caused a riot at a 1996 Atlanta town hall event for gay men when he discussed the spiritual and emotional value of sharing semen with a partner. And even as recently as 2013, my essay, “Your Mother Liked It Bareback,” produced one apoplectic comment, among many others, that remains the pinnacle of my blog infamy. “You,” it said, “are a vile merchant of death.”
Maybe, with our new biomedical tools of HIV prevention, those same people who once blindly damned sexual behaviors they didn’t understand — whether out of puritanical beliefs or their fear of their own desires – have reconciled their fantasies and their HIV risk. I hope they’re enjoying totally hot sex and the fluids are flying.
It is difficult to ignore the appalling homophobia, internalized and otherwise, that runs through this aspect of HIV prevention history. We held ourselves as gay men to a more grueling standard than the countless non-queers who get an STI (several of them life-threatening) or an unplanned pregnancy every year.
I have no illusions. Sexually transmitted infections continue, even if the very thought of gonorrhea just makes me feel nostalgic. The PrEP train hasn’t reached everyone who might benefit from it and there is misinformation about its efficacy and side effects. Meanwhile, nearly half of those living with HIV in the United States have not reached viral suppression. There is still reason to be cautious about the who and the when and the how of sex. Now, as ever, we are responsible for our own bodies and the risks we take.
Frankly, behavioral change has not served us well in the grand scheme of HIV prevention. There has always been some debate, tension even, between those who believed the answer to HIV infections is behavior modification, and those who welcome the advent of biomedical interventions such as PrEP and “treatment as prevention” (TasP) that don’t rely upon sexual behavioral choices to work.
Throughout the decades, we have all witnessed the dominant, primal pull that sexual desire has exhibited over caution, so I know which prevention strategy my money is on. But hey, to each his own strategy. For that matter, condoms are a golden oldie and a perfectly legitimate choice. You do you.
What has changed are the conversations and information gathering that happen between partners. PrEP, medications, who is undetectable or not, what sexual positioning in what combination will occur, all of these exist in a more informed landscape, at least among gay men in this country.
Barebacking, as an urban phrase and a taboo, is dead. Thank god and good riddance to this divisive bit of sexual branding. Sex, meanwhile, motors happily onward, unbothered by the judgments of man.
Eben Byers was an amateur golfer, an alumnus of Yale, and a notorious ladies man, but he is most famous for literally rotting from the inside out after spending three years drinking radium-infused water.
When Byers fell and hurt his arm in 1927, he was prescribed Radithor, a radium-infused elixir sold by a quack doctor named William Bailey. Radithor was supposed to alleviate aches and pains, and even invigorate one sexually. Yet what happened to Byers fell far afield of those promised effects. After three years of incessant use, Byers began rotting from the inside: his teeth fell out, his jaw had to be removed, holes formed in his brain and skull, and he eventually perished in 1932 from radium poisoning. Like the ill-fated Radium Girls before him, Byers provided clear and unequivocal bodily evidence that exposure to radium was lethal.
Byers’s tragic death is a story of medical deception and overdose, and it serves as a cautionary tale that there is, in fact, too much of a good thing – especially if that good thing is actually completely lethal.
“The Radium Water Worked Fine Until His Jaw Came Off”
This was the title of a Wall Street Journal article that came out some time after Byers’s passing, succinctly summing up what happened to him. In 1927, Byers was on a train returning from a Harvard-Yale football game when he fell from his bunk and hurt his arm. The pain didn’t go away, so Byers’s doctor prescribed him Radithor.
Radithor was simply radium dissolved in water, marketed as a healing tonic. At a time when radium-infused products were very popular, it was unsurprising that Byers was more than happy to take Radithor. In fact, Byers was so keen on the product and its supposed benefits that he ended up drinking three bottles every day for two years, until the poison caught up with him and began dissolving him from the inside out.
William Bailey, The Man Who Prescribed Byers Radithor, Was A Known Fraud
William J.A. Bailey wasn’t a doctor, even though he claimed to be. He was a Harvard dropout who got rich quick after developing Radithor, a toxic solution of radium dissolved in water. He was a fraud who was repeatedly in trouble with the law and profited off numerous short-lived business start-ups.
The FDA shut down Bailey’s business, but Bailey had already done his damage. The number of people who perished from Radithor is unknown, but he sold approximately 400,000 bottles of the tonic – 1,400 of which Byers himself purchased.
Byers Probably Took Radithor To Help His Performance In The Bedroom
The quick story is that Byers fell on a train, hurt his arm, took Radithor, and thought it made him better so he kept taking it. There is, though, perhaps another reason Byers was so enthusiastic about Radithor, to the point where he reportedly even gave cases of the stuff to his girlfriends and his race horses.
Byers had a reputation as a ladies’ man. At Yale, his nickname was “Foxy Grandpa.” His fall on the train reportedly injured not only his arm, but also his game. Byers complained of a sort of “run-down feeling” that affected his athletic and sexual performance. That’s when Byers discovered a product on the market that claimed to solve all of these issues. The sexually reparative nature of Radithor was only rumored, but it is unsurprising that a man entering his 50s with a reputation for being popular with women would seek out anything to help him maintain his “Foxy Grandpa” status.
Byers’s Horrific Death Ended The American Public’s Romance With Radium-Infused Products
The problem with touting radioactivity as curative was that it simply wasn’t true. Luckily, most of these quack elixirs were phony and contained no radium at all (of course, this was not the case with Radithor). Still, there were myriad products on the market touted as being extremely good for you – radium-infused beauty creams, toothpastes, soaps, bars of chocolate – you name it.
The American public had an obsession with radium in the 1920s and ’30s that only faded after Byers’s passing brought the real dangers of radium to light.
Byers’s Story Probably Got So Much Attention Because He Was A Handsome, Upper-Class Man
Eben Byers was the son of a well-known entrepreneur, and he was the chairman of his father’s steel company. He attended Yale, golfed, raced horses, and was popular with women. He was the perfect candidate for a tragic, newsworthy story – made even more fascinating and terrifying because he perished after drinking what was touted as a health tonic, completely available to the public. Everything about Byers’s story differs from the devastating story of the Radium Girls.
The tragedy of the Radium Girls – female factory employees who became painfully sick and perished of radium poisoning – was well covered by the media, but was less compelling to the government than the story of Byers, a socialite in the public eye. It wasn’t until Byers told the Federal Trade Commission about Radithor, while on his deathbed, that radium was removed from the federally approved list of medicines.
The Idea To Drink Radioactive Water As An Elixir Came From The Restorative Powers Of Hot Springs
In the 1920s, people knew about – and believed in – the healing powers of hot springs. When it was discovered that the water in hot springs was mildly radioactive, due to the radon gas dissolved in the water, it was concluded that it was the radioactivity that was so curative. In The American Journal of Clinical Medicine, Dr. C.G. Davis claimed, “Radioactivity prevents insanity, rouses noble emotions, retards old age, and creates a splendid youthful joyous life.” It was no wonder products infused with radium, such as candy, hair tonics, and even blankets, were so popular.
However, radon gas is entirely different from radium, the element found in Radithor. Radon gas has a half-life of about three days – radium has one of 1,600 years. Seeing as Byers took three times the already toxic dose of Radithor, he was irrevocably doomed.
Byers Deteriorated Rapidly And Painfully, But He Kept Drinking Radithor
For the first two years Byers took Radithor, he was so pleased with the supposed results that he took three times the suggested daily dose. But after a while, he began feeling sick. He lost weight, had headaches, and had a blinding pain in his jaw. He had been diagnosed with inflamed sinuses, but once his teeth began to fall out and his jaw began to crumble, Byers knew something was terribly wrong. Byers’s X-ray was sent to a radiologist, who confirmed that his fate was inevitable – he had the same lesions on his jaw as the Radium Girls. Sadly, Byers had come to rely so heavily on Radithor that even as he sickened, he kept drinking it, hoping it would make him feel better.
An attorney dispatched to Byers’s house shortly before his passing remembered the state Byers was in due to his radiation poisoning:
We went to Southampton where Byers had a magnificent home. There we discovered him in a condition which beggars description. Young in years and mentally alert, he could hardly speak. His head was swathed in bandages. He had undergone two successive jaw operations and his whole upper jaw, excepting two front teeth, and most of his lower jaw had been removed. All the remaining bone tissue of his body was slowly disintegrating, and holes were actually forming in his skull.
Byers Had Enough Radium In His Body To “Kill Three Men”
After Byers’s death, Popular Science Monthly wrote that Byers had the “largest amount of radium ever found in a human being – more than thirty micrograms, enough to kill three men.”
With symptoms such as blinding headaches, breaking bones, and a disintegrating jaw, Byers must have suffered immensely before he succumbed to radiation poisoning in 1932, five years after his first dose of Radithor.
The Federal Trade Commission Accidentally Contributed To The Rise Of Radioactive Products
Back when radium was immensely popular in consumer products, the FDA had very little power to regulate it. Because radium counted as neither a food nor a drug, it was out of their jurisdiction.
There was one department, however, that had control over radium: the Federal Trade Commission. Their job was to stand against false advertising claims; this meant that the FTC worked very hard to make sure that all the products on the market actually contained radium. Their strict regulation ensured that all the products people were buying were genuinely radioactive.
Byers’s Demise Led To Stricter FDA Control
As Byers fell ill, and it became clear Radithor was the culprit, the FTC opened an investigation. They sought to challenge Bailey’s claim that Radithor and other products like it were “harmless.” They wanted Byers to testify, but he was too sick. They dispatched an attorney to his home to take a statement, which is when the attorney found him literally rotting from the inside out. It didn’t take long after that for the FTC to shut down Bailey’s business.
The results of the FTC’s investigation led to the FDA getting more power over investigating suspicious health claims. Eventually, the FDA gained control over the entire pharmaceutical industry.
Bailey Succumbed To His Own Lies, But It Was Only Discovered After His Death
Until the end, Bailey denied Radithor had anything to do with Byers’s demise. He claimed he had drunk more Radithor than Byers himself, and he was living proof that his “healing tonic” was perfectly safe.
Yet when his body was exhumed 20 years after his death from bladder cancer, medical researchers discovered his remains were still intensely radioactive. His corpse was described as “still hot” after being unearthed.
Marie Curie and her husband, Pierre, discovered radium, a radioactive element, back in 1898. However, people didn’t realize how dangerous the element was, and began to use radium in household items. It appeared in makeup, as well as in medical devices that claimed to cure everything from impotence to arthritis. What these quack products actually delivered was a plethora of surprisingly poisonous everyday things, such as toothpaste, hair tonic, and suppositories.
When people began dying of mysterious diseases, such as the ones suffered by the Radium Girls, who painted luminous watch dials with Undark, a radium-based paint that they wound up ingesting via their paintbrushes, doctors finally realized that radium was dangerous.
The history of radium poisoning is full of odd devices designed to improve one’s health and outer appearance. These everyday poisons were sold through magazine and newspaper ads – and in regular pharmacies. Thankfully, by the beginning of World War II, they had been phased out and are now an odd anecdote from American history.
Radium-Lined Cups Were Used To Make Radioactive Beverages
These days, people drink bottled or filtered water. Back in the early 20th century, those who could afford it drank radioactive water. One popular way of making this water, which supposedly could cure many different ailments, involved the use of a metal cup or container that was lined with radium. Any water poured into the vessel was exposed to the radioactive material and picked up its properties. The Revigator was one such device; its makers claimed that it contained radon. Of course, this only “worked” if the device actually contained radium – many of the “radioactive” medical marvels on the market were scams.
People Submerged Themselves In Radium-Laced Water At Spas
Going to spas and spending some time submerged in radioactive water was supposed to be an invigorating experience. In actuality, the natural radiation in these mineral hot springs might have made the spa goers feel relaxed – that is, until a few decades later when they realized that the “hot” water did more harm than good. During the time period, however, even reputable medical journals touted the healing abilities of radium and similar materials, and some claimed radium hot springs were a literal fountain of youth that could help slow the aging process. Some radium-filled hot springs are still in business today, but they limit people’s exposure to any radioactive elements in the water.
Lying In Radioactive Sand Was A Treatment For Arthritis
One of the main byproducts of radium manufacturing is a fine-grained sand that is, of course, highly radioactive. Back in the early 1900s, before people realized how harmful exposure to it was, they claimed that exposure to the sand could successfully treat arthritis pain. Many spas opened up rooms where people could sit and rest their feet on the sand in the hopes of being cured. The ironic thing is that, even though people knew of the dangers that radioactivity could pose, these “Uranium Sitting Houses” were in business up through the 1950s.
Men Placed Wax-Coated Radium Rods In Their Urethras As A Cure For Impotence
Men have always struggled with impotence. Now, there are medications like Viagra; back in the early 1900s, there were “bougies.” These were radium-laced wax rods that men inserted directly into their urethras to treat impotence. This treatment is now cringe inducing not only because of the way it took place, but also because placing radioactive material close to reproductive organs is a very bad idea.
Radium Toothpaste Claimed To Make Teeth White And Shiny
Radium wasn’t just used in medical devices – it made its way into everyday beauty and household products as well. One of these hygienic products was toothpaste. According to ads, a small amount of radium in the toothpaste promised to make users’ teeth very white and super shiny. Whether or not it worked is up for debate, but what is known is that radioactive exposure can actually make one’s teeth fall out and result in a jaw rotting from the inside out.
Radithor Supposedly Cured Impotence And Other Health-Related Woes
Radithor was a radium- and thorium-laced water that was sold in small vials. A few drops of it a day could cure impotence and “restore vigor” – or so its maker claimed. The product was made by Bailey Radium Laboratories of East Orange, New Jersey, which actually challenged users to disprove its claims of containing the radioactive substances. The product was removed from the market after one heavy user – playboy Eben Byers, who reportedly went through around three vials a day of the stuff – died a horrific death when his jaw disintegrated.
Radium Suppositories Restored People’s “Vigor”
Speaking of restoring “vigor,” how about a radium suppository? These small, radioactive pellets were sold in boxes and claimed to help men with their impotence issues. Made by several different companies, including the Vital-O-Gland Company and the General Remedies Company, there is no proof that the suppositories actually contained any radioactive material, or that they worked as they were supposed to. Thank goodness.
Glasses With Radioactive Lenses Corrected Vision Problems
Before there was laser eye surgery, there were Dengen’s Radio-Active Eye Applicators. This device looked like a pair of simple spectacles, only instead of lenses, it had opaque pods that contained radium and other radioactive materials. The applicators claimed not only to cure eye ailments like nearsightedness and farsightedness, but also to take care of headaches and eye strain. What’s even more disturbing is the fact that the eye applicators came in three different strengths.
Tho-Radia Cosmetics Claimed To Brighten Skin
Tho-Radia was a line of makeup and skin creams that contained radium. It was heavily marketed to women in the United States and France, who purchased the items in the hopes that the product’s claims – to rejuvenate and brighten skin – were true. To add a little extra cachet to the brand, its creator, Dr. Alfred Curie (no relation to Marie and Pierre Curie), put his name on the ads.
Radium Emanation Bath Salts Cured Insomnia
Radium bath salts, which worked like modern-day bath salts – you dissolve them in your bath water before soaking – were sold as a cure for insomnia, various nervous disorders, and even rheumatism. What made them even worse (from a modern perspective, of course) was the fact that dissolving the radioactive bath salts would send small particles into the air, where they could also be breathed into the lungs. These products were made by several different manufacturers, including the Denver Radium Service, which operated on what is now a Superfund site.
Endocrine Glands Were Regulated With The Radiendocrinator
The endocrine system regulates the body’s hormone production. The glands in the endocrine system include those in the neck – the thyroid – as well as the pituitary gland in the brain. However, the horrifying detail here involves the glands that men would treat with the Radiendocrinator – their testes. Treatment via the Radiendocrinator involved holding the device in place sometimes for hours at a time, with the handy (and included) strap that resembled an athletic supporter. Ironically, the creator of the device, William J.A. Bailey, died of radiation-induced bladder cancer in 1949.
Gout And Neuralgia Were Taken Care Of With Radium Tablets
Radium tablets are still a legitimate medical treatment for people suffering from various types of cancer. However, back in the early 20th century, these tablets were sold on pharmacy shelves and supposedly cured gout, neuralgia (stabbing nerve pain), and numerous other ailments. These radioactive tablets, sold under brand names like Arium and Radione, were taken daily by people who simply wanted to feel better or have “the strength of iron.”
Radioactive Heating Pads Cured A Number Of Ailments
A radioactive heating pad that was lined with radium claimed to cure everything from rheumatism to standard aches and pains. The instructions for this particular device include warming it up, keeping it dry, and then applying it to the area of the body that hurts. Users could supposedly leave it on for up to 12 hours a day, and they were even encouraged to roll it up around a painful body part, such as an ankle, and tie it into place.
Uranium Blankets Helped With Arthritis Pain
These days, uranium blankets are a part of nuclear reactors, and they aren’t even a little bit related to the therapeutic ones touted as cures for arthritis pain in the early 20th century. Those particular blankets looked like standard quilted ones, except that bits of uranium were tucked within the fabric squares. These blankets were sold as cures up through the 1950s, even after the dangers of uranium exposure were well known.
Radium Tonic Prevented Gray Hairs
A product called Caradium was created in the early 1900s. It was a tonic that was applied to hair to prevent gray hairs from growing, thanks to the power of its active ingredient – radium. It also promised to make any current gray hairs revert back to their old color. Caradium was the invention of Frederick Godfrey, a man whose credentials included “hair specialist.”
Vintage Shoe-Fitting X-Ray Machines Will Zap Your Feet
How do you tell if a shoe is a good fit? Take a short walk? Squeeze the front-end with your fingers to make sure there is space for your toes? What about a dangerous, 20-second blast of unshielded x-rays? If you were buying shoes in the 1930s, 1940s and 1950s, it’s likely that you regularly inserted a tootsie into one of these death-rays.
The wooden cabinets, possibly first built by a Clarence Karrer in Milwaukee in 1924, had the x-ray source in the base, and it would fire upwards through your foot and shoe. Due to a lack of any kind of shielding, it wouldn’t stop there: the radiation would shoot right up into your baby-maker, clearly a perilous occurrence.
The machine, called a “Shoe-Fitting Fluoroscope,” put out 50 kV from its x-ray tube, which – according to Wikipedia’s figures for today’s machines – isn’t too bad:
In medical radiography voltage from 20 kV in mammography up to 150 kV for chest radiography are used for diagnostic. Energy can go up to 250 kV for radiotherapy applications.
The problem was repeat exposure. While it was recommended that children not be subjected to more than 12 doses a year, there was no such luck for shoe-store employees. According to the article “Shoe-fitting with x-ray” by H. Bavley in National Safety News 62 (1950), store clerks would put their hands into the beam to squeeze shoes during fitting. Worse still was the fate of a poor shoe model, “who received such a serious radiation burn that her leg had to be amputated.”
Thank God there’s nothing this dangerous around today. Like, you know, full-body backscatter x-ray machines in airports.
Get a reality check on some of the most bizarre rumours about how HIV is transmitted.
There are only a few ways you can get HIV but, at Avert, it seems that we’ve heard it all when it comes to the many myths and misconceptions about HIV.
A lot of these stories circulating on the HIV rumour mill are old, outdated and, more importantly, misinformed. In fact, many of these myths keep reinforcing HIV-related stigma and have had a long-lasting, damaging impact on many people’s perceptions of how the virus is spread.
Here we debunk some common urban legends to give you the truth about HIV transmission…
Myth 1: Girl goes to cinema and comes out with HIV
Rumour: During the 1990s, a common myth suggested that discarded needles left by strangers anywhere from gas pump handles to inside your cinema chair were infecting unassuming people with HIV. One such story involved a girl getting an unexpected needle stick injury while reaching down beneath her cinema seat to pick up some popcorn.
Reality: Although HIV transmission is a risk between people who share needles for drug use, there has actually never been a recorded case of HIV transmission from a discarded needle. However, if you are concerned that you have received a needle stick injury, you should seek medical advice to get checked for hepatitis C and B instead.
Myth 2: There’s something wrong with this banana…
Rumour: Pictures of red-pigmented fruit (such as bananas or oranges) still circulate the web even today. They are usually accompanied by warnings not to eat them because they have supposedly been injected with HIV. Similar food-related HIV transmission rumours include tainted ketchup, pizza with toppings of bodily fluids, and pineapple vendors accused of deliberately selling contaminated fruit.
Reality: You cannot get HIV from food of any kind, including fruit. Even if HIV-contaminated blood did get onto the food you’re eating, the virus doesn’t live long enough outside of a human body to be transmittable.
Myth 3: I got a pedicure and HIV from some fish in a shopping centre
Rumour: Getting pedicures from Garra rufa fish – which nibble off dry skin – was once a popular beauty fad. However, many salons offering this service closed as a result of news outlets spreading rumours that the fish in these tanks were transmitting blood-borne viruses such as HIV and hepatitis C between customers.
Reality:HIV stands for Human Immunodeficiency Virus which means transmission of HIV only happens between humans – you can’t get HIV from animals, insects or fish. There are no cases of HIV infection due to the use of fish baths, or as a result of any other water-borne route including the use of swimming pools or spas.
Myth 4: The fizzy drink HIV hoax
Rumour: ‘For the next few weeks do not drink any products from Pepsi, as a worker from the company has added his blood contaminated with HIV (AIDS)…’
This SMS message, which was falsely linked to the United Kingdom’s Metropolitan Police service in 2017, suggested that a line worker at Pepsi was secretly contaminating cans of fizzy drink with the virus.
Reality: This message has been circulating the web in different formats since 2004 and is incredibly damaging. Even if blood were found within the drink cans, HIV can’t live outside of the body long enough to be transmittable.
Myth 5: Teen diagnosed with HIV after getting a hair weave at salon
Rumour: In 2015, a rumour in the US reported that a girl in Georgia had contracted HIV at a hair salon because the needles used to fix the girl’s weave to her scalp were dirty. The girl was supposedly diagnosed a week after her makeover, despite never having had sex or used intravenous drugs.
Reality: This story was later revealed to be a work of fiction by its author, but it is worth noting that transmission of HIV from needle stick injuries, even in medical settings, is extremely rare. The claim that someone can be diagnosed with HIV a week after exposure is also incorrect, as it can take from two weeks to three months for an infection to be detected by modern HIV tests.
Today, if someone is diagnosed with HIV, he or she can choose among 41 drugs that can treat the disease. And there’s a good chance that with the right combination, given at the right time, the drugs can keep HIV levels so low that the person never gets sick.
That wasn’t always the case. It took seven years after HIV was first discovered before the first drug to fight it was approved by the U.S. Food and Drug Administration (FDA). In those first anxious years of the epidemic, millions were infected. Only a few thousand had died at that point, but public health officials were racing to keep that death rate from spiking — the inevitable result if people who tested positive weren’t treated with something.
As it turned out, their first weapon against HIV wasn’t a new compound scientists had to develop from scratch — it was one that was already on the shelf, albeit abandoned. AZT, or azidothymidine, was originally developed in the 1960s by a U.S. researcher as a way to thwart cancer; the compound was supposed to insert itself into the DNA of a cancer cell and mess with its ability to replicate and produce more tumor cells. But it didn’t work when it was tested in mice and was put aside.
Two decades later, after AIDS emerged as a new infectious disease, the pharmaceutical company Burroughs Wellcome, already known for its antiviral drugs, began a massive test of potential anti-HIV agents, hoping to find anything that might work against this new viral foe. Among the candidates was something called Compound S, a re-made version of the original AZT. When it was thrown into a dish with animal cells infected with HIV, it seemed to block the virus’ activity.
The company sent samples to the FDA and the National Cancer Institute, where Dr. Samuel Broder, who headed the institute, realized the significance of the discovery. But simply having a compound that could work against HIV wasn’t enough. In order to make it available to the estimated millions who were infected, researchers had to be sure that it was safe and that it would indeed stop HIV in some way, even if it didn’t cure people of their infection. At the time, such tests, overseen by the FDA, took eight to 10 years.
Patients couldn’t wait that long. Under enormous public pressure, the FDA’s review of AZT was fast tracked — some say at the expense of patients.
Scientists quickly injected AZT into patients. The first goal was to see whether it was safe — and, though it did cause side effects (including severe intestinal problems, damage to the immune system, nausea, vomiting and headaches) it was deemed relatively safe. But they also had to test the compound’s effectiveness. In order to do so, a controversial trial was launched with nearly 300 people who had been diagnosed with AIDS. The plan was to randomly assign the participants to take capsules of the agent or a sugar pill for six months. Neither the doctor nor the patient would know whether they were on the drug or not.
After 16 weeks, Burroughs Wellcome announced that it was stopping the trial because there was strong evidence that the compound appeared to be working. Even in that short period, one group had only one death while the other had 19. The company reasoned that it wouldn’t be ethical to continue the trial and deprive one group of a potentially life-saving treatment.
Those results — and AZT — were heralded as a “breakthrough” and “the light at the end of the tunnel” by the company, and pushed the FDA to approve the first AIDS medication on March 19, 1987, in a record 20 months.
But the study remains controversial. Reports surfaced soon after that the results may have been skewed since doctors weren’t provided with a standard way of treating the other problems associated with AIDS — pneumonia, diarrhea and other symptoms — which makes determining whether the AZT alone was responsible for the dramatic results nearly impossible. For example, some patients received blood transfusions to help their immune systems; introducing new, healthy blood and immune cells could have helped these patients battle the virus better. There were also stories of patients from the 12 centers where the study was conducted pooling their pills, to better the chances that they would get at least some of the drug rather than just placebos.
And there were still plenty of questions left unanswered about the drug when it was approved. How long did the apparent benefits last? Could people who weren’t sick yet still benefit? Did they benefit more than those further along in their disease?
Such uncertainty would not be acceptable with a traditional approval, but the urgent need to have something in hand to fight the growing epidemic forced FDA’s hand. The people in the trial were already pressuring the company and the FDA to simply release the drug — if there were something that worked against HIV, they said, then it was not ethical to withhold it.
The drug’s approval remains controversial to this day, but in a world where treatment options are so far advanced it can be hard to imagine the sense of urgency and the social pressure permeating the medical community at the time. AIDS was an impending wave that was about to crash on the shores of an unsuspecting — and woefully unprepared — populace. Having at least one drug that worked, in however limited a way, was seen as progress.
But even after AZT’s approval, activists and public health officials raised concerns about the price of the drug. At about $8,000 a year (more than $17,000 in today’s dollars) — it was prohibitive to many uninsured patients and AIDS advocates accused Burroughs Wellcome of exploiting an already vulnerable patient population.
In the years since, it’s become clear that no single drug is the answer to fighting HIV. People taking AZT soon began showing rising virus levels — but the virus was no longer the same, having mutated to resist the drug. More drugs were needed, and AIDS advocates criticized the FDA for not moving quickly enough to approve additional medications. And side effects including heart problems, weight issues and more reminded people that anything designed to battle a virus like HIV was toxic.
Today, there are several classes of HIV drugs, each designed to block the virus at specific points in its life cycle. Used in combination, they have the best chance of keeping HIV at bay, lowering the virus’s ability to reproduce and infect, and ultimately, to cause death. These so-called antiretroviral drugs have made it possible for people diagnosed with HIV to live long and relatively healthy lives, as long as they continue to take the medications.
And for most of these people, their therapy often still includes AZT.
AIDS HOPES DASHED BY TERRIBLE TRUTH ON AZT
It was the drug that held out hope to people carrying the world’s most feared virus. It had the power to move share prices by millions. What it could not do was help people facing AIDS.
This weekend the truth about AZT is in the open: a comprehensive trial, so big it equals all the other research put together, shows that the drug which dominates AIDS treatment has no effect in delaying the onset of the disease. After all the promise and the profits, AZT has nothing to offer people with HIV.
The findings came in the final report on the Anglo-French Concorde trial, published yesterday in The Lancet. Some 1,749 patients with HIV, but who showed no symptoms, were given either the drug or a placebo. There was no statistical difference in the progress of the two groups: after three years 18% had AIDS or were dead.
The results leave a terrible void for the 12m people worldwide said to be infected with the virus, and crush any remaining hopes that AZT might delay the onset of symptoms. They also raise questions as to how those hopes were fuelled in the first place.
Doubts about AZT were first revealed by The Sunday Times five years ago. A painstaking investigation showed that AZT had been rushed to market on the back of a flawed study that was supposed to demonstrate its effectiveness.
The American Food and Drug Administration (FDA), responsible for protecting the public from risk, had been aware of flaws in the trial, but gave AZT approval. Documents obtained under the American Freedom of Information Act showed that records compiled during the trial had been altered, giving the drug a more favourable record; “multiple deviations” from the terms of the study had occurred; and FDA investigators had argued for data from one centre to be dropped entirely from the results. A senior FDA official believed AZT should not be granted a licence, but was overruled.
The doubts did nothing to inhibit Wellcome, AZT’s maker, from promoting its drug. Patients with HIV, but without AIDS symptoms, were the new target. They are worth more money because there are more of them and because they have longer to live.
To show the drug’s usefulness to this lucrative group, Wellcome trumpeted a big American trial called Protocol 019. The trial was halted in August 1989, after less than two years, on the grounds that it had already shown such benefit to HIV-positive people it would be unethical not to give the drug to all who wanted it.
Such “benefit” was judged only by time free from disease. A new analysis of the trial data, however, reaches a similar conclusion to Concorde: that AZT is essentially useless.
The original results were announced with a fanfare by the National Institute of Allergy and Infectious Diseases, which sponsored it with Wellcome’s support. In London, The Independent newspaper gave its front page to the findings, under the headline “AIDS drug offers lease of life”.
The very different picture painted by last month’s analysis, in the New England Journal of Medicine, comes after investigators paid more attention to the drug’s side-effects. These can include anaemia, liver damage, fatigue, nausea, headaches and sometimes a collapse in white blood cells, making patients more prone to disease.
The researchers looked at the average time patients experienced neither a progression of disease nor an adverse effect. Those treated with low doses of AZT were found to suffer a reduction in quality of life “due to severe side-effects of therapy” that approximately equalled any benefit from slowing down the disease; people on higher doses suffered even greater side-effects, outweighing the supposed benefit.
Dr Peter Duesberg, the American virus expert who has claimed for years that AZT is not a rational therapy, says it is clear that the original claims were completely ill-founded. “The opposite interpretations of the same data lead me to conclude that those responsible are not acting as scientists; they are acting as politicians.
“When the time is ripe to say that AZT is detrimental, that it actually hurts, the interpretation will change again.”
For patients with AIDS-related symptoms, AZT will continue to be prescribed: the consensus remains that it gives a temporary benefit.
For those without symptoms, hope centres on combinations of drugs, or on other approaches such as gene therapy. However, Professor Ian Weller, of the Middlesex hospital in London, who was the principal British investigator in the Concorde trial, is alarmed by the drive to give AIDS patients an AZT drug cocktail as if it were already an established therapy.
“There’s a suspicion of more toxicity if you combine it with other treatment, and we are a long way from showing an important clinical benefit, or that it is safer than AZT on its own,” he said. “There are physicians who are jumping the gun.”
As late as Thursday, Wellcome was insisting that AZT “remains the best weapon we have to slow the progress of the disease”. Dr Trevor Jones, its research director, said: “The question is where in the course of the disease you begin.”
AIDS and the AZT Scandal: SPIN’s 1989 Feature, ‘Sins of Omission’
The story of AZT, one of the most toxic, expensive, and controversial drugs in the history of medicine
At the end of 1989, two years after we had started the highly controversial AIDS column in SPIN, we published an article by Celia Farber called “Sins of Omission” about the truly bad and corrupt science surrounding the promotion of AZT as a treatment for the syndrome of diseases.
Celia was the editor and frequent writer of the column and unearthed hard evidence of the cold-bloodedness of the AIDS establishment pushing a drug that was worse than the disease, and killed faster than the natural progression of AIDS left untreated. AZT had been an abandoned cancer drug, discarded because of its fatal toxicity, resurrected in the cynical belief that AIDS patients were going to die anyway, so trying it out was sort of like playing with the house’s money. Because the drug didn’t require the usual massively expensive research and trial processes, having gone through them years earlier, it was insanely profitable for its maker, Burroughs Wellcome. It was a tragically perfect storm of windfall profits, something to pacify AIDS activists and the media, and a convenient boon to the patent holders for HIV testing.
Celia — who should get the Congressional Medal of Honor for her brave and relentless reporting, here and throughout the ten years we ran the column — exposed the worthlessness of the drug, the shady studies and deals to suppress the negative findings, and its awful and final consequences. This piece very literally changed the media’s view of AIDS and sharpened their discerning and skeptical eye. And soon after, AZT was once again shelved, hopefully this time forever.
Many times over the years since, people have come up to me and said that reading this article saved their lives, that they either stopped taking the drug and their health improved vastly, or they never took it because of what we reported. Nothing ever made me prouder.
— Bob Guccione Jr., founder of SPIN, October 3, 2015
[This story was originally published in the November 1989 issue of SPIN. In honor of SPIN’s 30th anniversary, we’ve republished this piece as part of our ongoing “30 Years, 30 Stories” series.]
On a cold January day in 1987, inside one of the brightly-lit meeting rooms of the monstrous FDA building, a panel of 11 top AIDS doctors pondered a very difficult decision. They had been asked by the FDA to consider giving lightning-quick approval to a highly toxic drug about which there was very little information. Clinically called Zidovudine, but nicknamed AZT after its components, the drug was said to have shown a dramatic effect on the survival of AIDS patients. The study that had brought the panel together had set the medical community abuzz. It was the first flicker of hope — people were dying much faster on the placebo than on the drug.
But there were tremendous concerns about the new drug. It had actually been developed a quarter of a century earlier as a cancer chemotherapy, but was shelved and forgotten because it was so toxic, very expensive to produce, and totally ineffective against cancer. Powerful, but unspecific, the drug was not selective in its cell destruction.
Drug companies around the world were sifting through hundreds of compounds in the race to find a cure, or at least a treatment, for AIDS. Burroughs Wellcome, a subsidiary of Wellcome, a British drug company, emerged as the winner. By chance, they sent the failed cancer drug, then known as Compound S, to the National Cancer Institute along with many others to see if it could slay the AIDS dragon, HIV. In the test tube at least, it did. At the meeting, there was a lot of uncertainty and discomfort with AZT. The doctors who had been consulted knew that the study was flawed and that the long-range effects were completely unknown. But the public was almost literally baying at the door. Understandably, there was immense pressure on the FDA to approve AZT, considering the climate of fear and anger all around.
Everybody was worried about this one. To approve it, said Ellen Cooper, an FDA director, would represent a “significant and potentially dangerous departure from our normal toxicology requirements.” Just before approving the drug, one doctor on the panel, Calvin Kunin, summed up their dilemma. “On the one hand,” he said, “to deny a drug which decreases mortality in a population such as this would be inappropriate. On the other hand, to use this drug widely, for areas where efficacy has not been demonstrated, with a potentially toxic agent, might be disastrous.”
“We do not know what will happen a year from now,” said panel chairman Dr. Itzhak Brook. “The data is just too premature, and the statistics are not really well done. The drug could actually be detrimental.” A little later, he said he was also “struck by the fact that AZT does not stop deaths. Even those who were switched to AZT still kept dying.”
“I agree with you,” answered another panel member, “there are so many unknowns. Once a drug is approved, there is no telling how it could be abused. There’s no going back.” Burroughs Wellcome reassured the panel that they would provide detailed two-year follow-up data, and that they would not let the drug get out of its intended parameters: as a stopgap measure for very sick patients.
Dr. Brook was not won over by the promise. “If we approve it today, there will not be much data. There will be a promise of data,” he predicted, “but then the production of data will be hampered.” Brook’s vote was the only one cast against approval.
“There was not enough data, not enough follow-up,” Brook recalls. “Many of the questions we asked the company were answered by, ‘We have not analyzed the data yet,’ or, ‘We do not know.’ I felt that there was some promising data, but was very worried about the price being paid for it. The side effects were so very severe. It was chemotherapy. Patients were going to need blood transfusions, that’s very serious.”
“The committee was tending to agree with me,” says Brook, “that we should wait a little bit, be more cautious. But once the FDA realized we were intending to reject it, they applied political pressure. At about 4 p.m., the head of the FDA’s Center for Drugs and Biologics asked permission to speak, which is extremely unusual. Usually they leave us alone. But he said to us, ‘Look, if you approve the drug, we can assure you that we will work together with Burroughs Wellcome and make sure the drug is given to the right people.’ It was like saying ‘please do it.’”
Brad Stone, FDA press officer, was at that meeting. He says he doesn’t recall that particular speech, but that there is nothing “unusual” about FDA officials making such speeches at advisory meetings. “There was no political pressure,” he says. “The people in that meeting approved the drug because the data the company had produced proved it was prolonging life. Sure it was toxic, but they concluded that the benefits clearly outweighed the risks.” The meeting ended. AZT, which several members of the panel still felt uncomfortable with and feared could be a time bomb, was approved.
Flash forward: August 17, 1989. Newspapers across America banner-headlined that AZT had been “proven to be effective in HIV antibody-positive, asymptomatic, and early ARC patients,” even though one of the panel’s main concerns was that the drug should only be used in a last-case scenario for critically-ill AIDS patients, due to the drug’s extreme toxicity. Dr. Anthony Fauci, head of the National Institutes of Health (NIH), was now pushing to expand prescription.
The FDA’s traditional concern had been thrown to the wind. Already the drug had spread to 60 countries and an estimated 20,000 people. Not only had no new evidence allayed the initial concerns of the panel, but the follow-up data, as Dr. Brook predicted, had fallen by the wayside. The beneficial effects of the drug had proven to be temporary. The toxicity, however, stayed the same.
The majority of those in the AIDS-afflicted and medical communities held the drug up as the first breakthrough on AIDS. For better or worse, AZT had been approved faster than any drug in FDA history, and activists considered it a victory. The price paid for the victory, however, was that almost all government drug trials, from then on, focused on AZT — while over 100 other promising drugs were left uninvestigated.
Burroughs Wellcome stock went through the roof when the announcement was made. At a price of $8,000 per patient per year (not including blood-work and transfusions), AZT is the most expensive drug ever marketed. Burroughs Wellcome’s gross profits for next year are estimated at $230 million. Stock market analysts predict that Burroughs Wellcome may be selling as much as $2 billion worth of AZT, under the brand name Retrovir, each year by the mid-1990s — matching Burroughs Wellcome’s total sales for all its products last year.
“Does AZT do anything? Yes, it does. But the evidence that it does something against HIV is really not there.”
AZT is the only antiretroviral drug that has received FDA approval for treatment of AIDS since the epidemic began ten years ago, and the decision to approve it was based on a single study that has long been declared invalid. The study was intended to be a “double-blind placebo-controlled study,” the only kind of study that can effectively prove whether or not a drug works. In such a study, neither patient nor doctor is supposed to know if the patient is getting the drug or a placebo. In the case of AZT, the study became unblinded on all sides, after just a few weeks.
Both sides contributed to the unblinding. It became obvious to doctors who was getting what because AZT causes such severe side effects that AIDS per se does not. Furthermore, a routine blood measure known as the MCV (mean corpuscular volume), which clearly shows who is on the drug and who is not, wasn’t whited out in the reports. Both of these facts were accepted and confirmed by both the FDA and Burroughs Wellcome, who conducted the study.
Many of the patients who were in the trial admitted that they had analyzed their capsules to find out whether they were getting the drug. If they weren’t, some bought the drug on the underground market. Also, the pills were supposed to be indistinguishable by taste, but they were not. Although this was corrected early on, the damage was already done. There were also reports that patients were pooling pills out of solidarity with one another. The study was so severely flawed that its conclusions must be considered, by the most basic scientific standards, unproven.
The most serious problem with the original study, however, is that it was never completed. Seventeen weeks into the study, when more patients had died in the placebo group, the study was stopped, five months prematurely, for “ethical” reasons: It was considered unethical to keep giving people a placebo when the drug might keep them alive longer. Because the study was stopped short, and all subjects were put on AZT, no scientific study can ever be conducted to prove unequivocally whether AZT does prolong life.
Dr. Brook, who voted against approval, warned at the time that AZT, being the only drug available for doctors to prescribe to AIDS patients, would probably have a runaway effect. Approving it prematurely, he said, would be like “letting the genie out of the bottle.”
Brook pointed out that since the drug is a form of chemotherapy, it should only be prescribed by doctors who have experience with chemotherapeutic drugs. Because of the most severe toxic effect of AZT — cell depletion of the bone marrow — patients would need frequent blood transfusions. As it happened, AZT was rampantly prescribed as soon as it was released, way beyond its purported parameters. The worst-case scenario had come true: Doctors interviewed by the New York Times later in 1987 revealed that they were already giving AZT to healthy people who had tested positive for antibodies to HIV.
The FDA’s function is to weigh a drug’s efficacy against its potential hazards. The equation is simple and obvious: A drug must unquestionably repair more than it damages; otherwise the drug itself may cause more harm than the disease it is supposed to fight. This is exactly what many doctors and scientists fear is happening with AZT.
“I personally do not prescribe AZT. I have continued to experience that people live longer who are not on it.”
AZT was singled out among hundreds of compounds when Dr. Sam Broder, the head of the National Cancer Institute (NCI), found that it “inhibited HIV viral replication in vitro.” AIDS is considered a condition of immune suppression caused by the HIV virus replicating and eating its way into T-4 cells, which are essential to the immune system. HIV is a retrovirus which contains an enzyme called reverse transcriptase that converts viral RNA to DNA. AZT was thought to work by interrupting this DNA synthesis, thus stopping further replication of the virus.
While it was always known that the drug was exceedingly toxic, the first study concluded that “the risk/benefit ratio was in favor of the patient.”
In the study that won FDA approval for AZT, the one fact that swayed the panel of judges was that the AZT group outlived the placebo group by what appeared to be a landslide. The ace card of the study, the one that canceled out the issue of the drug’s enormous toxicity, was that 19 persons had died in the placebo group and only one in the AZT group. The AZT recipients were also showing a lower incidence of opportunistic infections.
While this data staggered the panel that approved the drug, other scientists insisted that it meant nothing — because it was so shabbily gathered, and because of the unblinding. Shortly after the study was stopped, the death rate accelerated in the AZT group. “There was no great difference after a while,” says Dr. Brook, “between the treated and the untreated group.”
“That study was so sloppily done that it really didn’t mean much,” says Dr. Joseph Sonnabend, a leading New York City AIDS doctor. Dr. Harvey Bialy, scientific editor of the journal Biotechnology, is stunned by the low quality of science surrounding AIDS research. When asked if he had seen any evidence of the claims made for AZT, that it “prolongs life” in AIDS patients, Bialy said, “No, I have not seen a published study that is rigorously done, analyzed, and objectively reported.”
Bialy, who is also a molecular biologist, is horrified by the widespread use of AZT, not just because it is toxic, but because, he insists, the claims on which its widespread use is based are false. “I can’t see how this drug could be doing anything other than making people very sick,” he says.
The scientific facts about AZT and AIDS are indeed astonishing. Most ironically, the drug has been found to accelerate the very process it was said to prevent: the loss of T-4 cells.
“Undeniably, AZT kills T-4 cells [white blood cells vital to the immune system],” says Bialy. “No one can argue with that. AZT is a chain-terminating nucleotide, which means that it stops DNA replication. It seeks out any cell that is engaged in DNA replication and kills it. The place where most of this replication is taking place is in the bone marrow. That’s why the most common and severe side effect of the drug is bone marrow toxicity. That is why they [patients] need blood transfusions.”
AZT has been aggressively and repeatedly marketed as a drug that prolongs survival in AIDS patients because it stops the HIV virus from replicating and spreading to healthy cells. But, says Bialy: “There is no good evidence that HIV actively replicates in a person with AIDS, and if there isn’t much HIV replication to stop, it’s mostly killing healthy cells.”
University of California at Berkeley scientist Dr. Peter Duesberg drew the same conclusion in a paper published in the Proceedings of the National Academy of Sciences. Duesberg, whose paper addressed his contention that HIV is not a sufficient cause for AIDS, wrote: “Even if HIV were to cause AIDS, it would hardly be a legitimate target for AZT therapy, because in 70 to 100 percent of antibody-positive persons, proviral DNA is not detectable… and its biosynthesis has never been observed.”
As a chemotherapeutic drug, explained Duesberg, AZT “kills dividing blood cells and other cells,” and is thus “directly immunosuppressive.”
“The cell is almost a million-fold bigger target than the virus, so the cell will be much, much more sensitive,” says Duesberg. “Only very few cells, about one in 10,000, are actively making the virus containing DNA, so you must kill incredibly large numbers of cells to inhibit the virus. This kind of treatment could only theoretically help if you have a massive infection, which is not the case with AIDS. Meanwhile, they’re giving this drug that ends up killing millions of lymphocytes [white blood cells]. It’s beyond me how that could possibly be beneficial.”
“It doesn’t really kill them,” Burroughs Wellcome scientist Sandra Lehrman argues. “You don’t necessarily have to destroy the cell, you can just change the function of it. Furthermore, while the early data said that only very few cells were infected, new data says that there may be more cells infected. We have more sensitive detection techniques now.”
“Changes their function? From what — functioning to not functioning? Another example of mediocre science,” says Bialy. “The ‘sensitive detection technique’ to which Dr. Lehrman refers, PCR, is a notoriously unreliable one upon which to base quantitative conclusions.”
When specific questions about the alleged mechanisms of AZT are asked, the answers are long, contradictory, and riddled with unknowns. Every scientific point raised about the drug is eventually answered with the blanket response, “The drug is not perfect, but it’s all we have right now.” About the depletion of T-4 cells and other white cells, Lehrman says, “We don’t know why T-4 cells go up at first, and then go down. That is one of the drug mechanisms that we are trying to understand.”
When promoters of AZT are pressed on key scientific points, whether at the NIH, FDA, Burroughs Wellcome, or an AIDS organization, they often become angry. They cling desperately to the idea that the drug is “doing something,” even while irritably conceding that there are “mechanisms about the drug and disease we don’t understand.” It is as if, in the eye of the AIDS storm, the official, government-sanctioned position is immunized against critique. Skepticism and challenge, so essential to scientific progress and so prevalent in every other area of scientific endeavor, are not welcome in the AZT debate, where they are arguably needed more than anywhere else.
The results, finally and ironically, are what damns AZT.
The toxic effects of AZT, particularly bone marrow suppression and anemia, are so severe that up to 50 percent of all AIDS and ARC patients cannot tolerate it and have to be taken off it. In the approval letter that Burroughs Wellcome sent to the FDA, some 50 additional side effects of AZT, beyond the most common ones, were listed. These included: loss of mental acuity, muscle spasms, rectal bleeding, and tremors.
Anemia, one of AZT’s common side effects, is the depletion of red blood cells, and, according to Duesberg, “Red blood cells are the one thing you cannot do without. Without red cells, you cannot pick up oxygen.”
Fred, a person with AIDS, was put on AZT and suffered such severe anemia from the drug he had to be taken off it. In an interview in the AIDS handbook Surviving and Thriving With AIDS, he described what anemia feels like to editor Michael Callen: “I live in a studio and my bathroom is a mere five-step walk from my bed. I would just lie there for two hours; I couldn’t get up to take those five steps. When I was taken to the hospital, I had to have someone come over to dress me. It’s that kind of severe fatigue. The quality of my life was pitiful… I’ve never felt so bad… I stopped the AZT and the mental confusion, the headaches, the pains in the neck, the nausea, all disappeared within a 24-hour period.”
“I feel very good at this point,” Fred went on. “I feel like the quality of my life was a disaster two weeks ago. And it really was causing a great amount of fear in me, to the point where I was taking sleeping pills to calm down. I was so worried. I would totally lose track of what I was saying in the middle of a sentence. I would lose my directions on the street.”
“Many AIDS patients are anemic even before they receive the drug,” says Burroughs Wellcome’s Dr. Lehrman, “because HIV itself can infect the bone marrow and cause anemia.”
This argument betrays a bizarre reasoning. If AIDS patients are already burdened with problems such as immune suppression, bone marrow toxicity, and anemia, is compounding these problems an improvement?
“Yes, AZT is a form of chemotherapy,” says the man who invented the compound a quarter-century ago, Jerome Horwitz. “It is cytotoxic, and as such, it causes bone marrow toxicity and anemia. There are problems with the drug. It’s not perfect. But I don’t think anybody would agree that AZT is of no use. People can holler from now until doomsday that it is toxic, but you have to go with the results.”
The results, finally and ironically, are what damns AZT. Several studies on the clinical effects of AZT — including the one that Burroughs Wellcome’s approval was based on — have drawn the same conclusion: that AZT is effective for a few months, but that its effect drops off sharply after that. Even the original AZT study showed that T-4 cells went up for a while and then plummeted. HIV levels went down, and then came back up. This fact was well-known when the advisory panel voted for approval. As panel member Dr. Stanley Lemon said in the meeting, “I am left with the nagging thought that after seeing several of these slides, that after 16 to 24 weeks — 12 to 16 weeks, I guess — the effect seems to be declining.”
A follow-up meeting, two weeks after the original Burroughs Wellcome study, was scheduled to discuss the long-range effects of AZT and the survival statistics. As one doctor present at that meeting in May 1988 recalls, “They hadn’t followed up the study. Anything that looked beneficial was gone within half a year. All they had were some survival statistics averaging 44 weeks. The p24 didn’t pan out and there was no persistent improvement in T-4 cells.”
HIV levels in the blood are measured by an antigen called p24. Burroughs Wellcome made the claim that AZT lowered this level, that is, lowered the amount of HIV in the blood. At the first FDA meeting, Burroughs Wellcome emphasized how the drug had “lowered” the p24 levels; at the follow-up meeting they didn’t even mention it.
As that meeting was winding down, Dr. Michael Lange, head of the AIDS program at St. Luke’s-Roosevelt Hospital in New York, spoke up about this. “The claim of AZT is made on the fact that it is supposed to have an antiviral effect,” he said to Burroughs Wellcome, “and on this we have seen no data at all… Since there is a report in the Lancet [a leading British medical journal] that after 20 weeks or so, in many patients p24 came back, do you have any data on that?”
“What counts is the bottom line,” one of the scientists representing Burroughs Wellcome summed up, “the survival, the neurologic function, the absence of progression and the quality of life, all of which are better. Whether you call it better because of some antiviral effect, or some other antibacterial effect, they are still better.”
Dr. Lange suggested that the drug may be effective in the same way a simple anti-inflammatory, such as aspirin, is effective. An inexpensive, nontoxic drug called indomethacin, he pointed out, might serve the same function, without the devastating side effects.
One leading AIDS researcher, who was part of the FDA approval process, says today: “Does AZT do anything? Yes, it does. But the evidence that it does something against HIV is really not there.”
“There have always been drugs that we use without knowing exactly how they work,” says Nobel Prize winner Walter Gilbert. “The really important thing to look at is the clinical effect. Is the drug helping or isn’t it?”
A physician with extensive experience with AIDS patients who asked to remain anonymous told SPIN, point blank: “I personally do not prescribe AZT. I have continued to experience that people live longer who are not on it.”
“I’m living proof that AZT works,” says one person with ARC on AZT. “I’ve been on it for two years now, and I’m certainly healthier than I was two years ago. It’s not a cure-all, it’s not a perfect drug, but it’s effective. It’s slowing down the progression of the disease.”
“Sometimes I feel like I’m swallowing Drano,” says another. “I mean, sometimes I have problems swallowing. I just don’t like the idea of taking something that foreign to my body. But every six hours, I’ve got to swallow it. Until something better comes along, this is what is available to me.”
“I am absolutely convinced that people enjoy a better quality of life and survive longer who do not take AZT,” says Gene Fedorko, President of Health Education AIDS Liaison (HEAL). “I think it’s horrible the way people are bullied by their doctors to take this drug. We get people coming to us shaking and crying because their doctors said they’ll die if they don’t take AZT. That is an absolute lie.” Fedorko has drawn his conclusion from years of listening to the stories of people struggling to survive AIDS at HEAL’s weekly support group.
“I wouldn’t take AZT if you paid me,” says Michael Callen, cofounder of New York City’s PWA coalition, Community Research Initiative, and editor of several AIDS journals. Callen has survived AIDS for over seven years without the help of AZT. “I’ve gotten the s–t kicked out of me for saying this, but I think using AZT is like aiming a thermonuclear warhead at a mosquito. The overwhelming majority of long-term survivors I’ve known have chosen not to take AZT.”
“I’m convinced that if you gave AZT to a perfectly healthy athlete he would be dead in five years.”
The last surviving patient from the original AZT trial, according to Burroughs Wellcome, died recently. When he died, he had been on AZT for three and one-half years. He was the longest surviving AZT recipient. The longest surviving AIDS patient overall, not on AZT, has lived for eight and one-half years.
An informal study of long-term survivors of AIDS followed 24 long-term survivors, all of whom had survived AIDS for more than six years. Only one of them had recently begun taking AZT.
In the early days, AZT was said to extend lives. In actual fact, there is simply no solid evidence that AZT prolongs life.
“I think AZT does prolong life in most people,” says Dr. Bruce Montgomery of the State University of New York at Stony Brook, who is completing a study on AZT. “There are not very many long-term survivors, and we really don’t know why they survive. It could be luck. But most people are not so lucky.”
“AZT does seem to help many patients,” says Dr. Bernard Bahari, a New York City AIDS physician and researcher, “but it’s very hard to determine whether it actually prolongs life.”
“Many of the patients I see choose not to take AZT,” says Dr. Don Abrams of San Francisco General Hospital. “I’ve been impressed that survival and lifespan are increasing for all people with AIDS. I think it has a lot to do with aerosolized pentamidine [a drug that treats Pneumocystis carinii pneumonia]. There’s also the so-called plague effect, the fact that people get stronger and stronger when a disease hits a population. The patients I see today are not as fragile as the early patients were.”
“Whether you live or die with AIDS is a function of how well your doctor treats you, not of AZT,” says Dr. Joseph Sonnabend, one of New York City’s first and most reputable AIDS doctors, whose patients include many long-term survivors, although he has never prescribed AZT. Sonnabend was one of the first to make the simple observation that AIDS patients should be treated for their diseases, not just for their HIV infection.
Several studies have concluded that AZT has no effect on the two most common opportunistic AIDS infections, Pneumocystis carinii pneumonia (PCP) and Kaposi’s sarcoma (KS). The overwhelming majority of AIDS patients die of PCP, for which there has been an effective treatment for decades. This year, the FDA finally approved aerosolized pentamidine for AIDS. A recent Memorial Sloan Kettering study concluded the following: By 15 months, 80 percent of people on AZT not receiving pentamidine had a recurrent episode of pneumocystis. Only 5 percent of those people who did get pentamidine had a recurring episode. “All those deaths in the AZT study were treatable,” Sonnabend says. “They weren’t deaths from AIDS, they were deaths from treatable conditions. They didn’t even do any autopsies for that study. What kind of faith can one have in these people?”
“If there’s any resistance to AZT in the general public at all, it’s within the gay community of New York,” says the doctor close to the FDA approval, who asked to remain anonymous. “The rest of this country has been brainwashed into thinking this drug really does that much. The data has all been manipulated by people who have a lot vested in AZT.”
“If AIDS were not the popular disease that it is — the money-making and career-making machine — these people could not get away with this kind of shoddy science,” says Bialy. “In all my years in science I have never seen anything this atrocious.” When asked if he thought it was at all possible that people have been killed as a result of AZT poisoning rather than AIDS he answered: “It’s more than possible.”
August 17, 1989: The government has announced that 1.4 million healthy, HIV antibody-positive Americans could “benefit” from taking AZT, even though they show no symptoms of disease. New studies have “proven” that AZT is effective in stopping the progression of AIDS in asymptomatic and early ARC cases. Dr. Fauci, the head of the National Institute of Allergy and Infectious Diseases (NIAID), proudly announced that a trial that had been going on for “two years” had “clearly shown” that early intervention will keep AIDS at bay. Anyone who has antibodies to HIV and fewer than 500 T-4 cells should start taking AZT at once, he said. That is approximately 650,000 people. 1.4 million Americans are assumed HIV antibody-positive, and eventually all of them may need to take AZT so they don’t get sick, Fauci contended.
The leading newspapers didn’t seem to think it unusual that there was no existing copy of the study, but rather a breezy two-page press release from the NIH. When SPIN called the NIH asking for a copy of the study, we were told that it was “still being written.”
We asked a few questions about the numbers. According to the press release, 3,200 early ARC and asymptomatic patients were divided into two groups, one AZT and one placebo, and followed for two years. The two groups were distinguished by T-4 cell counts; one group had less than 500, the other more than 500. These two were then divided into three groups each: high-dose AZT, low-dose AZT, and placebo. In the group with more than 500 T-4 cells, AZT had no effect. In the other group, it was concluded that low-dose AZT was the most effective, followed by high-dose. All in all, 36 out of 900 developed AIDS in the two AZT groups combined, and 38 out of 450 in the placebo group. “HIV-positive people are twice as likely to get AIDS if they don’t take AZT,” the press declared.
However, the figures are vastly misleading. When we asked how many patients were actually enrolled for a full two years, the NIH said they did not know, but that the average time of participation was one year, not two.
“It’s terribly dishonest the way they portrayed those numbers,” says Dr. Sonnabend. “If there were 60 people in the trial those numbers would mean something, but if you calculate what the percentage is out of 3,200, the difference becomes minute between the two groups. It’s nothing. It’s hit or miss, and they make it look like it’s terribly significant.”
The study boasted that AZT is much more effective and less toxic at one-third the dosage that has been used for the past three years. That’s the good news. The bad news is that thousands have already been walloped with 1,500 milligrams of AZT and possibly even died of toxic poisoning — and now we’re hearing that one-third of the dose would have done?
With all that remains so uncertain about the effects of AZT, it seems criminal to advocate expanding its usage to healthy people, particularly since only a minuscule percentage of the HIV-infected population have actually developed ARC or AIDS.
Burroughs Wellcome has already launched testing of AZT in asymptomatic hospital workers, pregnant women, and children, who are getting liquid AZT. The liquid is left over from an aborted trial, and is given to the children because they can mix it with water — children don’t like to swallow pills. It has also been proposed that AZT be given to people who do not yet even test positive for HIV antibodies, but are “at risk.”
“I’m convinced that if you gave AZT to a perfectly healthy athlete,” says Fedorko, “he would be dead in five years.”
“This is such shoddy science it’s hard to believe nobody is protesting.”
In December 1988, the Lancet published a study that Burroughs Wellcome and the NIH do not include in their press kits. It was more extensive than the original AZT study and followed patients longer. It was not conducted in the United States, but in France, at the Claude Bernard Hospital in Paris, and it concluded the same things about AZT that Burroughs Wellcome’s study did, except that Burroughs Wellcome called their results “overwhelmingly positive” and the French doctors called theirs “disappointing.” The French study found, once again, that AZT was too toxic for most to tolerate, had no lasting effect on HIV blood levels, and left the patients with fewer T-4 cells than they started with. Although they noticed a clinical improvement at first, they concluded that “by six months, these values had returned to their pretreatment levels, and several opportunistic infections, malignancies, and deaths occurred.”
“Thus the benefits of AZT are limited to a few months for ARC and AIDS patients,” the French team concluded. After a few months, the study found, AZT was completely ineffective.
The news that AZT will soon be prescribed to asymptomatic people has left many leading AIDS doctors dumbfounded and furious. Every doctor and scientist I asked felt that it was highly unprofessional and reckless to announce a study with no data to look at, making recommendations with such drastic public health implications. “This simply does not happen,” says Bialy. “The government is reporting scientific facts before they’ve been reviewed? It’s unheard of.”
“It’s beyond belief,” says Dr. Sonnabend in a voice tinged with desperation. “I don’t know what to do. I have to go in and face an office full of people asking for AZT. I’m terrified. I don’t know what to do as a responsible physician. The first study was ridiculous. Margaret Fischl, who has done both of these studies, obviously doesn’t know the first thing about clinical trials. I don’t trust her. Or the others. They’re simply not good enough. We’re being held hostage by second-rate scientists. We let them get away with the first disaster; now they’re doing it again.”
“It’s a momentous decision to say to people, ‘If you’re HIV-positive and your T-4 cells are below 500, start taking AZT,’” says the AIDS doctor who wished to remain anonymous. “I know dozens of people that I’ve seen personally every few months for several years now who have been in that state for more than five years, and have not progressed to any disease.”
“I’m ashamed of my colleagues,” Sonnabend laments. “I’m embarrassed. This is such shoddy science it’s hard to believe nobody is protesting. Damned cowards. The name of the game is to protect your grant, don’t open your mouth. It’s all about money… it’s grounds for just following the party line and not being critical, when there are obviously financial and political forces driving this.”
When Duesberg heard the latest announcement, he was particularly stunned by the reaction of Gay Men’s Health Crisis President Richard Dunne, who said that GMHC now urged “everybody to get tested,” and of course those who test positive to go on to AZT. “These people are running into the gas chambers,” says Duesberg. “Himmler would have been so happy if only the Jews were this cooperative.”
* = This sentence was changed to correct an error in the original version of this article, which wrongly stated that the FDA had approved Thalidomide.
The rise and fall of AZT: It was the drug that had to work. It brought hope to people with HIV and Aids, and millions for the company that developed it. It had to work. There was nothing else. But for many who used AZT – it didn’t
RUMOURS about the drug had been circulating since early 1985 when word came from America that a company in North Carolina had found a compound that was effective against HIV – at least in a Petri dish. Two years later, by the time AZT had been licensed for use, demand for it had grown to gigantic proportions.
By then, Aids patients had grown so desperate that they would sample any of the bootlegged underground therapies, some of which were probably life-threatening. With the arrival of AZT, doctors who had been powerless for so long against a syndrome about which they knew so little, at last had something they could give their patients that had passed stringent official tests.
In March 1987, when AZT was available on prescription for the first time, almost everyone with Aids wanted to take it, as did many who had tested positive for HIV. One of these was Michael Cottrell, a gay Englishman. He had tested positive for HIV in 1985 at the age of 22. He took AZT for several months in the late Eighties and suffered severe side-effects from the drug: chronic headaches and nausea, debilitating muscle fatigue. Cottrell felt much worse on AZT than he did off it. But he persevered because it seemed AZT was the only anti-Aids drug there was.
So Cottrell took it early in his infection: after all, if AZT was judged to be effective in treating Aids, then perhaps, it was thought, it would also benefit those who took it before they became ill. AZT spelt hope: psychologically it served to dispel despair. It was never claimed to be a cure, but it did claim to keep you alive longer, and in that extra time it bought, who knew what would happen? Maybe a cure would be found. Maybe a vaccine. Maybe other drugs would be developed to fight the disease, too.
Cottrell still has boxes of AZT capsules at home. He gave up on it after several months, because he couldn’t stand how ill he was feeling on the drug; he felt as though his immune system was being damaged rather than strengthened; he believed he had never encountered a drug as toxic as AZT.
Cottrell knew the drug didn’t work for him, but he believed he might have been one of the unlucky ones, like people who react badly to penicillin. Then a month ago he woke up to the news that the drug didn’t work on HIV at all, and that all his suffering had been avoidable.
Concorde, an Anglo-French programme, was the biggest clinical trial of AZT ever conducted: 1,749 patients over three years. It did not examine how effective AZT was in treating people who were seriously ill with Aids but, just as important, it looked at how effective the drug was in treating the millions of people with HIV, before they became unwell and showed Aids symptoms. Preliminary results of the trial were published in a letter in the Lancet, and made headlines worldwide. The results suggested that early intervention with AZT – for people who were HIV-positive but had not yet developed any symptoms of Aids – was a waste of time. The study, organised by the British Medical Research Council and the equivalent body in France, reported that it made no difference to either mortality rates or disease progression if one took AZT before the onset of Aids.
In a ‘blind’ test, AZT was given to 877 people and 872 were given a placebo. As soon as a patient developed any Aids symptoms, he or she (15 per cent were women) would be offered ‘open-label’ AZT. The mortality rates appeared to be shocking: over the three years of the trial, there were 79 Aids-related deaths in the AZT group, but only 67 in the placebo group. The researchers explained that among so many patients this figure was not statistically significant, but if you were HIV-positive and read of this in the newspapers, you were bound to question all the great claims that had been made for AZT. More people got Aids and died on Concorde than on any previous trial.
There were other causes for concern. Those on AZT developed more side-effects than those on the placebo. The results of the tests also cast doubt on one of the fundamental ways we measure a person’s immunity to disease. Those given AZT early increased their ‘CD4’ or ‘T4’ cell count; these are the cells attacked by HIV, and their numbers drop as the disease spreads. But the fact that, even with this higher count, patients did not live longer or develop the disease more slowly, struck at one of the basic tenets of Aids research.
Cottrell told the news to his 28-year-old partner Karl Burge, who had been diagnosed as HIV-positive four years ago, and they decided to take action. But what could they do? They had already joined protests against Wellcome plc, the British company that made AZT and had reaped millions in sales and share profits. Wellcome executives had listened to their complaints, and had admitted to certain levels of toxicity in AZT, but claimed that their product still had great beneficial effects. They were not readily going to halt production of the drug that last year made them £213m, their second biggest earner.
So Cottrell and his friends selected a new target, the Terrence Higgins Trust. This was a strange choice: the trust, Britain’s most prominent Aids charity over the past 10 years, is staffed by dedicated professionals and volunteers providing a large range of support and information about all aspects of Aids and HIV; it developed the caring ‘buddy’ system; it produced information for schools; it sat on many Aids research panels and often met government departments.
So what had it done wrong? It had taken money from Wellcome plc and included positive information about AZT in its many leaflets and documents. Cottrell and his friends felt they were being betrayed by the very organisation that they had believed existed to act in their best interests; they felt that what was once an invaluable institution was acting as a mouthpiece for a multinational pharmaceuticals company.
Last week, Cottrell and Burge were still pitched outside the Terrence Higgins Trust office in central London, four weeks after their protest began. On Wednesday they were arrested and charged with a public order offence after a member of the trust called the police. The protest is growing by the week. They have been joined by John Stevens, diagnosed HIV-positive more than eight years ago, who also had bad experiences with AZT, and by Pierre Hardy, diagnosed HIV-positive four years ago when he was 27, who had felt devastated by its effects. Many other protesters carry placards, collect signatures, hand out leaflets. You will not find a more potent symbol of the complex story of AZT, a story of how the struggle to find a ‘magic bullet’ to help millions of people has degenerated into a saga of distrust, confusion, and anger. It is a story of health and illness, but it is also a story of scientific ambition, secrecy and political pressure, and of the amounts of money that can be generated when a lethal virus turns into a worldwide epidemic.
IN 1964, Jerome Horwitz was working in his laboratory at the Michigan Cancer Foundation when he had what he hoped was a brilliant idea. At 45, Dr Horwitz was the foundation’s director of chemistry, and although not in the scientific premier league, was a respected local researcher with his own lab and assistants. He had spent much of the previous decade doing what many of the world’s leading scientists had done – working on a cure, or at the very least an effective treatment, for cancer.
He developed a theoretical solution: what was needed was a chemical that would insert a ‘phoney’ compound into the DNA ‘building block’ of a cell to prevent its replication. After years of research, Dr Horwitz came up with just such a compound: azidothymidine, or AZT.
He tried his new compound on leukaemic mice, but it had no effect. Horwitz didn’t know why, but AZT simply didn’t work.
Horwitz never became famous. Recently he said AZT ‘was a terrible disappointment . . . we dumped it on the junkpile. I didn’t keep the notebooks.’ The compound remained ‘on the shelf’, occasionally tried by other researchers but always found to be useless. There was no reason to patent it. But 20 years later, Burroughs Wellcome brought it back to life.
THE WELLCOME group was founded in London by two Americans in 1880. Its first significant achievement was the creation of the tablet – previously most medication had been administered in powder form. In the 1930s the group was split into two distinct parts: the Wellcome Trust, a large charity which devoted its income to scientific research and the maintenance of an institute and library concerned with the history of medicine; and the Wellcome Foundation Ltd, a profit-making pharmaceuticals company that was called Burroughs Wellcome in the United States. In the course of its research, Wellcome employees have won five Nobel prizes.
By 1980, Wellcome had specialised in the treatment of viruses for more than 15 years, and its anti-viral drugs accounted for the bulk of its income. In that year, David Barry, a leading researcher at Burroughs Wellcome in the US, noticed that demand for its drug Septra – a drug that Wellcome had helped to develop a few years earlier to combat a rare form of pneumonia – was suddenly on the increase. Previously this pneumonia, known as PCP, was prevalent only in children with leukaemia, but now many doctors were requesting it for adult males. Most of these men were gay, and living in New York and San Francisco.
Two years later, another new Wellcome drug, Zovirax, was in great demand among the same group of people. Zovirax was an anti-herpes treatment. Dr Barry was very disturbed by the sudden demand for these two drugs.
Aids (Acquired Immune Deficiency Syndrome) was first classified as a new disease in 1981, but it was not until 1984 that the cause was identified as HIV (Human Immunodeficiency Virus). This cause has since been challenged by several prominent molecular biologists, but it remains the cornerstone of Aids research. And if any company was ideally equipped to conduct research into combating a new virus, it was Wellcome.
It was only natural for Barry to devote much of the company’s research resources to fight HIV. No one knew how widespread the virus or Aids was or would become. In 1984, only about 3,000 people had been diagnosed with Aids, but some early forecasts were terrifying: millions of people might already be infected, and hundreds of thousands could die within the next few years. Any scientist could see that Aids was potentially a career-making race to the Nobel prize. Millions might be made from a successful treatment.
After a few years of government inactivity – shameful years in which this new disease was virtually ignored – political ambition added to the desire to find a treatment. Health departments noticed that it wasn’t just homosexuals who were being struck down, but also hundreds of haemophiliacs and drug users. A certain amount of official panic took hold: by the time Rock Hudson died in the summer of 1985, it was clear that anyone – even film stars – could be in the frontline.
According to Wellcome’s own three-page account, research into HIV began in June 1984. During mass testing of scores of anti-viral compounds, a substance known at first only as Compound S was found to inhibit viruses in animal cells. Compound S was AZT, a resynthesised version of what Horwitz had made 20 years before (Wellcome credits Horwitz in its account, but spells his name wrong).
In November 1984, according to the Wellcome account, the company sent samples of AZT to Duke University in North Carolina, the Food and Drug Administration (FDA) and the National Cancer Institute for independent testing, and within a few weeks the results confirmed what Wellcome already believed: that the stuff worked against HIV in test-tubes under laboratory conditions. Wellcome had already progressed further than Horwitz, but the real test – its effect on humans – was fraught with danger.
But first there is another account of the development of AZT to consider. A US government official named Sam Broder believes he has far more claim to being ‘Mr AZT’ than anyone at Burroughs Wellcome. Broder, the director of the National Cancer Institute, claims that Burroughs Wellcome showed little interest in developing an anti-Aids drug.
Broder went on a tour of pharmaceuticals companies towards the end of 1984, imploring them to send any possible anti-viral compounds to his lab for testing in safe conditions. ‘I went to one prestigious company, hat in hand,’ he told the business writer Bruce Nussbaum, whose book, Good Intentions, traces a history of the search for anti-Aids drugs. ‘I got about one minute and thirty seconds of a high-ranking officer’s time. It was very disappointing for me. It was emblematic of the issue. There was no real interest in it.’
Broder then went to Burroughs Wellcome. He says: ‘They made it clear that on the basis of 3,000 patients, there was no way they could practically get involved.’ Broder says he then became abrasive. ‘As I left, I said, ‘You know, we’re going to have more than 3,000 cases. It is going to be commercially viable for you . . .’ ‘
Whoever pushed who, the drug came through. When Broder found that the AZT sent to him by Burroughs Wellcome in November 1984 worked against the virus, he assured the company that every effort would be made to get this great new drug to dying patients as soon as possible. The FDA’s stringent testing requirements mean that most new drugs take between eight and 10 years to pass from development to the marketplace. AZT was pushed through in just 20 months.
This could have been the early history of almost any drug; the difference is, during what would normally have been an eight-year test period, for six of those years the drug was already on the market. At a time of desperation, this drug looked like the one that would restore hope. The National Cancer Institute had previously tried one other therapy, Suramin, which proved to be toxic in early tests, but AZT appeared to be far less poisonous. And so it was put on the ‘fast track’: the testing of some other drugs for less life-threatening illnesses was put aside; AZT was given top priority, an all-or-bust thing. But could any drug live up to the boundless hopes pinned on AZT?
THIS IS how AZT is supposed to work against HIV. HIV enters body cells, usually T4 white blood cells that play a crucial role in the orchestration of the body’s immune system. HIV is one of a group of viruses known as retroviruses, which means that, unlike most living things that store their genetic information as DNA, HIV stores it as RNA. Before HIV can replicate, it must convert its RNA code to DNA by use of a special enzyme. It is during this conversion process that AZT works. When AZT enters the body, it is transformed into a molecule that closely resembles one of the building blocks of DNA. During the process of HIV conversion, this molecule is incorporated mistakenly into the DNA. The addition of this ‘phoney’ molecule makes the addition of further building blocks impossible and halts replication of the virus. It’s a form of chemotherapy. It worked fine under a microscope.
The first human tests were in two phases. The first examined whether AZT could be tolerated in the body at all, and whether it entered the brain, crossing the ‘blood-brain barrier’; to know this was important, because a common Aids symptom is dementia. The first Aids patient was injected with AZT in July 1985. This test concluded that the blood-brain barrier was crossed, and that although there were levels of toxicity detected, these were deemed to be safe.
The second phase of the tests, the final hurdle to the granting of a licence for mass production, was a shambles. It was set up six months later to establish whether AZT would combat Aids. This test, overseen by the Food and Drug Administration, involved 282 patients, all of them already ill with Aids or Arc (Aids-related complex). It was to be a placebo test, conducted over 24 weeks. It was to be a ‘double-blind’ study in which neither patient nor doctor knew whether the capsules being taken were AZT or starch. (But before the tests could begin, Wellcome had to produce large quantities of AZT, and found it couldn’t do it. It had run out of one crucial ingredient: herring sperm. Finally, Wellcome bought it in bulk from another company.)
At a press conference after the tests in September 1986, Wellcome reported that they had been a considerable success, such a success that the 24-week trial had been halted after 16 weeks for ‘ethical’ reasons. Mortality rates for people taking AZT were staggeringly lower than those taking the placebo; there had been 19 deaths in the placebo group of 137 people, but only one in the AZT group of 145. Those on AZT also had a decreased number of opportunistic infections and showed improvement in weight gain and T4 cell counts. Wellcome agreed in response to pressure from some sectors of the gay community that if AZT was effective, then dying people should be taken off the placebo at once.
No one claimed it was a cure, but there was huge relief that a breakthrough had been made. There had been much embarrassment when it became known that Rock Hudson had attended the Pasteur Institute in France for treatment; now at last America was showing those foreigners a thing or two. Robert Windom, assistant health secretary, said that ‘treatment with AZT prolongs survival of persons with Aids’. The results were ‘exciting’.
It was not suitable for everyone, but it was the best thing yet. In fact, it was the only thing. Last year, interviewed in the Wellcome in-house magazine, David Barry said that ‘the staff at Wellcome can tell our children, grandchildren and great-grandchildren that we were there, that we made a difference’. When it was shown that AZT worked, ‘we . . . first had a frenzied, cheerful celebration, and then a very quiet one. The longer we considered the global implications, the greater the accomplishment we realised Wellcome had made in the control of the HIV epidemic.’
But a few months after AZT was made available, John Lauritsen, a journalist working on the gay newspaper New York Native, obtained test documents through the Freedom of Information Act that suggested that many rules had been broken in the trials. The trial had been ‘unblinded’ within weeks: some patients claimed they could tell what they were taking by taste; others were so keen to have AZT that they pooled their treatment with other patients to increase their chances of receiving the drug. The documents showed that almost half the AZT patients had received numerous blood transfusions in the course of the trial, because of damage to their bone marrow and immune systems; and that a few had to be taken off AZT altogether.
What happened after the trial ended suggested something more alarming about AZT. After 16 weeks, one AZT patient was dead, compared to 19 placebo patients; a week later two more patients on AZT had died, compared to four more on the placebo. The ratio had switched from 19:1 to 23:3, which suggested AZT might only be effective for a limited time.
If the trial had continued, the ratio might have narrowed even more. The tests would probably still have shown that AZT has some benefits for very ill patients, but with hindsight it is alarming that a new drug was allowed to be released with so much left to prove. People at Wellcome now put it down to the mood and the severe pressure of the times. Dr Trevor Jones, Director of Research at Wellcome, who has been involved in the development of AZT from the beginning, acknowledged that the trials were subject to extraordinary pressures. ‘Many of these accusations (about the breakdown of trial protocol) took place, not at that stage, but later on, when the drug was showing benefit in a less sick population.
All sorts of things we heard stories about, and some of them I think we can confirm from our data. Patients would go to their doctor, get their treatments, and rather than risk the uncertainty (of receiving the placebo), they’d put the two together, mix them and divide them by half. We know this, because people who were supposed to be on the placebo already had drug levels in them.’
Much of the pressure came from people with HIV and Aids, and their carers, who wanted the drug released immediately. It was unacceptable to administer a placebo, they argued, if AZT worked. And there was no point having a drug released on the market in 10 years – by that time hundreds of thousands would be dead.
Burroughs Wellcome and many other independent research institutions would spend every subsequent year trying to supplement their data on AZT, trying to find out all the things that would normally be known about a drug before it hit the market. In these later years AZT was to become for many people the symbol of all that was wrong with Aids research. Once AZT was shown to have worked, almost all available funds were channelled to support its development; other potential treatments, along with any doubts that HIV was the cause of Aids, were swept aside.
BUT IN 1986, AZT was unstoppable. It suited the FDA, because it showed the administration was doing something. It suited Wellcome, because it now had a patent on AZT (and by 1986, with the epidemic increasing alarmingly, there was no doubt that the financial rewards would be enormous). It suited doctors, because they believed they could help their patients. And it certainly suited people with Aids. Some people had doubts, but hell, if you were ill and dying you wanted to believe. After all the despair and uncertainty, people in authority were saying ‘take this, it’ll do you good’.
Cottrell was one of the first people to take AZT in Britain. He was prescribed it in 1986, before it was widely available, when he was 23.
‘I had recently been diagnosed HIV-positive, and I went into a panic. I thought I was going to die. I remembered something about this drug coming from America and everyone clamouring to get it. I was perfectly healthy. My boyfriend’s blood count was quite low, and he was prescribed it by St Stephen’s Hospital, and I took it too. Intuitively, I didn’t think it was doing me any good. I was prescribed it three times over a period of three years, and I took it out of fear. I was first prescribed 1,200mg a day, and then 500mg, but I still felt bad, even on the lower dose. I had nausea and headaches and muscle fatigue.’
Cottrell took it every four hours, which meant he had to have a bleeper that woke him at three or four o’clock every morning. (People joked that the real Aids money lay in making these bleepers; in New York in the late Eighties, opera performances were punctuated by bleeps.) Cottrell stopped taking AZT after a few weeks, but then he got scared, and began taking it once more. ‘I got my drugs every two weeks – a big plastic bagful. I felt that I was carrying my life around in that bag.’
His friend, Pierre Hardy, was diagnosed HIV in 1989, when he was 28. At a specialist clinic he was given a sheet of paper which explained that AZT was the most efficient treatment, but also that it hadn’t been around long enough for anyone to know the long-term effects. Like most people in his position, he said he’d try anything, and he was prescribed 500mg a day.
‘My T4 count went up along with my general health in the first year, and everything settled down. I had been on AZT for three years, and my T4 count was levelling between 400 and 600 (an average T4 count in healthy adults is between 800 and 1,000). And then last year I started to get sick. I had repeated chest infections, and in November 1992 I had a stroke. I was hospitalised in a specialist ward. I asked them for my T4 count, and when they came back, they were uncomfortable about it. My T4 count was 90. I thought I was finished.
‘When I got home I started to review the whole thing, the whole HIV theory. I threw away all the pills I was taking – I was taking seven every morning and evening. I started to change my diet, and then I went back to my doctor. When I had my new T4 count it was 545. I’ve had three migraines since January, a little bit of asthma coming back, but basically I feel much better. If I’d continued to believe in the traditional medicine system I would have been dead either this year or next year.’
Two weeks ago Hardy met a volunteer with the Terrence Higgins Trust, who told him that he and his boyfriend were taking AZT and it was working like a dream.
‘I asked him how long they were on it. He said four months. I said that that was the trap that everyone was falling into. The AZT will work for you for a little while, for the maximum of one year, as it did for me, and afterwards the damage became visible.’
Most people with Aids, and many with asymptomatic HIV, take or have taken AZT. Other drugs have emerged in the past few years that work in a similar way – DDC (produced by the Swiss company Hoffmann-La Roche) and DDI (made by the American company Bristol-Myers Squibb), but AZT is still the market leader. It is hard to think of another product that is so dominant in its field. You read the showbiz autobiographies and those three little letters snap out of the page.
Earvin ‘Magic’ Johnson, the basketball star who tested HIV-positive in October 1991, was advised to take AZT immediately. He agreed. ‘There was a lot of public interest in the fact that I was taking AZT, which was originally used only in the later stages of the illness,’ he explained in My Life, his autobiography. ‘These days it’s used as a preventative, but not everybody knew that. That may be why some people, including a few reporters, concluded that I was sicker than I actually was.’ People wrote to Johnson telling him that AZT was not the answer. Somebody advised him to drink all his blood and replace it with new blood. ‘Even now I can’t go anywhere without somebody coming up and saying, ‘I know this friend who knows this doctor who has a cure’.’
Rudolf Nureyev, who died in January, began taking AZT in 1988. ‘AZT was just beginning to be used in France,’ said Michel Canesi, his doctor. ‘I didn’t want to give it to him straight away, because I was worried that the side-effects would hamper him (Nureyev was still dancing at this time). Rudi lost his temper and said: ‘I want this medicine.’ I replied that there hadn’t been long enough to judge the results. But I had to give in and prescribe it – he was so insistent. But he didn’t take it regularly. He went off every time with tons of drugs, and every time I went to see him I found unused packets all over the place.’
The film-maker Derek Jarman, who was diagnosed HIV-positive in 1986, has found AZT beneficial. ‘It works – it holds everything up. It stops the virus replicating. At the beginning they gave people much too massive doses, which affected us physically. I had no recognisable toxic side-effects from it. I began taking it in September 1990, I think, and I came off it last August.
‘I was invited by my doctor to make up my mind whether I took the drug or not, so I rang up various people in America and the general advice was to take it – and this advice was from quite radical people, not people in with the Wellcome Foundation.
‘I came off it because my doctor said that my (T4) count was down. We’ve never discussed it since. He just suddenly said, ‘I think you’ve had enough AZT, Derek’, and I very much trust him, he’s a brilliant doctor. The whole thing is so complicated, because I took a lot of other drugs as well. I had to have suppressants for TB, toxoplasmosis and PCP. And then obviously if I got an infection there was fluconazole and all of that area. And then at a certain point they added hydrocortisone and fludrocortisone to keep my energy up.’
Jarman has recently been in hospital. ‘At the moment I’m actually on nothing. I’ve had a skin complaint and they decided it would be very sensible to take me off all my pills, and then go back on the drugs to see if they were causing the skin complaint. They can obviously play around with the drugs.
‘My feeling about AZT is that I’m glad I took it, even though I can’t prove to you that it did anything. You can say that if it helps someone psychologically then it must be doing some good. I think the doctors generally feel that it does some good. But how do you know?’
FINANCIALLY, Wellcome plc has done extremely well out of AZT. Retrovir, the drug’s brand name, accounts for more than 13 per cent of its total income, and yielded pounds 213m last year. As the only big earner to have been launched by the company in the past decade, the continued success of AZT is crucial to its growth. The company will be well aware that at the end of last year the World Health Organisation estimated that about 13 million men, women and children have been infected with HIV since the start of the pandemic. (A large proportion of these cases are in sub-Saharan Africa and South and South-east Asia, where AZT and other anti-Aids treatments are unlikely to be available or affordable; the figure for HIV infection in the Americas and Western Europe is estimated at 2.5 million.)
Part of the Wellcome Foundation was floated on the stock market in 1986, the year of the AZT breakthrough. Subsequent rises in share prices have been directly linked to the fortunes of the drug and the results of new trials. In February 1987, the share price jumped 73.5p to 374.5p on the news that AZT would be widely available in the US at dollars 188 for 100 capsules, an extremely high price for a new drug, and one that would yield large profits (this translated to about dollars 10,000 a year for every user). By November 1989 the share price had almost doubled to 724p; year-on-year pre-tax profits were up 28 per cent to pounds 283m. In early 1993, the share price was at 810p; last year’s pre-tax profits were pounds 505m.
‘In terms of the emotive quality of the demand, there’s never been a drug like it,’ said Martin Sherwood, a Wellcome spokesman, shortly after AZT’s launch. It was just this emotive demand that led to the picketing of the Wellcome shareholders’ meeting in January 1990. Act Up (the Aids Coalition to Unleash Power, co-founded by the playwright Larry Kramer) picketed the AGM at Grosvenor House in London, describing it as ‘a gathering of Aids profiteers’. Activists complained about the price of AZT, and what they saw as Wellcome’s reluctance to provide all available information on the drug.
Wellcome shareholders were irritated by this intrusion, not least when Act Up members interrupted the meeting and insisted on talking to Sir Alfred Shepperd, the outgoing chairman. But Wellcome executives were baffled: they believed they had done everything they could to benefit people with HIV and Aids, certainly more than any other pharmaceuticals company. Was it not these very same activists who had celebrated when AZT was launched three years earlier? At first Wellcome defended its pricing on the grounds that AZT took dollars 80m to develop and produce (later revised to dollars 30m), but it soon bowed to pressure (and its economies of scale) and cut the price. The recommended dosage was also reduced for medical reasons, which meant many more people could tolerate its toxicity. Today AZT costs about dollars 3,000 per person per year, or about pounds 2,000.
As would be expected, Wellcome plays up the good news. When, in 1989, two double-blind placebo trials of the effects of AZT on asymptomatic and less seriously ill patients showed that it could delay the progression of the disease, much was made of the results and the share price rose by 30p. But when, four months later, the company admitted that AZT had caused cancer in rodents, it explained that the rats and mice were given 10 times the dose prescribed to humans, and that several other drugs in use by humans had also produced tumours in animals when administered over long periods. Wellcome’s share price went down one penny.
Wellcome’s PR machine is an impressive force, and much money is spent on convincing the media of AZT’s worth. You go and see them and you get a lot of bumph: how AZT works, why it is more effective than other anti-retrovirals. Wellcome house-magazines talk of the extra 400,000 productive years of life it has made possible through the drug, about how many thorough and independent studies have stressed AZT’s efficacy.
‘The number of people who have shown aggression against us concerns us no end,’ says Trevor Jones. ‘Normally the company tries to distance itself from the patient / physician interaction – it must do. The day-to-day therapy of the patient is not our responsibility. But about three years ago we started to open our labs to people with HIV and their carers, contrary to the advice of my security and other colleagues. You then realise the uncertainty and the frustrations involved in that act of taking a tablet for the very first time. When people with HIV came through the door of the lab I could almost touch their anger. But I realised that the anger was not really about Wellcome or me, but about their mortality. They were frustrated, and saying, ‘Please, please what can I do?’ These were genuine cris de coeur.’
Dr Jones is one of the few pharmaceutical industry representatives on Britain’s Medicines Control Agency. Wellcome has clearly selected its spokesman with care. ‘People say we’re purely acting out of commercial interests, but it is not in our commercial interests to do anything else but get this drug right,’ Dr Jones says. ‘We wanted to show people that we are working night and day, weekdays and weekends trying to develop better medicines. Otherwise we look like ogres and robber barons all the time. That’s the whole history of our business; if you’ve got a problem with a product, you must, you must, you must tell people. The criticism hurts a lot; our integrity as a scientific body is important to us. I don’t take too kindly to people saying, ‘Oh, you don’t want to listen to Wellcome, because they would say that, wouldn’t they?’ You can’t hide anything in this business, because otherwise who will trust us when we develop another drug, like the new epilepsy drug we’ve got now? You have to believe that the integrity of science is good.’
Jones has had a bad few weeks. Wellcome’s share price was hammered by last month’s Concorde trial report, falling 10 per cent to 670p, before rallying to 692p. Five days after the report appeared, Wellcome staged a damage limitation exercise, at which Jones told a press conference that he was unhappy with the way the results were released, without peer review or advice to patients, and saying it had caused panic among those with HIV. He said that the full results had yet to be released, and hoped that a more beneficial picture of early intervention with AZT would emerge at the ninth International Aids Conference in Berlin in June. He also noted that the protocol of the study had changed from that agreed in 1988. When an American study reported in 1989 that AZT did have beneficial effects on people with asymptomatic HIV, the Concorde officials decided that people on its trial could switch to AZT if they wanted to; this may have led to a diluting of the results.
Last week, Jones reiterated why AZT may still be beneficial, and why doctors should continue to prescribe the drug early. ‘We have gathered together 10 studies on asymptomatic patients. Five of these are control studies with placebos, and five are cohort studies, in which we simply give the drug and observe what happens. These studies involved more than 6,500 patients and ranged from one to four years in duration. We believe we have accrued sufficient data to show that taking the drug when you’re asymptomatic does delay the onset of further symptoms.’
WELLCOME has a presence at all the chief Aids conferences, and will occasionally organise gatherings of its own. In June 1992 it launched Positive Action, ‘an international initiative’ in support of those with HIV and Aids. For the launch conference in London, journalists flew in from all over Europe to hear Wellcome executives describe how pounds 1m was being distributed to many educational organisations. An emotional climax of sorts was provided by Jerry Breitman, the company’s US director of professional relations. He was there to present the ‘workplace initiative’, and his speech contained a little surprise at the end. Like the wig salesman whose coup de grace is to rip off his own toupee, Breitman declared himself HIV-positive. ‘I thought long and hard before deciding to tell my management,’ he revealed. ‘But . . . when you are part of an enlightened organisation such as Wellcome, I am absolutely convinced that communicating your HIV infection is a positive action . . . It is, truly, one of the best decisions I have made in a very long time.’ A few journalists felt distinctly queasy at the theatricality of it all.
One of the initiatives raised was Wellcome’s involvement with the Terrence Higgins Trust. This first surfaced in 1991, with the publication of four information leaflets. Two months ago staff at the Trust and volunteers read in their newsletter that the link had been strengthened. The newsletter explained that ‘THT, along with the Wellcome Foundation, is about to begin producing an important new medical information series. THT are providing a series of medical updates for all staff and volunteers. We will be providing them on a regular basis every two months in the evening. Costs will be met by the Wellcome Foundation, which also funds our series of general booklets.’
Nick Partridge, chief executive of the trust, is dubbed ‘Nick the Sick’ on the placards carried by the protesters outside his office. Partridge, in reply, calls them ‘New Age flat-earthers who have a naive hope that Holland & Barrett will produce a herbal tea that will be effective against HIV.’ Partridge said that the trust actively pursued funding from a wide range of companies and government agencies, and that it was ‘quite clear that none of that funding involves an ability by those companies to influence the information we produce. We would be neglecting our duty if we were not in regular contact with Wellcome, Bristol-Myers Squibb and Roche, arguing for greater investment in HIV research and fair and balanced information. The leaflets are not about treatment issues.’
But once they were. In 1991 the trust produced a 24-page booklet on HIV and its treatment; nine pages were devoted to AZT, but only half a page was given to other therapies. The copyright on the leaflet was held by the Wellcome Foundation, which also paid for its printing. ‘It was only available for eight months,’ Partridge says. ‘Information changes quite rapidly. The main fault of that leaflet is that it is too hopeful. By 1991 the hopes around early intervention had probably gone further than we realise, in retrospect, was wise. The desire by many people with HIV to say, ‘Yes, we can live with this infection’ meant that a lot of hope was invested in the theory of early intervention. For all its faults, our leaflet was still a lot more realistic than the material that Wellcome was putting out on its own. Remember that over the years, there have been many stories of breakthroughs that proved to be wildly optimistic.’
FOR MOST people with HIV, the AZT dream is over. AZT is the future that was; no one believes in the ‘magic bullet’ any more. It does have benefits for some patients who are seriously ill, but there is now severe doubt over its other uses. This, after the drug has been subjected to more tests, and has been the subject of more post-launch research papers, than perhaps any other modern therapy.
The future for HIV and Aids treatment appears to be in combination treatment – the use of AZT and DDC and DDI and many other compounds used in all manner of variations. Several trials are in progress. Two weeks ago it was announced that Wellcome has joined forces with its competitors Hoffmann-La Roche, Bristol-Myers Squibb, Glaxo, SmithKline Beecham and 15 other companies, in an attempt to pool their research knowledge and find an effective treatment.
Wellcome is also developing some other anti-Aids drugs on its own. We won’t hear about these for a while; the company doesn’t want to raise any hopes.
Jerome Horwitz, the man who created AZT in 1964, is still active in medical research. He’s 74 now, but you can still reach him most days at the Meyer L Prentis Cancer Center in Detroit. Occasionally he does a little Aids work, but most of his time involves cancer chemotherapy.
Horwitz believes AZT is not the answer to HIV and Aids, but has hopes for combination therapies (he was also the first to synthesise DDC). He concludes that AZT ‘buys time’.
‘We were certainly on the cutting edge,’ he says of his work in the mid-Sixties. ‘When the pharmacologist said, ‘Look, Dr Horwitz, your compounds are not effective against leukaemias and I see no future for them’, that was like a blow to the solar plexus. We had great hopes. ‘I remember one of my students saying at the time that we had a great series of compounds just waiting for a disease to treat. It took 25 years before our beliefs were vindicated.’
The first Horwitz heard of AZT’s use against HIV was when he read about it in the Wall Street Journal. Burroughs Wellcome established a chair in his name at the Michigan Cancer Foundation, but he has received no financial reward.
‘My wife sits across from me at the breakfast table and reminds me of all the money that Burroughs Wellcome has got out of it and I haven’t got a dime. I keep telling her about the legacy I’m leaving. But I wouldn’t be being absolutely straight with you if I hadn’t thought that I should have gotten something out of it.’