
A History Of The Bible: Who Wrote It- And When?

The origins of the Bible are still cloaked in mystery. When was it written? Who wrote it? And how reliable is it as an historical record? BBC History Revealed magazine charts the evolution of arguably the most influential book of all time

In 2007, Time magazine asserted that the Bible “has done more to shape literature, history, entertainment and culture than any book ever written”.

It’s a bold claim, but one that’s hard to refute. What other book resides on bedside tables in countless hotel rooms across the globe? What other book has bequeathed the world such instantly recognisable catchphrases as “an eye for an eye”, “thou shalt not kill” and “eat, drink and be merry”?

Factor in the number of copies that have been sold down the centuries – somewhere in the region of five billion to date, swollen by a further 100 million every year given away for free – and there’s no denying that the Bible’s influence on Western civilisation has been monumental.

But if the Bible’s standing as a cultural behemoth is beyond doubt, its history is anything but. For centuries, some of the world’s greatest thinkers have puzzled over the origins and evolution of this remarkable document. Who wrote it? When? And why?

These are the thorniest of questions, made all the more tangled by the Bible’s great age, and the fact that some, or all of it, has become a sacred text for members of two of the world’s great religions – Judaism and Christianity – numbering more than two billion people.

An illumination from a Byzantine manuscript depicting Jesus Christ. (Photo by Werner Forman/Universal Images Group/Getty Images)

Where does the Bible originate?

Archaeology and the study of written sources have shed light on the history of both halves of the Bible: the Old Testament, the story of the Jews’ highs and lows in the millennium or so before the birth of Jesus; and the New Testament, which documents the life and teachings of Jesus. These findings may be incomplete and they may be highly contested, but they have helped historians paint a picture of how the Bible came to life.

Perhaps the best place to start the story is in sun-baked northern Egypt, for it was here that the Bible and archaeology may, just may, first collide.

For centuries, the Old Testament has been widely interpreted as a story of disaster and rescue – of the Israelites falling from grace before picking themselves up, dusting themselves down and finding redemption. Nowhere is this theme more evident than in Exodus, the dramatic second book of the Old Testament, which chronicles the Israelites’ escape from captivity in Egypt to the promised land.

But has archaeology unearthed one of the sites of the Israelites’ captivity?

That’s the question that some historians have been asking themselves since the 1960s, when the Austrian archaeologist Manfred Bietak identified the location of the ancient city of Pi-Ramesses at the site of the modern town of Qantir in Egypt’s Nile Delta. Pi-Ramesses was the great capital built by Ramesses II, one of Egypt’s most formidable pharaohs and the biblical tormentor of the Israelites. It’s been argued that Pi-Ramesses was the biblical city of Ramesses, and that the city was built, as Exodus claims, by Jewish slaves.

It’s an intriguing theory, and one that certainly has its doubters. But if it were true, it would place the enslaved Israelites in the Nile Delta in the decades after 1279 BC, when Ramesses II became king. So what happened next?

The Bible is in little doubt. It tells us that Moses led the Israelites out of their captivity in Egypt (whose population had been laid low by ten plagues inflicted on them by God) before Joshua spearheaded a brilliant invasion of Canaan, the promised land. The historical sources, however, are far less forthcoming. As John Barton, former professor of the interpretation of holy scriptures at the University of Oxford, puts it: “There is no evidence of a great invasion by the Israelites under Joshua; the population doesn’t seem to have changed much in that period as far as we can tell by archaeological surveys.”

St Catherine’s Monastery in the shadow of Mount Sinai, where the Codex Sinaiticus came to scholars’ attention. (Image by RF CREATIVE/Getty Images)

In fact, the best corroborating evidence for the Bible’s claim that the Israelites surged into Canaan is Merneptah’s Stele.

What is Merneptah’s Stele?

Like all good autocrats, Merneptah, pharaoh of Egypt, loved to brag about his achievements. And when he led his armies on a successful war of conquest at the end of the 13th century BC, he wanted the world, and successive generations, to know all about it.

The medium on which the pharaoh chose to trumpet his martial prowess was a three-metre-high lump of carved granite, now known as the Merneptah Stele. The stele, which was discovered at the site of the ancient Egyptian city of Thebes in 1896, contains 28 lines of text, mostly detailing the Egyptians’ victory over the Libyans and their allies. But it is the final three lines of the inscription that have arguably excited most interest among historians.

“Israel has been shorn,” it declares. “Its seed no longer exists.” These few words constitute the first known written reference to the Israelites. It’s an inauspicious start, one that boasts of this people’s near destruction at the hands of one of the ancient world’s superpowers in their homeland of Canaan. But the Israelites would survive.

A replica of Merneptah’s Stele, now housed in the Egyptian Museum in Cairo. The last three of the 28 lines deal with a separate campaign in Canaan, then part of Egypt’s imperial possessions. (Photo by Universal History Archive/Universal Images Group via Getty Images)

And the story they would go on to tell about themselves and their relationship with their God would arguably eclipse any of Merneptah’s achievements. It would spawn what is surely the most influential book of all time: the Bible.

Merneptah’s Stele may describe more Jewish pain at the hands of their perennial Egyptian persecutors, but it at least suggests that they may have been in Canaan during Merneptah’s reign (1213–1203 BC).

If the early history of the Israelites is uncertain, so is the evolution of the book that would tell their story.

Who wrote the Bible?

Until the 17th century, received opinion had it that the first five books of the Bible – Genesis, Exodus, Leviticus, Numbers and Deuteronomy – were the work of one author: Moses. That theory has since been seriously challenged.

Scholars now believe that the stories that would become the Bible were disseminated by word of mouth across the centuries, in the form of oral tales and poetry – perhaps as a means of forging a collective identity among the tribes of Israel. Eventually, these stories were collated and written down. The question is by whom, and when?

A clue may lie in a limestone boulder discovered embedded in a stone wall in the town of Tel Zayit, 35 miles southwest of Jerusalem, in 2005. The boulder, now known as the Zayit Stone, contains what many historians believe to be the earliest full Hebrew alphabet ever discovered, dating to around 1000 BC. “What was found was not a random scratching of two or three letters, it was the full alphabet,” Kyle McCarter of Johns Hopkins University in Maryland has said of the stone. “Everything about it says this is the ancestor of the Hebrew script.”

The Zayit Stone does not in itself tell us when the Bible was written and collated, but it gives us our first glimpse of the language that produced it. And, by tracking the stylistic development of that language down the centuries, and cross-referencing it with biblical text, historians have been able to rule out the single-author hypothesis, concluding instead that the Bible was written by waves of scribes during the first millennium BC.

Ask the expert: John Barton

John Barton is a former professor of holy scriptures at the University of Oxford and the author of A History of the Bible: The Book and Its Faiths.

Q: Just how reliable is the Old Testament as an historical document?

A: Some parts, such as the early chapters of Genesis, are myth or legend, rather than history. But parts of Samuel, Kings, Ezra and Nehemiah describe events broadly known also from Assyrian or Persian sources. For example, Jehu, king of Israel in the ninth century BC, appears on an Assyrian monument, the Black Obelisk, doing obeisance to the Assyrian king. From about the eighth century BC onwards, the Old Testament contains some real historiography, even though it may not all be accurate.

Q: Does it matter if it’s not historically accurate? Are we guilty of placing too much emphasis on this question?

A: I think we are. Much of the Old Testament is about seeing God at work in human history rather than in accurately recording the detail, and sometimes we exaggerate the importance of historical accuracy. The Old Testament is not a work of fiction, but nor is it a modern piece of history-writing.

Q: How much does archaeology support the historicity of the Old Testament?

A: To a limited extent. It gives us a context within which the Old Testament makes sense, but it doesn’t confirm a lot of the details. It mustn’t be forgotten that archaeology has also yielded vast numbers of documents from the ancient Near East, such as Assyrian and Babylonian annals, which illuminate the Old Testament world.

Q: How much do we know about the scribes who wrote the Old Testament?

A: The scribes are never described in detail in the Old Testament itself, but analogies with Egypt and Mesopotamia make it clear that there must have been a scribal class, probably attached as civil servants to the temple in Jerusalem or the royal court. After the exile of the Jewish people in Babylon in the sixth century BC, scribes gradually turned into religious teachers, as we find them in the New Testament.

Q: When was the Old Testament assembled into the book it is today?

A: Probably during the first century BC, though parts of it were certainly regarded as holy scripture much earlier than that. But the collection is a work of early Judaism. It should be remembered that for a long time it was a collection of individual scrolls, not a single book between two covers.

Q: Did the Old Testament anticipate the figure of Jesus Christ?

A: There are prophecies of a coming Messiah – which means ‘anointed one’ – occasionally in the Old Testament, and Christians claimed them as foretelling Jesus. But messianic hopes were not widespread or massively important in first-century Judaism and are even less central to the Old Testament itself. Christians discovered texts they saw as messianic prophecies – for example, in Isaiah 7 – though other Jews did not read them that way.

Q: Why did the New Testament gain so much traction in the first centuries AD?

A: The New Testament was accepted because it was part of the package of the Christian message, which was massively successful in the early centuries. The message, which was that all humankind was accepted through Jesus by the God worshipped by the Jews, proved a winner.

Who was King David?

The first wave of scribes may, it’s been suggested, have started work during the reign of King David (c1000 BC). Whether that’s true or not, David is a monumental figure in the biblical story – the slayer of Goliath, the conqueror of Jerusalem. David is also a hugely important figure in the quest to establish links between the Bible and historical fact, for he appears to be the earliest biblical figure to be confirmed by archaeology.

“I killed [the] king of the house of David.” So boasts the Tel Dan Stele, an inscribed stone dating from 870–750 BC and discovered in northern Israel in the 1990s. Like the Merneptah Stele before it, it documents a warlord’s victory over the Israelites (the man doing the gloating was probably the local ruler Hazael of Aram-Damascus). But it at least indicates that David was a historical figure.

The Tel Dan Stele also suggests that, no matter how capable their rulers, the people of Israel continued to be menaced by powerful, belligerent neighbours. And, in 586 BC, one of those neighbours, the Babylonians, would inflict on the Jews one of the most devastating defeats in their history: ransacking the sacred city of Jerusalem, butchering its residents, and dragging many more back to Babylonia.

For the people of Israel, the fall of Jerusalem was a searing experience. It created, in the words of Eric M Meyers, a biblical scholar at Duke University in North Carolina, “one of the most significant theological crises in the history of the Jewish people”. And, according to many scholars, that crisis may have had a transformative impact on the writing of the Bible.

The Old Testament is far more than a formulaic story of a nation’s evolution; it’s also a chronicle of that nation’s relationship with its God. Did the sack of Jerusalem in 586 BC convince a new wave of Jewish thinkers that they hadn’t been keeping their side of the bargain? Did it spur them into revisiting all previous editions of the Jewish scriptures in order to sharpen the emphasis on the agreement or ‘covenant’ between the people and their one God?

Whether this theory holds or not, there’s little doubt that by the time they returned from their Babylonian exile, the Bible occupied a unique place in the consciousness of the Jewish people. However, it would be centuries before the book would be revered as a sacred text by non-Jews. And the reason for that transformation from national to international significance was, of course, the figure of Jesus Christ. It’s the so-called New Testament, the account of Jesus’s life and teachings, that turned the Hebrew Bible into a civilisation-shaping, global icon.

Who was Jesus? Did he really exist?

Most scholars agree that Jesus, a first-century religious leader and preacher, existed historically. He was born in c4 BC and died – reportedly crucified on the orders of the Roman prefect Pontius Pilate – in cAD 30–33. Then, for around 40 years, news of his teachings was spread by word of mouth until, from around AD 70, four written accounts of his life emerged that changed everything.

The gospels, or ‘good news’, of Matthew, Mark, Luke and John are critically important to the Christian faith. It is their descriptions of the life of Jesus Christ that have made him arguably the most influential figure in human history.

“We can’t be sure when the gospels were written,” says Barton, “and we know little about the authors. But the guess is that Mark came first, in the 70s, followed by Matthew and Luke in the 80s and 90s, and John in the 90s or early in the second century.

“In general, Matthew, Mark and Luke tell the same story with variations, and hence are called the ‘synoptic’ gospels, whereas John has a very different style, as well as telling a markedly different version of the story of Jesus. Matthew and Luke seem to be attempts to improve on Mark, by adding more stories and sayings from sources now lost. John is a different conceptualisation of the story of Jesus, portraying a more obviously divine figure.”

Though the variations in the four gospels may have proved a source of frustration to those trying to paint a definitive picture of Jesus’s life and teachings, they offer a fascinating insight into the challenges facing the early Christian church as it spread around the Mediterranean world in the first and second centuries AD.

Mark, it’s been argued, wrote for a community deeply affected by the failure of a Jewish revolt against the Roman empire in the AD 60s, while Luke wrote for a predominately Gentile (non-Jewish) audience eager to demonstrate that Christian beliefs could flourish within the Roman empire. Both John and Matthew hint at the growing tensions between Jewish Christians and the Jewish religious authorities.

As a Jew, Jesus would have been well-versed in the Hebrew Bible and, according to the gospels, saw himself as the realisation of ancient Jewish prophecies. “Don’t think that I came to destroy the law, or the prophets,” Matthew reports him saying. “I didn’t come to destroy, but to fulfil.” But for all that, by the time the gospels were written, schisms between Judaism and nascent Christianity were clearly emerging.

How did Christianity spread around the world?

The Epistles, or letters, written by Paul the Apostle to churches dotted across the Mediterranean world – which are our best source for the initial spread of Christianity – confirm that Christianity started in Jerusalem, but spread rapidly to Syria and then to the rest of the Mediterranean world, and was mostly accepted by non-Jews, says John Barton.

“The epistles [which make up 13 books of the New Testament] are our earliest evidence for Christianity,” says Barton. “The first date from the AD 50s, just two decades after the death of Jesus.”

As Paul’s letters to churches such as the one in the Greek city of Thessalonica reveal, the first Christian communities were often persecuted for their beliefs.

And it’s such persecution, particularly at the hands of the Romans, that may have inspired the last book of the New Testament, Revelation. With its dark descriptions of a seven-headed beast and allusions to an imminent apocalypse, Revelation is now widely believed to be a foretelling of the grisly fate that the author believed awaited the Roman oppressors of Christianity.

Despite that oppression, by the fourth century Christianity had become the dominant religion in the Mediterranean world, with the New Testament widely revered as a sacred text inspired by God. “It was around this time,” says Barton, “that the 27 books of the New Testament were copied into single books as though they formed a single work.” One example is the Codex Sinaiticus, now in the British Library. “The first person to list exactly the books we now have as the New Testament is the fourth-century bishop Athanasius of Alexandria, but it’s clear that he was only reporting what was already widely accepted.”

By the early fifth century, a series of councils across the Christian world had effectively rubber-stamped the New Testament that we know today: the Bible’s journey to being the most influential book in human history was well and truly under way.

Versions of the Bible

Different editions of the Bible have appeared over the centuries, aiming to further popularise the stories and teachings within. Here are three of the most notable versions…

King James Bible

On 24 March 1603, King James VI of Scotland was also crowned King James I of England and Ireland. His reign would usher in a new royal dynasty (the Stuarts) and a new era of colonialism (most especially in North America). But arguably every bit as significant was his decision to commission a new Bible, published in 1611.

The ‘King James Version’ (KJV) wasn’t the first to be printed in English – Henry VIII had authorised the ‘Great Bible’ in 1539 and the Bishops’ Bible had been printed during the reign of Elizabeth I in 1568 – but, in terms of impact, the KJV would dwarf its predecessors.

Shortly after his coronation, James was told that existing translations of the Bible were “corrupt and not answerable to the truth of the original”. What his scholars produced was a book designed to be read aloud in church – fast-paced, easy to understand, a masterclass in storytelling.

No other version would challenge its dominance in the English-speaking world until the mid-20th century. According to historian Adam Nicolson, the King James Bible’s “particular combination of majesty and freedom, of clarity and richness, was for centuries held, particularly by the Victorians, to be the defining terms of our national identity”.

The Gutenberg Bible

In 1454, in the Rhineland town of Mainz, three friends – inventor Johannes Gutenberg, printer Peter Schöffer and financier Johann Fust – pooled resources and brainpower to come up with what the British Library describes as “probably the most famous Bible in the world”.

The Gutenberg Bible, as the three friends’ creation would come to be known, signalled a step-change in printing techniques. Whereas earlier Bibles were produced by printing presses that employed woodblock technology, the press that churned out the Gutenberg Bible used moveable metal type, allowing more flexible, efficient and cheap printing.

Gutenberg’s Bible also had massive cultural and theological ramifications. Faster, cheaper printing meant more books and more readers – and that brought with it greater criticism, interpretation, debate and, ultimately, revolution. In short, the Gutenberg Bible was a significant step on the road to the Protestant Reformation and ultimately the Enlightenment.

In the words of Professor Justin Champion of Royal Holloway, University of London: “The printed Bible in the hands of the public posed a fundamental challenge to papal dominion. Once released from Latin into the vernacular, the word of God became a weapon.”

Dead Sea Scrolls

Sometime between November 1946 and February 1947, a Bedouin shepherd threw a stone into a cave at Wadi Qumran, near the Dead Sea. When he heard something crack he headed inside to investigate. What he found has been described by the Smithsonian Institution as “the most important religious texts in the Western world”.

What the shepherd had chanced upon were the Dead Sea Scrolls, more than 800 documents of animal skin and papyrus, stored in clay jars for safe keeping. Among the texts are fragments of every book of the Old Testament, except the Book of Esther, along with a collection of previously unknown hymns and a copy of the Ten Commandments.

But what really makes the scrolls special is their age. They were written between around 200 BC and the middle decades of the first century AD, which means they predate by at least eight centuries the oldest previously known Hebrew text of the Old Testament.

Were the scrolls left in the caves by a Jewish community living near the Dead Sea or, perhaps, by Jews fleeing Roman troops in the first century AD? We may never know for sure.


Both Men And Women Have Worn High Heels Throughout History

Could you imagine being a soldier riding a horse into battle wearing a pair of stilettos? As crazy as that may sound, it’s nearly historically accurate – except stilettos weren’t invented for another 1,000 years. 

Though they are more commonly worn by women today, high heels were originally made for men. High heels have enjoyed a largely unisex appreciation spanning many centuries and only became female coded in the last 300 years. Throughout history, the public opinion on how these fashionable yet painful shoes should look and feel and who should wear them has vacillated. 

High heels are an evocative symbol of power today. While that association has remained consistent since their early days, heels have also represented much more: independence, social standing, self-importance, masculinity, and strength. Heel wearers were lauded for their fashion sense and despised for their perceived arrogance. 

As frivolous as dress shoes might seem, the origin of high heels is a microcosm of Western gender relations throughout the last millennium.

900s: High Heels Are Used In Horseback-Riding Cultures To Keep Feet In Stirrups

Photo: David Roberts/Wikimedia Commons/Public Domain

The first known high heel was worn by Persian men in the 10th century. The shoes were neither decorative nor stylish, but they served a practical purpose: gripping the stirrups as the men rode their horses.

This provided better control and the ability to ride closer to the horse. Heels were especially useful during wartime as the added control allowed the rider to remain steady on the horse and keep his hands free to access and deploy his weaponry. 

1500s: High Heels Are Worn By Courtesans

Photo: Maurice Quentin de La Tour/Wikimedia Commons/Public Domain

High heels were especially popular among one specific group throughout the 16th century: courtesans. The highest caste of harlots, courtesans were the predecessor to the high-end escort. 

They enjoyed privileges that were not available to most other women, let alone to other sex workers. They were allowed to enter libraries and keep company with high-ranking men. They were known to smoke, drink, and wear high heels to appear “elevated” above other women – and also because the men enjoyed what they saw. 

They were the women who commonly wore the dramatically high heel, often using male servants and noblemen as a human crutch. 

1500s: Aristocratic Women Wear Heels To Indicate Status

Reconstruction of a Venetian chopine, after models dating from 1500 to 1600. On display at the Shoe Museum in Lausanne.

The chopine platform shoe popular with women throughout the 16th century took heel wearers to unbelievable heights, with some shoes clocking in at over 22 inches tall. To keep from falling over, the aristocratic women would often use their maids as a crutch.

This was, understandably, a tremendous public health hazard. Many thought the bodily damage and potential miscarriages were worth it; and while no one could see the shoes themselves, the real marvel was in the dazzling length of skirt they required. The long skirts were meant to display wealth, as onlookers were scandalized by how much the extra fabric must have cost. 

Despite its finer shape, the high heel that followed the chopine was actually more balanced. 

1600s: Persian Migrants Bring Heels To Europe, Where Men Wear Them To Appear More Formidable

Photo:  Ninara/Wikimedia Commons/CC BY-SA 2.0

At the turn of the 17th century, Persian Shah Abbas I sojourned to Europe to seek diplomatic assistance in defeating the Ottoman Empire. Abbas and his entourage visited Russia, Germany, and Spain, resulting in a boom of interest in Persian goods and aesthetics. Aristocratic men quickly adopted the high-heeled shoe, valuing its projection of virile masculinity.

Whereas Persian soldiers and noblemen wore heels out of necessity, European men initially donned them for their formidable appearance. 

1700s: King Louis XIV Introduces High Heels With Red Soles To The French Court

Photo: Hyacinthe Rigaud/Wikimedia Commons/Public Domain

The first half of the 18th century was the peak of the men-in-heels movement. Louis XIV of France, also known as the Sun King, reigned over France for 72 years. Standing at 5’4”, King Louis was rather arrogant, to say the least. The Sun King moniker came from his deeply held belief that he was the center of the universe, and that France revolved around him.

Since Louis was a man of below-average height who fancied himself monumentally important, the high heel suited him perfectly. He was known for his emblematic red-soled heels, predating Christian Louboutin’s red-bottom heels by 200 years. 

Plebeians were allowed to emulate him by wearing high heels, but only those in his court were permitted to wear the red soles. Doing so without prior authorization was grounds for punishment and being thrown out of court.

1700s: Men’s Heels Become More Broad And Sturdy, While Women’s Become More Decorative

Photo: Jean Francois de Troy/Wikimedia Commons/Public Domain

After a few centuries of a uniformly chunky heel, the design split into distinct gendered categories. Returning to the shoe’s original practical purpose, the men’s heel became broader and thicker. In contrast, the women’s heel became more tapered and served a decorative role. This shift would signal the impending end of society’s acceptance of the unisex high heel.

Mid-1700s: Heels Are Perceived As ‘Frivolous’ And Are Worn Exclusively By Women

Photo: LACMA/Wikimedia Commons/Public Domain

Despite the emergent popularity of high heels at the turn of the 18th century, they were quickly dismissed as frivolous women’s footwear. The design continued to skew more and more dainty and tapered over time, more closely resembling the thin heeled shoe that is worn today.

The purpose of wearing the high heel was not to emphasize the shape of the wearer’s legs but rather the smallness of her feet. Skirts were still too long to show the contour of the foot, and small feet were a desirable feminine trait at the time. 

Late 1700s: Heels Go Out Of Style Following The French Revolution

Photo: Unknown/Wikimedia Commons/Public Domain

Soon after the gendered divide that diminished the widespread appeal of the high heel, one historical event in 1789 ended public interest in the shoes entirely. 

The French Revolution was a people’s movement that sought to do away with the aristocracy. Garb that emphasized social status became undesirable, and high heels became passé and undemocratic. The shoes were already widely dismissed for their “irrationality and superficiality,” so they were not missed. 

Mid-1800s: Photography And Adult Entertainment Reintroduce High-Heeled Fashion

Photo: French Walery/Wikimedia Commons/Public Domain

Although heels were considered irrational and superficial at the end of the 18th century, they found popularity elsewhere as men began to enjoy the effect a heel had on the shape of a woman’s posterior. 

The advent of photography in the 19th century was monumental for countless reasons, but it also ushered in the renaissance of the high-heeled shoe. Adult postcards featuring women in heels became very popular in France, and the rest of Europe and America soon followed. 

Effectively, the invention of print adult entertainment was directly responsible for the high heel comeback, cementing society’s correlation between sexuality and high heels. 

1940s: Pin-Up Girl Posters Correlate High Heels With Female Sexuality

Photo: Alfred T. Palmer/Flickr/Public Domain

The progression of adult postcards led to the widely popular pin-up genre. Exceptionally tall, thin, and with sharp heels, these provocative shoes were allowed to be more visually intriguing and less structurally sound as the models only needed to pose in them for a few minutes at a time.

The pin-ups were especially popular in the men’s barracks throughout WWII, which inadvertently caused an innovative shake up that would change the shape of the heel industry forever. 

1945: The Stiletto Is Invented And Becomes A Women’s Fashion Staple Following WWII

Photo: Arroser/Wikimedia Commons/CC BY-SA 3.0

In the 1950s, high heel technology allowed for a new type of heel that was thinner, sharper, and more chic than ever before. Up until this point, heels were typically made out of wood, so they could only be carved as thin as was structurally sound. Once shoemakers began using steel to create the structure of the heel, they could be much thinner and still safely support the wearer’s weight, and thus the stiletto heel was born. 

The stiletto heel was built around a small piece of metal attached inside the shoe, allowing the heel to pivot separately from the sole and giving the wearer flexibility. This interior steel piece is known as the “shank,” and it is placed at the back of the heel, rather than in the middle as in the original style of heel. 

Late 1900s: Heels Remain Popular, But Designs Become More Casual And Comfortable

Photo: Pudsly/Wikimedia Commons/CC BY 3.0

2010s: As Drag Culture Goes Mainstream, Men Are Wearing Heels Once Again

Photo: DVSROSS/Flickr/CC BY 2.0

While performative cross-dressing has existed for centuries, our current understanding of drag culture quietly progressed in the shadows throughout the 20th century, then surged wildly in popularity in the early 21st.

This newfound acceptance of female impersonation and gender fluid performers has made it fashionable once more for men to wear high heels, and this effect has spread beyond the realm of drag performance. This is not to say that the average guy on the street is wearing eight-inch stilettos, but tolerance for gender experimentation has blurred the lines of what clothing is “female” or “male.”

Time Immemorial: Stories Like ‘Cinderella’ Reinforce The Notion Of Heels As A Status Symbol

Photo: Oliver Herford/Wikimedia Commons/Public Domain

The status and desirability of heels have been governed primarily by class signaling and sociopolitical events. While many intricate and discrete factors have been at work throughout history, these dynamics can often be best understood through allegorical stories about coveting the high heel.

One example, and possibly the oldest, is Cinderella. Stories using the “slipper test” plot device have been told since the first century in Egypt, and many cultures have their own retelling of this type of story. In the present-day Cinderella, the glass slipper represents access to a higher social class. Without it, she appears plain and dowdy, a mere laborer. Due to her beauty and virtuous nature, however, she is rewarded as being innately worthy of wearing the shoe.


Jack Russell – The Man And The Dog

Amongst Dartmouth’s many famous historical figures is one Reverend John Russell, who ‘created’ the Jack Russell hunting dog. 

John Russell, also known as Jack, was born on 21 December 1795 in Dartmouth, the eldest son of John Russell and his wife Nora Jewell. He lived at Sandhill House.

He was educated at Plympton Grammar School, Blundell’s School in Tiverton and Exeter College, Oxford. Following his university days in Oxford, he returned to the county to work as a Reverend in North Devon.

John came from a hunting family and wanted to find a hard working breed of terrier which could flush out a fox. He was adamant his terriers should not maim or kill the fox. Instead, he wanted them to nip and worry a fox to the point that it would bolt from its den and take its chances above ground.

In 1819, while studying at Oxford, legend has it he spotted a little white terrier with dark tan spots over her eyes, ears and at the tip of her tail, owned by a local milkman. Russell bought the dog on the spot and ‘Trump’ became the first of a line of fox hunting terriers that became known as Jack Russell Terriers.

Russell crossed Trump with a Devon hunt terrier to create the famous Jack Russell breed. She formed the basis of his breeding programme, and by the 1850s the dogs were recognised as a distinct type of Fox Terrier. Their short, strong legs made the dogs well suited to digging out foxes.

The Reverend was a founding member of The Kennel Club. He helped to write the breed standard for the Fox Terrier (smooth) and became a respected judge.

Reverend Russell was vicar of St James Church, Swimbridge, near Barnstaple, for 40 years from 1832. It’s said his sermons were very brief by Victorian standards because his hunting horse was usually saddled up and waiting for him in the churchyard.

In 1836 he married Penelope Incledon-Bury, daughter and co-heiress of Vice Admiral Richard Incledon-Bury, of the Royal Navy and Lord of the Manor of Colleton, Chulmleigh. Russell is said to have had expensive sporting habits, both on and off the hunting field which drained the substantial resources of his heiress wife and left the estate of Colleton in poor condition.

Russell died in 1883 and his body is buried in the churchyard at Swimbridge. The village pub was renamed the ‘Jack Russell Inn’ in his honour. The pub sign is a reproduction of a painting of Trump, which was commissioned by the then Prince of Wales, later King Edward VII. The original still hangs at Sandringham.

The Jack Russell Terrier Club of Great Britain was established in 1974 as the parent club for the Jack Russell Terrier. The Parson Jack Russell Terrier was recognised in 1990 as a variant of the Fox Terrier. Though these different builds are basically variants of the same breed, with the same temperament and behaviour, most of the world now recognises them as separate breeds. And whether just Jack or Parson Jack, they’re both named after Dartmouth’s Reverend John Russell for sure.

History of the Jack Russell Terrier

Jack Russell Terriers are a type, or strain, of working terrier; they are not pure-bred in the sense that they have a broad genetic make-up, a broad standard, and do not breed true to type. This is a result of having been bred strictly for hunting since their beginnings in the early 1800s, and of their preservation as a working breed since. The broad standard, the varied genetic background built on years of restricted inbreeding and wide outcrossing, and the great variety of size and type are the major characteristics that make this strain of terrier, known as the Jack Russell, such a unique, versatile working terrier.

The Jack Russell Terrier takes its name from the Reverend John Russell, who bred one of the finest strains of terriers for working fox in Devonshire, England, in the mid-to-late 1800s. Rev. Russell (1795-1883), apart from his church activities, had a passion for fox hunting and the breeding of fox hunting dogs; he is also said to have been a rather flamboyant character, which probably accounts for his strain of terrier’s notability and the name of our terrier today. His first terrier, the immortal TRUMP, is said to be the foundation of John Russell’s strain of working terriers.

Everything about the Jack Russell has fox hunting in mind… coloring, conformation, character, and intelligence. The body is compact and of totally balanced proportions, the shoulders clean, the legs straight, and, most importantly, the chest small (easily spannable by average-size hands at the widest part behind the shoulders). The Jack Russell must also be totally flexible, allowing him to maneuver underground. This conformation allows the terrier to follow his quarry down narrow earths. The fox is a good model for the Jack Russell: where the fox can go, so must the terrier. Although originally bred for fox hunting, the Jack Russell is a versatile working terrier used on a variety of quarry, including red and grey fox, raccoon and woodchuck.

John Russell maintained his strain of fox terriers bred strictly for working, and the terrier we know of today as the Jack Russell is much the same as the pre-1900 fox terrier. The Jack Russell has survived the changes that have occurred in the modern-day Fox Terrier because it has been preserved by working terrier enthusiasts in England for more than 100 years; it has survived on its merits as a worker. It is the foremost goal of the JRTCA that the Jack Russell continues in that tradition.

Opposed to Kennel Club Recognition

The Fox Terrier, accepted as a kennel club breed in the late 1800s, has undergone many conformational changes as a result of the whims of the show ring, resulting in today’s Modern Fox Terrier. Conformational changes such as a deep chest, a long, narrow head structure, and extremely straight shoulders make it very unlikely that a fox terrier of today’s standard could follow a fox into a shallow earth, even if the instinct to do so remained. It is interesting to note that John Russell was one of the original founders of England’s Kennel Club in 1873; in 1874, he judged Fox Terriers in the first Kennel Club sanctioned show in London. While he remained a Kennel Club member for the rest of his life, he did not exhibit his own dogs.

There has been a great increase in the conformation showing of Jack Russell Terriers in recent years. Conformation exhibiting has been very effective in the U.S. in promoting correct conformation according to the JRTCA breed standard, thereby improving the quality of the breeding stock in this country.

However, while showing is beneficial to the breed in that respect, the JRTCA designs its trials to keep the working aspects of the terrier in the forefront. The highest awards presented to a terrier by the JRTCA are its working awards; the Natural Hunting Certificate, the Bronze Medallion for Special Merit in the Field, and the Working Achievement Award for Continued Field Service.

The JRTCA National Trial Conformation Champion is selected from the JRTCA Working Terrier Division of the National Trial; all entries have proven their working ability by earning at least one Natural Hunting Certificate in the field. JRTCA sanctioned conformation judges are required to have an in-depth, first-hand knowledge of terrier work, and to understand the importance of the physical characteristics necessary for a terrier to be useful for the work he was bred to do. These judges are required to work their terriers in the field.


Buddhism 101: Schools of Tibetan Buddhism

Nyingma, Kagyu, Sakya, Gelug, Jonang, and Bonpo

Buddhism first reached Tibet in the 7th century. By the 8th century, teachers such as Padmasambhava were traveling to Tibet to teach the dharma. In time, Tibetans developed their own perspectives on and approaches to the Buddhist path.

The list below is of the major distinctive traditions of Tibetan Buddhism. This is only a brief glimpse of rich traditions that have branched into many sub-schools and lineages. 


A monk performs a sacred dance at Shechen, a major Nyingmapa monastery in Sichuan Province, China. © Heather Elton / Design Pics / Getty Images

Nyingmapa is the oldest school of Tibetan Buddhism. It claims as its founder Padmasambhava, also called Guru Rinpoche, “Precious Master,” which places its beginning in the late 8th century. Padmasambhava is credited with building Samye, the first monastery in Tibet, in about 779 CE.

Along with tantric practices, Nyingmapa emphasizes revealed teachings attributed to Padmasambhava plus the “great perfection” or Dzogchen doctrines.


Colorful paintings decorate the walls of Drikung Kagyu Rinchenling monastery, Kathmandu, Nepal. © Danita Delimont / Getty Images

The Kagyu school emerged from the teachings of Marpa “The Translator” (1012-1099) and his student, Milarepa. Milarepa’s student Gampopa is the main founder of Kagyu. Kagyu is best known for its system of meditation and practice called Mahamudra.

The head of the Kagyu school is called the Karmapa. The current head is the Seventeenth Gyalwa Karmapa, Ogyen Trinley Dorje, who was born in 1985 in the Lhathok region of Tibet.


A visitor to the main Sakya Monastery in Tibet poses in front of prayer wheels. © Dennis Walton / Getty Images

In 1073, Khon Konchok Gyelpo (1034-1102) built Sakya Monastery in southern Tibet. His son and successor, Sakya Kunga Nyingpo, founded the Sakya sect. Sakya teachers converted the Mongol leaders Godan Khan and Kublai Khan to Buddhism. Over time, Sakyapa expanded to two subsects called the Ngor lineage and the Tsar lineage. Sakya, Ngor and Tsar constitute the three schools (Sa-Ngor-Tsar-gsum) of the Sakyapa tradition.

The central teaching and practice of Sakyapa is called Lamdrey (Lam-‘bras), or “the Path and Its Fruit.” The headquarters of the Sakya sect today are at Rajpur in Uttar Pradesh, India. The current head is the Sakya Trizin, Ngakwang Kunga Thekchen Palbar Samphel Ganggi Gyalpo.


Gelug monks wear the yellow hats of their order during a formal ceremony. © Jeff Hutchens / Getty Images

The Gelugpa or Gelukpa school, sometimes called the “yellow hat” sect of Tibetan Buddhism, was founded by Je Tsongkhapa (1357-1419), one of Tibet’s greatest scholars. The first Gelug monastery, Ganden, was built by Tsongkhapa in 1409.

The Dalai Lamas, who have been spiritual leaders of the Tibetan people since the 17th century, come from the Gelug school. The nominal head of Gelugpa is the Ganden Tripa, an appointed official. The current Ganden Tripa is Thubten Nyima Lungtok Tenzin Norbu.

The Gelug school places great emphasis on monastic discipline and sound scholarship.


Tibetan monks work on creating an intricate sand drawing, known as a mandala, at the Broward County Main Library February 6, 2007 in Fort Lauderdale, Florida. Joe Raedle / Staff / Getty Images

Jonangpa was founded in the late 13th century by a monk named Kunpang Tukje Tsondru. Jonangpa is distinguished chiefly by its Kalachakra approach to tantra yoga.

In the 17th century, the 5th Dalai Lama forcibly converted the Jonangs into his own school, Gelug, and Jonangpa was thought to be extinct as an independent school. In time, however, it was learned that a few Jonang monasteries had maintained their independence from Gelug.

Jonangpa is now officially recognized as an independent tradition once again.


Bon dancers wait to perform at the Masked dancers at Wachuk Tibetan Buddhist monastery in Sichuan, China. © Peter Adams / Getty Images

When Buddhism arrived in Tibet it competed with indigenous traditions for the loyalty of Tibetans. These indigenous traditions combined elements of animism and shamanism. Some of the shaman priests of Tibet were called “bon,” and in time “Bon” became the name of the non-Buddhist religious traditions that lingered in Tibetan culture.

In time elements of Bon were absorbed into Buddhism. At the same time, Bon traditions absorbed elements of Buddhism, until Bonpo seemed more Buddhist than not. Many adherents of Bon consider their tradition to be separate from Buddhism. However, His Holiness the 14th Dalai Lama has recognized Bonpo as a school of Tibetan Buddhism.


  • O’Brien, Barbara. “Schools of Tibetan Buddhism.” Learn Religions, Feb. 11, 2020.

Gay History: Taboo or Not Taboo, The Fashions Of Leigh Bowery

Reading The Face magazine in early 1984, I was overwhelmed by a double-page spread entitled The New Glitterati featuring Leigh Bowery photographed in his ‘Paki from outer space’ look. His face was camouflaged in bright Plasticine-blue make-up, his head adorned with a mock leather military cap emblazoned with sequins and badges, while his entire body dripped with jewels, piercings and lots of body glitter. He wore a masterful creation – a bright green velour top with plunging neckline, fitted with this amazing red, asymmetrical zipper. Bowery looked like some exotic fashion god, a contemporary Krishna put through the blender with an extraterrestrial. It was kitsch and outrageous. It was inspirational. Did Jean Paul Gaultier, John Galliano or Vivienne Westwood design these clothes? Intrigued, I wanted to know more. The writer of the article observed:

One glance at these blinding photographs reveals why designer and jovial poseur Leigh Bowery – 22 years old, Abba addict and unrepentant champion of platform shoes – chose to leave his native Australia and cultivate his own outrageous style on the fringes of London’s club scene. They just didn’t understand him in the outback.1

 I sighed … finally, an Australian designer had made it into the pages of this influential style journal. Bowery did more for Australian fashion in two pages than had occurred in the past century … and the best was yet to come.

An extra extrovert, the ultimate spectacle, the fashionable performer, the grand poseur, Bowery communicated through his blatant sexuality, his extreme physical exaggerations, and his outrageous dress codes. Bowery was not simply dressing up; it was his lifestyle and commentary on the mundane, a joke about appearance. His collections or ‘looks’ were based on himself manipulating his body with clothing and make-up. Working outside the comfort zone, he developed a clothing aesthetic that few would dare follow. Original, provocative, evolutionary; Bowery manipulated clothing to totally change one’s appearance, like a form of cosmetic surgery. ‘In an age when pop stars, actors, designers – those who traditionally dictated stylistic trends – are almost indistinguishable in their uniformity and blandness, Leigh Bowery stands out like an erection in a convent.’2

Leigh Bowery’s place in fashion, art and popular culture is seditionary. The fashions he created were not worn on the streets, very rarely seen in daylight, or generated for mass consumption. His dress style hailed from club culture,3 and the concepts of dressing up and masquerade.

Bowery was born in Sunshine – a baby-boomer, semi-industrial suburban sprawl, west of Melbourne – on 26 March 1961.4 He attended Sunshine Primary School and later, Melbourne High School. He passionately wanted to be a fashion designer and studied for two years at Royal Melbourne Institute of Technology (RMIT) before becoming disillusioned by the restrictions imposed by formal training. Fuelled by the visual culture of style magazines, Bowery was attracted to London by the new romantic/blitz movement of the early 1980s where fashion, art and music were fused under the glamorous spotlight of the nightclub scene. Pop stars and bands such as David Bowie, Steve Strange, Spandau Ballet, Duran Duran and Culture Club influenced style. This was the breeding ground for the most creative, experimental and sexually charged clothing. Clubs were the stage for dressing-up; men and women wearing outlandish garments, big hairstyles and faces plastered with make-up. Gender boundaries were easily challenged in this world and androgynous looks abounded.5 Pretty clothes and special effects like frilly shirts, kilts, lace, satin and make-up were all worn by men – gay or straight – ‘it was almost like a love affair with yourself’.6 In the 1970s David Bowie, especially with his Ziggy Stardust persona, had ‘invented a whole language of art posing, he[‘d] invented the language to express gender confusion’.7 Fantasy and escapism were attractive vehicles to express individuality through clothing, make-up and hair. The key rationale for clubbers was attracting attention and being the centre of attention: dressing up was very competitive.

In 1984 the relaunch of London Fashion Week provided a platform for British designers to show their wares. This event, combined with vibrant street styles and the underground club scene, spawned the most creative and eccentric clothes, making London a potent source for world fashion trends. It nurtured and gained recognition for the fashion designers Vivienne Westwood, John Galliano, and in recent times, Alexander McQueen and Hussein Chalayan. However, it was clubs that provided the major venue, market and audience for generating clothing that was beyond one’s wildest dreams/nightmares.

Without working from ‘classics’, referencing the cultures of the world, fashions of previous decades or centuries, emulating a favourite designer or the current pages of French Vogue, Bowery was inspired to create something that bore no resemblance to anything. Early in his career he had begun to despise fashion because it was too restrictive and conservative. Bowery’s looks were incredibly fresh and up-to-the-minute fashionable. Making items over a short period of time, for a special event or club night out, his garments were a spontaneous response to the immediacy of his environment.

I believe that fashion (where all the girls have clear skins, blue eyes, blond blow-waved hair and a size ten figure and where all the men have clear skins, moustaches, short blow-waved hair and masculine physique and appearance) STINKS. I think that firstly individuality is important, and that there should be no main rules for appearance and behaviour. Therefore I want to look as best I can, through my means of individuality and expressiveness.8

 Bowery’s costume designs were complex, technically difficult and fantastic. By 1985 they bore no similarity to the catwalk or street styles of London or the rest of the world. Vivienne Westwood initially was a great inspiration to Bowery, particularly her anti-establishment spirit, her distortion of clothing and body forms, and her design mantra that ‘clothing could be subversive’.9 Bowery garments were worn by performers like Boy George, who recalled:

I was dressed like a Jewish bathroom, gold chains, safety-pins, badges and buckles, champagne corks and tassels. The costumes designed by Judy Blame and Leigh Bowery were meant to hide my expanding girth, although it was hard to look thin in an A-line smock with angel-wings jutting out the back.10

 In 1985 Bowery evolved from a fashion designer into an aesthetic revolutionary when he became the public face of the nightclub Taboo. The name said it all. Situated in the Maximus discotheque at Leicester Square, the club was originally staged only once a fortnight. Wearing a different outfit every week, Bowery was the main attraction. Some of his kitsch looks included a

short pleated skirt, with a glittery denim, Chanel-style jacket teamed with scab-make-up and a cheap, plastic, souvenir policeman’s hat11 … yellow gingham jacket printed with red spots with matching shirt and face12 … a denim jacket covered with Lady Jayne hair slides and his bald head decorated with dribbled dyed glue. The club’s dress code was ‘dress as though your life depends on it, or don’t bother’.13

The taste for the ridiculous, and his constantly changing looks, ensured that when Bowery entered the club, everyone else looked boring. Taboo was not an exclusively gay club; however, it attracted a large gay following lured by the opportunity to be part of the outrageous fashion scene. The Taboo nightclub symbolised the excesses of the 1980s; looking fantastic was taken to extremes. Unfortunately, it closed after a year due to drug soliciting. Boy George has since turned this club phenomenon into the Broadway musical Taboo.14

Without the assistance of the slick, branded imagery associated with major fashion labels and huge marketing budgets, Bowery’s fame and reputation rested solely on being seen. His creations were documented and celebrated in the London style magazines, i-D, The Face and Blitz; his antics were reviewed in the club pages, communicating his visual language. Promoting the fringe, these magazines gave copy and editorial to the young and original, promoting an ideas culture that supported independent design.15 In Melbourne the enclave of independent fashion designers and boutiques situated in Greville and Chapel streets would proudly display the latest edition of The Face or i-D in the shop window, and they were indispensable reading in every hairdressing salon. Many Australians followed Bowery’s career and lifestyle through this source, even the interior of his flat that he shared with Trojan16 was featured – walls covered in Star Trek wallpaper, clumps of plastic flowers decorating the skirting, and UV-lit. Interviewer: ‘Does the interior of your home match the interior of your mind?’ Leigh and Trojan: ‘Yes, it’s an extension of what we wear.’17

Bowery was a great fan of the American film director John Waters, whose movies had a profound effect on the development of his dress aesthetic, his humour and body politics. Waters pushed the boundaries of taste, making films with outrageous plots and an offbeat humour merged with an unseemly collage of characters, scenery and costumes. This ‘trash’ aesthetic is best portrayed in the film Pink Flamingos, 1972, about the search for the filthiest person alive, which was Bowery’s favourite movie.18 The principal actor, Harris Glenn Milstead, who worked under the name Divine and was affectionately known as the Queen of Sleaze, was a cross-dresser and a cult figure in his own right. His huge physique was featured wearing figure-hugging gowns or sack dresses. The representation of the ‘fashionable’ unfashionable person was meticulously crafted, with huge, bouffant hairstyles and highly stylised make-up, reminiscent of the Kabuki theatre, accompanying Divine’s extensive wardrobe. This image of alternative, Baltimore glamour was one Bowery chose to follow.

The magnitude of Bowery’s costumes is unforgettable, both in physical scale and psychological effect. The Metropolitan, c. 1988 – christened by Nicola Bowery in reference to its most famous appearance at The Metropolitan Museum of Art, New York, at the opening of the Lucian Freud retrospective in 1993 – had been worn by Bowery to various events (figs 1-4, 8).19 Like many of his fashioned items, The Metropolitan was a work in progress that would simply be upgraded or reaccessorised to suit the occasion. This dress reads like a masculine ballgown; it is not intended to be a drag or transvestite costume. Gender bending was common in the 1980s; the most infamous example was the skirted male suit produced by Jean Paul Gaultier in 1985. It was an attempt to blur the distinctions between male and female dress; however, the translations into mainstream fashion were commercially unsuccessful.20

Forged from a garish floral sateen, The Metropolitan boy’s dress has a square, flat bodice with a right breast pocket and open underarms. The bodice extends into a full-face mask with cut-out holes for the eyes and mouth. The mask was a device Bowery employed to prevent ruining his clothes from greasy make-up stains; in Metropolitan he could apply make-up only to his eyes and lips. Using extensive metreage, the enormous skirt appears to hover, supported by a series of taffeta and tulle petticoats, producing a fashionable silhouette reminiscent of 1950s haute couture. Bowery was serious about the history of fashion and his private library contained many books relating to designers, including the major French couturiers Cristobal Balenciaga and Christian Dior, the pre-eminent role models who practised the very expensive, drop-dead-gorgeous philosophy of French high fashion. Restricted by money, Bowery still participated in fashion’s excesses by relying on inventive detailing and utilising entire bolts of inexpensive fabrics for the production of major works. His selection of ‘tasteless’, out of date, patterned prints purchased from discounted fabric shops defiantly challenged the grand-ballgown tradition. In this case, the floral motifs are enlivened with clusters of blue sequins painstakingly sewn on individually by Nicola Bowery in a mock Dior/Balenciaga style.21

Bowery was a professional dressmaker; he drafted patterns, cut fabric and sewed. His garments were solid constructions, strong enough to survive the rigours of clubbing. When he lived with the corsetiere Mr Pearl, they would purchase second-hand corsets, pull them apart and remake them to learn the exacting construction techniques.22 

The Metropolitan is a total disguise, providing an obvious reference to traditions of fancy dress and masquerade,23 a perfect choice for a gallery opening depicting the wearer’s naked portraits! The art world was familiar territory; Bowery visited museums and he avidly collected art books and catalogues. Bowery even played the role of an art exhibit in 1988, performing at Anthony D’Offay’s London gallery wearing a different outrageous, tasteless, memorable look each day. Bowery desperately wanted his artware to be acknowledged by this elite. Nicola Bowery intentionally named this costume The Metropolitan in the hope that it would enter that prestigious collection.

In a rather perverse way, Bowery loved fashion protocols and niceties; wearing gloves, hats, belts and shoes. Gloves were a particular favourite and an expensive item to buy, so he would often steal these to complete his ensemble. Like the leader of a militant fashion army, Bowery walked into the Metropolitan wearing a floral dress with a Kaiser helmet, a pair of khaki camouflage-print gloves, a leather neck-and-waist belt and a pair of candy-pink platform shoes, and literally invaded the space. In all the fashion galas and openings held at the Metropolitan, no one had ever seen anything like this. His entrance would have been either very funny or very frightening. Just like a scene from a John Waters movie, he stole the show. Bowery’s exposure and main recognition in the mainstream art world came through the hauntingly beautiful, naked portraits of him painted by Lucian Freud. With his curvaceous, plump body and luminescent, waxed skin and his un-made-up natural face with pierced cheeks in-filled with clear plastic plugs, this was the Bowery the art-museum world could relate to.

After 1990 Bowery stopped using fancy decorations on his clothing; instead, his work became much more abstract and surreal. During a trip to Japan he had discovered a catalogue of Transformer robots. These sophisticated toys provided a catalyst for Bowery to reconfigure his body and clothing in strange ways: he became a transformer. The Pregnant tutu head costume, c. 1992, is an experiment with scale and form (figs 5-7). In his performance pieces, Bowery had already mesmerised his audience by giving birth to Nicola Bowery on stage.24 He was fascinated by the body’s capacity to change shape, and pregnancy was the most obvious example. Bowery’s clothing rituals often involved pain, discomfort and restrictions that produced difficulties with breathing, urinating and mobility. Although not intentionally designed for sadomasochistic pleasures, he applied any device, physical or manufactured, to achieve the masterpieces of his imagination, and this was pleasure enough.

Bowery had already attempted to distort his own body with unorthodox combinations of clothing forms and the deception of make-up. The Pregnant tutu head‘s top has a protruding belly suggesting the silhouette of a pregnant woman and the continuation of the species; it is worn with stretch pants. To continue this exaggerated silhouette and reinforce the symbol of growth, Bowery crafted half-circle, fabric shoes from large pieces of foam rubber covered in brown fabric. The bulbous shoes look ridiculous, like the cartoon models worn by Mickey and Minnie Mouse. The headpiece is formed like a large pompom made from tiers of orange tulle frills zipping up the back; the wearer encapsulated in a puff of fabric. A pair of full-length, dark blue gloves complete this ensemble. Bowery’s 1990s clothing is often visually disturbing, as he experimented with costume freakery.

Since his death in 1994,25 Bowery’s contribution to fashion and style culture has begun to be assessed and acknowledged in wider forums beyond style magazines and the club subcultures. Today, the boy from Sunshine is recognised internationally as a major style icon of the twentieth century; he was ‘surely a predictor of fashion!!!!’.26 Phaidon published The Fashion Book in 1998,27 a gigantic tome devoted to the 500 leading designers who had created and inspired world fashion over the past 150 years. Only three Australians made the final cut: Colette Dinnigan, Akira Isogawa and Leigh Bowery. Bowery’s recognition came not from commercial success or as a known fashion brand, but from his creativity and originality, described in the book as ‘part voodoo part clown’. He was indexed as an icon alongside the likes of David Bowie and Johnny Rotten. Bowery was not about setting fashionable trends; however, the influence of his creations is seen in the work of designers such as Vivienne Westwood, Alexander McQueen and Hussein Chalayan, and in the conceptual approach of much contemporary fashion, reinforcing the ‘continuing importance of this experimental dimension of fashion culture’.28

An exhibition of Leigh Bowery’s work was staged in Australia in 1999: Leigh Bowery: Look at Me at the RMIT Gallery, Melbourne, curated by Robert Buckingham and designed by Randal Marsh, included original costumes, videos and photographs. For many Australians this was their primary exposure to Bowery’s work in a local context,29 and certainly, to see actual costumes was provoking. Unexpectedly, these crafted, one-off garments designed for club wear and art performance were neither pretty, fashionable nor utilitarian. Instead, they had the power and capacity to confront issues relating to appearance, sex and politics. For many viewers this experience was a revelation. Bowery’s genre was that of provocateur. The National Gallery of Victoria acquired two costumes from this exhibition and it is the only gallery in the world (at the time of writing) to represent his costumes.30 Bowery is finally an official part of Australia’s material culture.

Perhaps the best recognition and understanding of Bowery’s work is the inclusion of The Metropolitan in the inaugural hang at the Ian Potter Centre: NGV Australia, a gallery devoted to Australian art. ‘Leigh would be ecstatic’ if he knew he was part of a major public collection.31

Reference & Notes

This article focuses only on aspects of Bowery’s clothing design, in particular, the examination of his work in a broader fashion context, and does not attempt to cover his extensive repertoire, particularly his performance work or collaboration with the Michael Clarke Ballet Troupe.

1     L. White, ‘The new glitterati’, The Face, no. 48, April, 1984, p. 56. For a discussion of the influence and role of the style magazine and fashion journalism see C. McDermott, Streetstyle: British Design in the 80s, New York, 1987, pp. 81–88; for an examination of the nature of fashion journalism see A. McRobbie, British Fashion Design: Rag Trade or Image Industry?, London, 1998, pp. 151–174.

2     A. Sharkey, ‘The undiluted Leigh Bowery’, i-D, no. 42, The Plain English Issue, June 1987, p. 63.

3     ‘Because clubbing and raving are done by a narrow segment of the population after most people go to bed, the scale of the social phenomenon often goes unnoticed.’ S. Thornton, Club Cultures, Cambridge, 1995, p. 14.

4     For a complete account of Bowery’s life, see S. Tilley, The Life and Times of an Icon, London, 1997; and R. Violette (ed.), Leigh Bowery, London, 1998.

5     S. Cole, Don We Now Our Gay Apparel, New York, 2000, p. 158.

6     ibid., p. 159.

7     J. Savage, Time Travel: Pop, Media and Sexuality 1976–96, London, 1996, p. 112.

8     Tilley, p. 97.

9     McDermott, p. 26. Vivienne Westwood collaborated with Malcolm McLaren from 1971 to 1983 before embarking on a solo career.

10     B. George with S. Bright, Take It Like a Man, London, 1995, p. 521.

11     Tilley, p. 57.

12     ibid., p. 61.

13     ibid., p. 53.

14     Music and lyrics by Boy George, based on the story by Mark Davies. Directed by Christopher Renshaw. Matt Lucas, Boy George, and most recently, Marilyn, have played the role of Bowery.

15     T. Jones (ed.), Fashion and Style: The Best from 20 Years of i-D, Koln, 2001.

16     Pseudonym used by Gary Barnes, 1966–86, who described himself as an ‘artist and prostitute’. Encouraged by Bowery, he painted confronting works in a Daliesque/naive style. They lived together for several years, Bowery dressing him in his latest fashion designs.

17     F. Russell-Powell, ‘Penthouse’, i-D, The Inside Out Issue, no. 19, October 1984, p. 8.

18     Nicola Bowery, discussion with the author, 23 May 2002.

19     N. Bowery, discussion, 17 July 2002. The Metropolitan was purchased from Bowery’s widow, Nicola Bowery. She generously donated Pregnant tutu head to the National Gallery of Victoria in 1999.

20     S. Mower, ‘Gaultier’, Arena, London, July/August 1987, p. 85. Gaultier produced only 3000 suits worldwide.

21     N. Bowery, discussion, 23 May 2002.

22     N. Bowery, discussion.

23     See A. Ribeiro, ‘Fantasy and fancy dress’, Dress in Eighteenth Century Europe, New Haven, 2002, pp. 245–282. The custom of masking or disguise goes back to antiquity. ‘The masquerade provided opportunities for role-playing and subversion of propriety in defiance of the conventions of society.’ ibid., p. 245.

24     For photographs relating to Leigh Bowery’s performances, from Wigstock to his pop group Minty, and his performances with the Michael Clarke Ballet Troupe, see Violette.

25     ‘The fabulous Leigh Bowery passed away on New Year’s Eve, 1994, and London lost another mirror ball. No one knew Leigh had Aids because he didn’t want them to. He said, “I want to be remembered as a person with ideas, not Aids.”’ George with Bright, p. 566.

26     Walter Von Beirendonck, letter to the author, 14 June 2002.

27     The Fashion Book, London, 1998. See Leigh Bowery entry, p. 70; Colette Dinnigan, p. 135; Akira Isogawa, p. 225.

28     D. Gilbert, ‘Urban outfitting’, in Fashion Cultures: Theories, Explorations and Analysis, eds S. Bruzzi & P. Gibson, London, 2001, p. 9.

29     In 1987 Bowery performed with the Michael Clarke Ballet Troupe at the Melbourne Town Hall, horrifying his parents and most of the audience with his obscene acts.

30     Nicola Bowery, discussion, 17 July 2002.

31     Nicola Bowery, discussion.

Buddhism 101: What Is a Buddha? Who Was the Buddha?

Sami Sarkis / Photographer’s Choice RF / Getty Images

The standard answer to the question “What is a Buddha?” is, “A Buddha is someone who has realized the enlightenment that ends the cycle of birth and death and which brings liberation from suffering.”

Buddha is a Sanskrit word that means “awakened one.” He or she is awakened to the true nature of reality, which is a short definition of what English-speaking Buddhists call “enlightenment.”

A Buddha is also someone who has been liberated from Samsara, the cycle of birth and death. He or she is not reborn, in other words. For this reason, anyone who advertises himself as a “reincarnated Buddha” is confused, to say the least.

However, the question “What is a Buddha?” could be answered many other ways.

Buddhas in Theravada Buddhism

There are two major schools of Buddhism, most often called Theravada and Mahayana. For purposes of this discussion, Tibetan and other schools of Vajrayana Buddhism are included in “Mahayana.” Theravada is the dominant school in southeast Asia (Sri Lanka, Burma, Thailand, Laos, Cambodia) and Mahayana is the dominant school in the rest of Asia.

According to Theravada Buddhists, there is only one Buddha per age of the earth, and ages of the earth last a very long time.

The Buddha of the current age is the Buddha, the man who lived about 25 centuries ago and whose teachings are the foundation of Buddhism. He is sometimes called Gautama Buddha or (more often in Mahayana) Shakyamuni Buddha. We also often refer to him as ‘the historical Buddha.’

Early Buddhist scriptures also record names of the Buddhas of earlier ages. The Buddha of the next, future age is Maitreya.

Note that the Theravadins are not saying that only one person per age may be enlightened. Enlightened women and men who are not Buddhas are called arhats or arahants. What makes a Buddha a Buddha is that he or she has discovered the dharma teachings and made them available in that age.

Buddhas in Mahayana Buddhism

Mahayana Buddhists also recognize Shakyamuni, Maitreya, and the Buddhas of previous ages. Yet they don’t limit themselves to one Buddha per age. There could be infinite numbers of Buddhas. Indeed, according to the Mahayana teaching of Buddha Nature, “Buddha” is the fundamental nature of all beings. In a sense, all beings are Buddha.

Mahayana art and scriptures are populated by a number of particular Buddhas who represent various aspects of enlightenment or who carry out particular functions of enlightenment. However, it’s a mistake to consider these Buddhas as god-like beings separate from ourselves.

To complicate matters further, the Mahayana doctrine of the Trikaya says that each Buddha has three bodies. The three bodies are called dharmakaya, sambhogakaya, and nirmanakaya. Very simply, dharmakaya is the body of absolute truth, sambhogakaya is the body that experiences the bliss of enlightenment, and nirmanakaya is the body that manifests in the world.

In Mahayana literature, there is an elaborate schema of transcendent (dharmakaya and sambhogakaya) and earthly (nirmanakaya) Buddhas who correspond to each other and represent different aspects of the teachings. You will stumble upon them in the Mahayana sutras and other writings, so it’s good to be aware of who they are. 

Amitabha, the Buddha of Boundless Light and the principal Buddha of the Pure Land school.

Bhaiṣajyaguru, the Medicine Buddha, who represents the power of healing.

Vairocana, the universal or primordial Buddha.

Oh, and about the fat, laughing Buddha — he emerged from Chinese folklore in the 10th century. He is called Pu-tai or Budai in China and Hotei in Japan. It is said that he is an incarnation of the future Buddha, Maitreya.

All Buddhas Are One

The most important thing to understand about the Trikaya is that the countless Buddhas are, ultimately, one Buddha, and the three bodies are also our own body. A person who has intimately experienced the three bodies and realized the truth of these teachings is called a Buddha.


  • O’Brien, Barbara. “What Is a Buddha? Who Was the Buddha?” Learn Religions, Feb. 11, 2020.

The Tichborne Dole

The Tichborne Dole is an ancient English tradition still very much alive today. It takes place in the village of Tichborne, near Alresford in Hampshire, every year on March 25th, the Feast of the Annunciation (Lady Day), and dates back to the 13th century.

On her deathbed, suffering from a wasting disease which had left her crippled, Lady Mabella Tichborne asked her miserly husband, Sir Roger, to donate food to the needy every year. Her husband was reluctant, but made a bizarre agreement as to how much he would give.

Sir Roger agreed to give the corn from all the land which his dying wife could crawl around whilst holding a blazing torch in her hand, before the torch went out. Lady Mabella succeeded in crawling around a twenty-three acre field which is still called ‘The Crawls’ to this day and which is situated just north of Tichborne Park and beside the road to Alresford.

Lady Tichborne charged her husband and his heirs to give the produce value of that land to the poor in perpetuity. But aware of her husband’s miserly character, Mabella added a curse – that should the dole ever be stopped then seven sons would be born to the house, followed immediately by a generation of seven daughters, after which the Tichborne name would die out and the ancient house fall into ruin.

The Tichborne Dole in 1671

The custom of giving the dole, in the form of bread, on March 25th, Lady Day, continued for over 600 years, until 1796, when, owing to abuse by vagabonds and vagrants, it was temporarily suspended by order of the magistrates.

Local folk, however, remembered the final part of the Tichborne legend and Lady Tichborne’s curse: if the dole were stopped, a generation of seven daughters would follow, the family name would die out and the ancient house fall down. In 1803 part of the house did indeed subside, and the curse seemed to have been fulfilled when Sir Henry Tichborne, who succeeded to the baronetcy in 1821 (one of seven brothers), produced seven daughters.

The tradition was hastily re-established and has continued to this day.

Roger, Henry’s nephew, was born before the restoration of the Dole and his younger brother Alfred afterwards. Roger was lost at sea in 1854 and was impersonated a decade later by the unsuccessful Tichborne claimant, Arthur Orton (pictured at the top of the article). Alfred was the only one to survive Lady Tichborne’s curse and thus the Tichborne name did not die out.

The Dole is held every Lady Day, March 25th. The parish priest carries out the traditional Blessing of the Tichborne Dole before the flour is distributed to the local people – only those families in Tichborne, Cheriton and Lane End are entitled to the dole. They receive one gallon of flour per adult and half a gallon per child.

Lady Day itself is celebrated in honour of the Virgin Mary, as this day, nine months before Christmas, is the day of the Annunciation, when the Archangel Gabriel told her she would bear Christ. From the 12th century Lady Day was considered the first day of the year, a convention that persisted until the official calendar change of 1752.


The Murderous History Of Bible Translations

The Bible has been translated into far more languages than any other book. Yet, as Harry Freedman reveals, the history of Bible translations is not only contentious but bloody, with many who dared translate it being burned at the stake…

In 1427, Pope Martin V ordered that John Wycliffe’s bones be exhumed from their grave, burned and cast into the river Swift. Wycliffe had been dead for more than 40 years, but his offence still rankled.

John Wycliffe (c1330–1384) was 14th-century England’s outstanding thinker. A theologian by profession, he was called in to advise parliament in its negotiations with Rome. This was a world in which the church was all-powerful, and the more contact Wycliffe had with Rome, the more indignant he became. The papacy, he believed, reeked of corruption and self-interest. He was determined to do something about it.

Wycliffe began publishing pamphlets arguing that, rather than pursuing wealth and power, the church should have the poor at heart. In one tract he described the Pope as “the anti-Christ, the proud, worldly priest of Rome, and the most cursed of clippers and cut-purses”.

In 1377 the Bishop of London demanded that Wycliffe appear before his court to explain the “wonderful things which had streamed forth from his mouth”. The hearing was a farce. It began with a violent row over whether or not Wycliffe should sit down. John of Gaunt, the king’s son and an ally of Wycliffe, insisted that the accused remain seated; the bishop demanded that he stand.

When the Pope heard of the fiasco he issued a papal bull [an official papal letter or document] in which he accused Wycliffe of “vomiting out of the filthy dungeon of his heart most wicked and damnable heresies”. Wycliffe was accused of heresy and put under house arrest and was later forced to retire from his position as Master of Balliol College, Oxford.

Wycliffe firmly believed that the Bible should be available to everybody. He saw literacy as the key to the emancipation of the poor. Although parts of the Bible had previously been rendered into English there was still no complete translation. Ordinary people, who neither spoke Latin nor were able to read, could only learn from the clergy. Much of what they thought they knew – ideas like the fires of hell and purgatory – were not even part of Scripture.

With the aid of his assistants, therefore, Wycliffe produced an English Bible [over a period of 13 years from 1382]. A backlash was inevitable: in 1391, before the Bible was completed, a bill was placed before parliament to outlaw the English Bible and to imprison anyone possessing a copy. The bill failed to pass – John of Gaunt saw to that [in parliament] – and the church resumed its persecution of the now-dead Wycliffe [he died in 1384].

Shorn of alternatives, the best they could do was to burn his bones [in 1427], just to make sure his resting place was not venerated. The Archbishop of Canterbury explained that Wycliffe had been “that pestilent wretch, of damnable memory, yea, the forerunner and disciple of antichrist who, as the complement of his wickedness, invented a new translation of the scriptures into his mother-tongue”.

A page from John Wycliffe’s translation of the Bible into English, c1400. (Photo by Ann Ronan Pictures/Print Collector/Getty Images)

Jan Hus

In 1402, the newly ordained Czech priest Jan Hus was appointed to a pulpit in Prague to minister in the church. Inspired by Wycliffe’s writings, which were now circulating in Europe, Hus used his pulpit to campaign for clerical reform and against church corruption.

Like Wycliffe, Hus believed that social reform could only be achieved through literacy. Giving the people a Bible written in the Czech language, instead of Latin, was an imperative. Hus assembled a team of scholars; in 1416 the first Czech Bible appeared. It was a direct challenge to those he called “the disciples of antichrist” and the consequence was predictable: Hus was arrested for heresy.

Jan Hus’s trial, which took place in the city of Constance, has gone down as one of the most spectacular in history. It was more like a carnival – nearly every bigwig in Europe was there. One archbishop arrived with 600 horses; 700 prostitutes offered their services; 500 people drowned in the lake; and the Pope fell off his carriage into a snowdrift. The atmosphere was so exhilarating that Hus’s eventual conviction and barbaric execution must have seemed an anti-climax. But slaughtered he was, burnt at the stake. His death galvanised his supporters into revolt. Priests and churches were attacked, the authorities retaliated. Within a few short years Bohemia had erupted into civil war. All because Jan Hus had the gall to translate the Bible.

The capture of Jan Hus. Miniature of the ‘Chronicle’ of Ulrich of Richental. Prague, national library of the University. (Photo by Roger Viollet Collection/Getty Images)

William Tyndale

As far as the English Bible is concerned, the most high-profile translator to be murdered was William Tyndale. It was now the 16th century and Henry VIII was on the throne. Wycliffe’s translation was still banned, and although manuscript copies were available on the black market, they were hard to find and expensive to procure. Most people still had no inkling of what the Bible really said.

But printing was becoming commonplace, and Tyndale believed the time was right for an accessible, up-to-date translation. He knew he could create one; all he needed was the funding, and the blessing of the church. It didn’t take him long to realise that nobody in London was prepared to help him. Not even his friend, the bishop of London, Cuthbert Tunstall. Church politics made sure of that.

The religious climate appeared less oppressive in Germany. Luther had already translated the Bible into German; the Protestant Reformation was gathering pace and Tyndale believed he would have a better chance of realising his project there. So he travelled to Cologne and began printing.

This, it transpired, was a mistake. Cologne was still under the control of an archbishop loyal to Rome. Tyndale was halfway through printing the book of Matthew when he heard that the print shop was about to be raided. He bundled up his papers and fled. It was a story that would be repeated several times over the following years, which Tyndale spent dodging English spies and Roman agents. But he managed to complete his Bible, and copies were soon flooding into England – illegally, of course. The project was complete, but Tyndale was a marked man.

He wasn’t the only one. In England, Cardinal Wolsey was conducting a campaign against Tyndale’s Bible. No one with a connection to Tyndale or his translation was safe. Thomas Hitton, a priest who had met Tyndale in Europe, confessed to smuggling two copies of the Bible into the country. He was charged with heresy and burnt alive.

Thomas Bilney, a lawyer whose connection to Tyndale was tangential at best, was also thrown into the flames. First prosecuted by the bishop of London, Bilney recanted and was eventually released in 1529. But when he withdrew his recantation in 1531 he was re-arrested and prosecuted by Thomas Pelles, chancellor of Norwich diocese, and burnt by the secular authorities just outside the city of Norwich.

Meanwhile Richard Bayfield, a monk who had been one of Tyndale’s early supporters, was tortured incessantly before being tied to the stake. And a group of students in Oxford were left to rot in a dungeon that was used for storing salt fish.

Tyndale’s end was no less tragic. He was betrayed in 1535 by Henry Phillips, a dissolute young aristocrat who had stolen his [Phillips’] father’s money and gambled it away. Tyndale was hiding out in Antwerp, under the quasi-diplomatic protection of the English merchant community. Phillips, who was as charming as he was disreputable, befriended Tyndale and invited him out for dinner. As they left the English merchant house together, Phillips beckoned to a couple of thugs loitering in a doorway. They seized Tyndale. It was the last free moment of his life. Tyndale was charged with heresy in August 1536 and burnt at the stake a few weeks later.

William Tyndale being tied to a stake before being strangled and burned to death. (Photo by Hulton Archive/Getty Images)

England was not the only country to murder Bible translators. In Antwerp, the city where Tyndale thought he was safe, Jacob van Liesveldt produced a Dutch Bible. Like so many 16th-century translations, his act was political as well as religious. His Bible was illustrated with woodcuts – in the fifth edition he depicted Satan in the guise of a Catholic monk, with goat’s feet and a rosary. It was a step too far. Van Liesveldt was arrested, charged with heresy and put to death.

A murderous age

The 16th century was by far the most murderous age for Bible translators. But Bible translations have always generated strong emotions, and continue to do so even today. In 1960 the United States Air Force Reserve warned recruits against using the recently published Revised Standard Version because, they claimed, 30 people on its translation committee had been “affiliated with communist fronts”.  TS Eliot, meanwhile, railed against the 1961 New English Bible, writing that it “astonishes in its combination of the vulgar, the trivial, and the pedantic”.

And Bible translators are still being murdered. Not necessarily for the act of translating the Bible, but because rendering the Bible into local dialects is one of the things Christian missionaries do. In 1993 Edmund Fabian was murdered in Papua New Guinea, killed by a local man who had been helping him translate the Bible. In March 2016, four Bible translators working for an American evangelical organisation were killed by militants in an undisclosed location in the Middle East.

Bible translations, then, may appear to be a harmless activity. History shows it is anything but.


Meet the Fantastically Bejeweled Skeletons of Catholicism’s Forgotten Martyrs

Art historian and author Paul Koudounaris elucidates the macabre splendor and tragic history of Europe’s catacomb saints

Saint Coronatus joined a convent in Heiligkreuztal, Germany, in 1676 (Shaylyn Esposito)

Paul Koudounaris is not a man who shies away from the macabre. Though the Los Angeles-based art historian, author and photographer claims that his fascination with death is no greater than anyone else’s, he devotes his career to investigating and documenting phenomena such as church ossuaries, charnel houses and bone-adorned shrines. Which is why, when a man in a German village approached him during a 2008 research trip and asked something along the lines of, “Are you interested in seeing a dilapidated old church in the forest with a skeleton standing there covered in jewels and holding a cup of blood in his left hand like he’s offering you a toast?” Koudounaris’ answer was, “Yes, of course.”

At the time, Koudounaris was working on a book called The Empire of Death, traveling the world to photograph church ossuaries and the like. He’d landed in this particular village near the Czech border to document a crypt full of skulls, but his interest was piqued by the dubious yet enticing promise of a bejeweled skeleton lurking behind the trees. “It sounded like something from the Brothers Grimm,” he recalls. “But I followed his directions—half thinking this guy was crazy or lying—and sure enough, I found this jeweled skeleton in the woods.”

The church—more of a small chapel, really—was in ruins, but still contained pews and altars, all dilapidated from years of neglect under East German Communist rule. He found the skeleton on a side aisle, peering out at him from behind some boards that had been nailed over its chamber. As he pried off the panels to get a better look, the thing watched him with big, red glass eyes wedged into its gaping sockets. It was propped upright, decked out in robes befitting a king, and holding out a glass vial, which Koudounaris later learned would have been believed to contain the skeleton’s own blood. He was struck by the silent figure’s dark beauty, but ultimately wrote it off as “some sort of one-off freakish thing, some local curiosity.”

But then it happened again. In another German church he visited some time later, hidden in a crypt corner, he found two more resplendent skeletons. “It was then that I realized there’s something much broader and more spectacular going on,” he says.

Koudounaris could not get the figures’ twinkling eyes and gold-adorned grins out of his mind. He began researching the enigmatic remains, even while working on Empire of Death. The skeletons, he learned, were the “catacomb saints,” once-revered holy objects regarded by 16th- and 17th-century Catholics as local protectors and personifications of the glory of the afterlife. Some of them still remain tucked away in certain churches, while others have been swept away by time, forever gone. Who they were in life is impossible to know. “That was part of this project’s appeal to me,” Koudounaris says. “The strange enigma that these skeletons could have been anyone, but they were pulled out of the ground and raised to the heights of glory.”

To create Saint Deodatus in Rheinau, Switzerland, nuns molded a wax face over the upper half of his skull and fashioned his mouth with a fabric wrap. (© 2013 Paul Koudounaris)

His pursuit of the bones soon turned into a book project, Heavenly Bodies: Cult Treasures and Spectacular Saints from the Catacombs, in which he documents the martyred bones’ journey from ancient Roman catacombs to hallowed altars to forgotten corners and back rooms. Though largely neglected by history, the skeletons, he found, had plenty to say.

Resurrecting the Dead

On May 31, 1578, local vineyard workers discovered that a hollow along Rome’s Via Salaria, a road traversing the boot of Italy, led to a catacomb. The subterranean chamber proved to be full of countless skeletal remains, presumably dating back to the first three centuries following Christianity’s emergence, when thousands were persecuted for practicing the still-outlawed religion. An estimated 500,000 to 750,000 souls—mostly Christians but including some pagans and Jews—found a final resting place in the sprawling Roman catacombs.

For hundreds of skeletons, however, that resting place would prove anything but final. The Catholic Church quickly learned of the discovery and believed it was a godsend, since many of the skeletons must have belonged to early Christian martyrs. In Northern Europe—especially in Germany, where anti-Catholic sentiment was most fervent—Catholic churches had suffered from plunderers and vandals during the Protestant Reformation over the past several decades. Those churches’ sacred relics had largely been lost or destroyed. The newly discovered holy remains, however, could restock the shelves and restore the morale of those parishes that had been ransacked.

The holy bodies became wildly sought-after treasures. Every Catholic church, no matter how small, wanted to have at least one, if not ten. The skeletons allowed the churches to make a “grandiose statement,” Koudounaris says, and were especially prized in southern Germany, the epicenter of “the battleground against the Protestants.” Wealthy families sought them for their private chapels, and guilds and fraternities would sometimes pool their resources to adopt a martyr, who would become the patron of cloth-makers, for example.

Saint Valentinus is one of the ten skeletons decorated by the lay brother Adalbart Eder. Valentinus wears a biretta and an elaborate deacon’s cassock to show off his ecclesiastical status. Today, he is housed in Waldsassen Basilica in Germany.

For a small church, the most effective means of obtaining a set of the coveted remains was a personal connection with someone in Rome, particularly one of the papal guards. Bribery helped, too. Once the Church confirmed an order, couriers—often monks who specialized in transporting relics—delivered the skeleton from Rome to the appropriate northern outpost.

At one point, Koudounaris attempted to estimate in dollar terms how profitable these ventures would have been for the deliverymen, but gave up after realizing that the conversion from extinct currencies to modern ones and the radically different framework for living prevented an accurate translation. “All I can say is that they made enough money to make it worthwhile,” he says.

The Vatican sent out thousands of relics, though it’s difficult to determine exactly how many of those were fully articulated skeletons versus a single shinbone, skull or rib. In Germany, Austria and Switzerland, where the majority of the celebrated remains wound up, the church sent at least 2,000 complete skeletons, Koudounaris estimates.

For the Vatican, the process of ascertaining which of the thousands of skeletons belonged to a martyr was a nebulous one. If they found “M.” engraved next to a corpse, they took it to stand for “martyr,” ignoring the fact that the initial could also stand for “Marcus,” one of the most popular names in ancient Rome. If any vials of dehydrated sediment turned up with the bones, they assumed it must be a martyr’s blood rather than perfume, which the Romans often left on graves in the way we leave flowers today. The Church also believed that the bones of martyrs cast off a golden glow and a faintly sweet smell, and teams of psychics would journey through the corporeal tunnels, slip into a trance and point out skeletons from which they perceived a telling aura. After identifying a skeleton as holy, the Vatican then decided who was who and issued the title of martyr.

Saint Munditia arrived at the Church of Saint Peter in Munich along with a funerary plaque taken from the catacombs. (© 2013 Paul Koudounaris)

While there were doubters within the Vatican, those on the receiving end of these relics never wavered in their faith. “This was such a dubious process, it’s understandable to ask if people really believed,” Koudounaris says. “The answer is, of course they did: These skeletons came in a package from the Vatican with proper seals signed by the cardinal vicar stating these remains belong to so-and-so. No one would question the Vatican.”

The Dirt and Blood Are Wiped Away

Each martyr’s skeleton represented the splendors that awaited the faithful in the afterlife. Before it could be presented to its congregation, it had to be outfitted in finery befitting a relic of its status. Skilled nuns, or occasionally monks, would prepare the skeleton for public appearance. It could take up to three years, depending on the size of the team at work.

The talented nuns of Ennetach decorated the ribcage of Saint Felix in Aulendorf. (© 2013 Paul Koudounaris)

Each convent would develop its own flair for enshrouding the bones in gold, gems and fine fabrics. The women and men who decorated the skeletons did so anonymously, for the most part. But as Koudounaris studied more and more bodies, he began recognizing the handiwork of particular convents or individuals. “Even if I couldn’t come up with the name of a specific decorator, I could look at certain relics and tie them stylistically to her handiwork,” he says.

Nuns were often renowned for their achievements in clothmaking. They spun fine mesh gauze, which they used to delicately wrap each bone. This prevented dust from settling on the fragile material and created a medium for attaching decorations. Local nobles often donated personal garments, which the nuns would lovingly slip onto the corpse and then cut out peepholes so people could see the bones beneath. Likewise, jewels and gold were often donated or paid for by a private enterprise. To add a personal touch, some sisters slipped their own rings onto a skeleton’s fingers.

Saint Kelmens arrived in Neuenkirch, Switzerland, in 1823 – decades after the original wave of catacomb saints were distributed throughout Europe. Two nuns decorated his bones. (© 2013 Paul Koudounaris)

One thing the nuns did lack, however, was formal training in anatomy. Koudounaris often found bones connected improperly, or noticed that a skeleton’s hand or foot was grossly missized. Some of the skeletons were outfitted with full wax faces, shaped into gaping grins or wise gazes. “That was done, ironically, to make them seem less creepy and more lively and appealing,” Koudounaris says. “But it has the opposite effect today. Now, those with the faces by far seem the creepiest of all.”

Saint Felix of Gars am Inn, Germany, was regarded as a miracle-worker. (© 2013 Paul Koudounaris)

They are also ornately beautiful. In their splendor and grandeur, Koudounaris says, the skeletons may be considered baroque art, but their creators’ backgrounds paint a more complicated picture that situates the bones into a unique artistic subcategory. The nuns and monks “were incredible artisans but did not train in an artisan’s workshop, and they were not in formal dialogue with others doing similar things in other parts of Europe,” he says.

“From my perspective as someone who studies art history, the question of who the catacomb saints were in life becomes secondary to the achievement of creating them,” he continues. “That’s something I want to celebrate.”

Devoted patrons often gave their own jewelry to the saints, such as these rings worn on the gauze-wrapped fingers of Saint Konstantius in Rohrschach, Switzerland. (© 2013 Paul Koudounaris)

In that vein, Koudounaris dedicated his book to those “anonymous hands” that constructed the bony treasures “out of love and faith.” His hope, he writes, is that “their beautiful work will not be forgotten.”

Fall from Grace

When a holy skeleton was finally introduced into the church, it marked a time of community rejoicing. The decorated bodies served as town patrons and “tended to be extremely popular because they were this very tangible and very appealing bridge to the supernatural,” Koudounaris explains.

Saint Gratian, another of Adalbart Eder’s Waldassen skeletons. Here, the saint is decked out in a re-imagining of Roman military attire, including lace-up sandals and shoulder, chest and arm guards. (© 2013 Paul Koudounaris)

Baptismal records reveal the extent of the skeletons’ allure. Inevitably, following a holy body’s arrival, the first child born would be baptized under its name—for example, Valentine for a boy, Valentina for a girl. In extreme cases, half the children born that year would possess the skeleton’s name.

Communities believed that their patron skeleton protected them from harm, and credited it for any seeming miracle or positive event that occurred after it was installed. Churches kept “miracle books,” which acted as ledgers for archiving the patron’s good deeds. Shortly after Saint Felix arrived at Gars am Inn, for example, records indicate that a fire broke out in the German town. Just as the flames approached the marketplace—the town’s economic heart—a great wind came and blew them back. The town showered Felix with adoration; even today, around 100 ex-votos—tiny paintings depicting and expressing gratitude for a miracle, such as healing a sick man—are strewn about St. Felix’s body in the small, defunct chapel housing him.

As the world modernized, however, the heavenly bodies’ gilt began to fade for those in power. Quoting Voltaire, Koudounaris writes that the corpses were seen as a reflection of “our ages of barbarity,” appealing only to “the vulgar: feudal lords and their imbecile wives, and their brutish vassals.”

In the late 18th century, Austria’s Emperor Joseph II, a man of the Enlightenment, was determined to purge superstitious objects from his territory. He issued an edict that all relics lacking a definite provenance should be tossed out. The skeletons certainly lacked that. Stripped of their status, they were torn down from their posts, locked away in boxes or cellars, or plundered for their jewels.

Catacomb saints were often depicted in a reclining position, as demonstrated here by Saint Friedrich at the Benedictine abbey in Melk, Austria. He holds a laurel branch as a sign of victory. (© 2013 Paul Koudounaris)

For local communities, this was traumatic. These saints had been fixtures in people’s lives for more than a century, and those humble worshipers had yet to receive the Enlightenment memo. Pilgrimages to see the skeletons were abruptly outlawed. Local people would often weep and follow their patron skeleton as it was taken from its revered position and dismantled by the nobles. “The sad thing is that their faith had not waned when this was going on,” Koudounaris says. “People still believed in these skeletons.”

The Second Coming

Not all of the holy skeletons were lost during the 18th-century purges, however. Some are still intact and on display, such as the 10 fully preserved bodies in the Waldsassen Basilica (“the Sistine Chapel of Death,” Koudounaris calls it) in Bavaria, which holds the largest collection remaining today. Likewise, the delicate Saint Munditia still reclines on her velvet throne at St. Peter’s Church in Munich.

In Koudounaris’ hunt, however, many proved more elusive. When he returned to that original German village several years later, for example, he found that a salvage company had torn down the forest church. Beyond that, none of the villagers could tell him what had happened to its contents, or to the body. Of every 10 bodies that disappeared from view in the 18th and 19th centuries, Koudounaris estimates, nine are gone for good.

In other cases, leads—which he gathered through traveler’s accounts, parish archives and even Protestant writings about the Catholic “necromancers”—did pan out. He found one skeleton in the back of a parking-garage storage unit in Switzerland. Another had been wrapped in cloth and stuck in a box in a German church, likely untouched for 200 years.

After examining around 250 of these skeletons, Koudounaris concluded, “They’re the finest pieces of art ever created in human bone.” Though today many of the heavenly bodies suffer from pests burrowing through their bones and dust gathering on their faded silk robes, in Koudounaris’ photos they shine once more, provoking thoughts of the people they once were, the hands that once adorned them and the worshipers who once fell at their feet. But ultimately, they are works of art. “Whoever they may have been as people, whatever purpose they served rightly or wrongly as items, they are incredible achievements,” he says. “My main objective in writing the book is to present and re-contextualize these things as outstanding works of art.”

Only the head of Saint Benedictus – named in honor of Saint Benedict, the patron of the monastery – arrived in Muri, Switzerland, in 1681. (© 2013 Paul Koudounaris)

Accomplishing that was no small task. Nearly all the skeletons he visited and uncovered were still in their original 400-year-old glass tombs. To disassemble those cases, Koudounaris thought, would “amount to destroying them.” Instead, a bottle of Windex and a rag became staples of his photography kit, and he sometimes spent upward of an hour and a half meticulously examining a relic for a clear window through which he might shoot. Still, many of the skeletons he visited could not be included in the book because the glass was too warped to allow a clear shot.

For Koudounaris, however, it’s not enough to simply document them in a book. He wants to bring the treasures back into the world, and see those in disrepair restored. Some of the church members agreed with Koudounaris’ wish to restore the skeletons, not so much as devotional items but as pieces of local history. The cost of undertaking such a project, however, seems prohibitive. One local parish priest told Koudounaris he had consulted with a restoration specialist, but that the specialist “gave a price so incredibly high that there was no way the church could afford it.”

Still, Koudounaris envisions a permanent museum installation or perhaps a traveling exhibit in which the bones could be judged on their artistic merits. “We live in an age where we’re more in tune with wanting to preserve the past and have a dialogue with the past,” he says. “I think some of them will eventually come out of hiding.”


The True Story Of The Aberfan Disaster, Featured In Season 3 Of ‘The Crown’

As the event that dominates the third episode of Season 3 of The Crown, the Aberfan Disaster remains one of the most devastating losses of human life in Welsh history. On the morning of October 21, 1966, the collapse of a colliery spoil tip triggered a slurry slide that killed 116 children and 28 adults in the village of Aberfan, Wales.

Aberfan, located in South Wales, was a village whose life revolved around the nearby mining operations. As residents carried out recovery and relief efforts, Queen Elizabeth II issued a statement but resisted the advice of Prime Minister Harold Wilson to visit the site of the tragedy.

The events leading up to the Aberfan Disaster, and those in its aftermath, ultimately changed the role of royalty, the lives of countless Welsh men and women, and mining safety in Britain.

The Mine Near Aberfan Was Under The Authority Of The National Coal Board Of Britain

The Merthyr Vale Colliery included seven tips, the first of which dated back to 1869. In 1966, the colliery encircled Aberfan, a village that served as home to miners and their families. The Merthyr Vale Colliery was regulated by the National Coal Board (NCB), the overseeing body that was formed in 1947. The NCB nationalized mining in the United Kingdom, promoting the industry and setting production and distribution guidelines.

When Tip 7 of the Merthyr Vale Colliery was begun in 1958, it was built over an underground spring, creating an intrinsic instability. Several tips at the mine were built over such springs, and slips occurred repeatedly during the 1960s. In 1963, for example, an engineer at the mine warned of “danger from coal slurry being tipped at the rear of Pantglas School,” but the NCB failed to act on the warning.

Aberfan Experienced Heavy Rains That Caused A Great Amount Of Ground Instability

October 1966 was a particularly rainy month for Aberfan and the surrounding region, with roughly 60 inches falling in the weeks preceding the disaster. As water filled streams and underground springs, the slag heaps – where the mine discarded its waste – became saturated as well.

Tip 7 began to show signs of weakness during the early hours of October 21, 1966. At around 7:30 am, mine workers observed settlement at the tip, which increased over the following hours. First 10 feet, then 10 feet more – the top of the tip was slowly giving way. Reportedly, the crew took a break, intending to remedy the problem once they returned.

A Collapse At Tip 7 Of The Mine Triggered A Slurry Surge That Struck A Nearby School

The students at Pantglas Junior School arrived for classes on Friday, October 21, 1966, expecting to enjoy the last day of school before their midterm break. The night before, 9-year-old Eryl Jones had dreamed that school was canceled that day, telling her mother before she left home that morning that “something black came down all over it.”

When the school opened at 9 am, 240 students entered. However, within minutes, they heard what survivor Gaynor Madgewick described as:

A terrible, terrible sound, a rumbling sound. It was so loud. I just didn’t know what it was. It seemed like the school went numb, you could hear a pin drop. I was suddenly petrified and glued to the chair. It sounded like the end of the world had come. 

What Madgewick heard was a flood of slurry – a mixture of water, mud, and coal debris – descending the mountain toward the school. Other survivors described the sound as akin to “a jet plane screaming low over the school in the fog.”

As the slide began, one of the workers at Tip 7 observed, “It started to rise slowly at first, sir… I thought I was seeing things. Then it rose up pretty fast, sir, at a tremendous speed. Then it sort of came up out of the depression and turned itself into a wave… down towards the mountain… towards Aberfan village… into the mist.”

Children Later Recalled Struggling To Breathe While Buried Under Waste

When the slurry hit Pantglas Junior School, children and teachers alike were immediately buried under “a [slurry] wave over 12 meters high and 7 meters wide traveling at speed down the valley.” 

There had been no warning, since the telephone cables leading to the tip had been stolen. As the slurry approached the school, it wiped out everything in its path, eventually leaving 6 to 9 meters of debris. Brian Williams, 7 years old at the time, “watched the classroom wall split from the bottom to the top. The wall came through and stopped. And the next thing I remember was it went very quiet, and then a lot of screaming and crying.” Williams escaped being buried under the crumbling wall, having been shifted to another desk across the room moments before.

Survivor Jeff Edwards remembered “waking up [and] my right foot was stuck in the radiator and there was water pouring out of it. My desk was pinned against my stomach and a girl’s head was on my left shoulder. She was dead. Because all the debris was around me I couldn’t get away from her. The image of her face comes back to me continuously.”

Edwards spent the next 90 minutes listening to the “crying and screaming” of his classmates, but “as time went on they got quieter and quieter” as buried children ran out of air and died. He, too, struggled to breathe as he lay under the mixture of coal, water, and mud.

Residents And Professional Miners Alike Tried To Dig To Find Survivors

Miners, bystanders, and municipal authorities frantically rushed toward the school. When police officer Yvonne Price, 21 years old at the time, arrived, she “was rigid with shock… you could see doors, tables, kitchen utensils floating in” black water. She witnessed “people from the village passing saucepans and buckets full of debris.”

The New York Times later reported, “Civil defense teams, miners, policemen, firemen and other volunteers toiled desperately, sometimes tearing at the coal rubble with their bare hands, to extricate the children. Bulldozers shoved debris aside to get to the children. A hush fell on the rescuers once when faint cries were heard in the rubble.”

Due to her small size, Officer Price was sent through a hole in the ground to see if she could find any survivors. She found none. 

Recovery efforts continued long after cries could be heard from under the debris. Alix Palmer, a journalist at Aberfan, saw “the fathers straight from the pit… digging… no-one had yet really given up hope, although logic told them it was useless.” Every time a body was found, people would pause as a doctor made his way over to check for signs of life. The last surviving child, Jeff Edwards, was pulled to safety at around 11 am.

Men and women continued to dig, pulling 67 bodies out of the rubble on the first day. One of the teachers, David Beynon, was discovered with five children in his arms; he had tried to protect them in their final moments. Nansi Williams, the school’s dinner lady, was collecting money when the slurry hit the school, and she, too, lost her life shielding students. All five of the children she covered with her body survived.

The Bodies Of Children Were Identified By Items They Had In Their Pockets

When Reverend Irving Penberthy arrived on the scene of the Aberfan Disaster, he “stayed with the people who were watching and waiting” before taking his post at the Bethania Chapel. Soon, the chapel became a mortuary, one that received the bodies of children as they were extracted from under the slurry. Penberthy recalled watching as “fathers – it was mainly fathers, of course, not the women – just going around and lifting the blanket, and then going on further, and the shock when they finally found their own child. That was dreadful. And all we did was just cry together.”

As more and more bodies arrived, Charles Nunn, assigned as the senior identification officer at Aberfan, had to write “a description of each child or adult and detail any possessions in their pockets – a handkerchief, sweets, anything that might help with identification.” He recalled: “The little ones were laid on the pews, the adults on stretchers across the tops of the pews – males to the left and females to the right. By about the fourth or fifth day we had to start taking bodies up a difficult winding staircase to the upstairs gallery.”

While many of the children perished as a result of asphyxiation, some bodies were deemed unsuitable for viewing due to extensive injuries. In a letter to her mother, journalist Alix Palmer wrote, “the slag had had time to corrode the skin of the children still buried and many brought out burned could only be identified by the clothing or things in their pockets. One little boy… was identified by a slip of paper with his name on deep inside his wallet.”

The Queen Resisted Efforts To Get Her To Visit The Site

As details of the disaster emerged and bodies continued to be pulled from the debris (dozens on the first day alone), Queen Elizabeth II resisted pleas to visit Aberfan. Just as it was depicted in the third season of The Crown, the monarch opted to send a proxy – her husband, Prince Philip.

In her initial statement, she expressed sadness and sorrow. While the show indicated a lack of emotion on the part of the queen, it’s been asserted that she didn’t want to pull attention and resources away from rescue efforts. She was said to have insisted, “People will be looking after me… perhaps they’ll miss some poor child that might have been found in the wreckage.”

The British government was represented by Prime Minister Harold Wilson and by Lord Snowdon (Antony Armstrong-Jones), Princess Margaret’s husband. The latter, according to Wilson, “made it his job to visit bereaved relatives… sitting holding the hands of a distraught father, sitting with the head of a mother on his shoulder for half an hour in silence.”

Prince Philip spent two hours with relatives of victims, surveying the site, and visiting the cemetery where more than 81 children had already been laid to rest. 

The Queen Did Make Her Way To Aberfan, Visiting The Day After The Last Body Was Recovered

Queen Elizabeth II arrived in Aberfan more than a week after the disaster struck, and only one day after the last body was retrieved from the debris. When she and Prince Philip toured Aberfan on October 29, 1966, they were both visibly moved by the experience. As a young child handed Elizabeth a flower – “From the remaining children of Aberfan” – the stoic queen was said to have been on the brink of tears. According to Jeff Edwards, the last child to be found alive, “We know she did cry, because she went to Jim Williams’ house – and when she came down from the cemetery she was visibly crying.”

When the queen spoke to her subjects at Aberfan, she told them, “As a mother, I’m trying to understand what your feelings must be…  I’m sorry I can give you nothing at present except sympathy.” The queen’s former private secretary, Lord Charteris, told author Gyles Brandreth that not going to Aberfan earlier was one of her biggest regrets.

Survivors see her visit differently, however. Edwards, again, noted, “When she did arrive she was visibly upset and the people of Aberfan appreciated her being here. She came when she could and nobody would condemn her for not coming earlier, especially as everything was such a mess.” Marjorie Collins, the mother of one of the victims, similarly saw the visit as a supportive endeavor, observing, “They [Prince Philip and Queen Elizabeth] were above the politics and the din and they proved to us that the world was with us, and that the world cared.”

The Disaster Could Have Been Prevented Had Earlier Concerns Been Addressed

In his comments about the disaster at Aberfan, the chairman of the National Coal Board (NCB), Lord Robens, noted the impossibility of knowing “that there was a spring in the heart of this tip [meaning Tip 7].” 

The inquest and tribunal into the cause of the slide that took 144 lives thought otherwise, calling the event “a terrifying tale of bungling ineptitude by many men charged with tasks for which they were totally unfitted, of failure to heed clear warnings, and of a total lack of direction from above.”

The tribunal took place over 76 days, interviewing 136 witnesses and examining 300 exhibits. Earlier concerns about the tips were made very clear, as was the lack of NCB policy when it came to safely installing tips. In his testimony, Lord Robens ultimately admitted fault by the NCB, something with which the tribunal agreed, concluding in 1967:

Blame for the disaster rests upon the National Coal Board. This is shared, though in varying degrees, among the NCB headquarters, the South Western Divisional Board, and certain individuals… The legal liability of the NCB to pay compensation for the personal injuries, fatal or otherwise, and damage to property, is incontestable and uncontested.

No malice or criminality was found, but it was determined that the entire disaster could have been avoided but for “ignorance, ineptitude and a failure in communications.”

New Legislation Was Introduced In 1969 To Tighten The Oversight Of Mines 

Mining regulations became increasingly stringent in the years after Aberfan. New legislation was, according to Prime Minister Harold Wilson in 1967, “desirable” in light of the recommendations made by the tribunal. When Wilson saw the findings of the Aberfan tribunal, he was shocked and deeply concerned by their “devastating nature.”

In 1969, two years after the tribunal’s findings, Lord Robens began heading the efforts that ultimately resulted in the Health and Safety at Work Act 1974, legislation that continues to regulate mining in the United Kingdom. Although Robens had offered his resignation to the NCB, it was rejected by members of Parliament and Prime Minister Harold Wilson – something that only deepened the vilification of Robens in the eyes of the disaster’s victims.

In addition to the 1974 act, the Mines and Quarries (Tips) Act of 1969 and the subsequent Mines and Quarries (Tips) Regulations of 1971 also brought standardization to tip building, construction, and management. According to the latter, any tipping activities required plans “showing all mine workings (whether abandoned or not), previous landslips, springs, artesian wells, watercourses and other natural and other topographical features which might affect the security of the intended tip or might be relevant for determining whether the land on which the tipping operations are to be carried out is satisfactory for the purpose.”

In 1999, additional quarry regulations were put into effect, tightening oversight of waste materials including, “but… not limited to, overburden dumps, backfill, spoil heaps, stock piles and lagoons.”

Families Impacted By The Disaster Were Paid £500 By The National Coal Board 

A fund to support Aberfan and its community was established almost immediately after the disaster. A total of £1,750,000 – a sum worth more than £20 million today – was raised to rebuild the village and pay for medical care. Because the National Coal Board (NCB) refused to pay for the removal of the tips that still sat high above Aberfan, the money was used to bring those down, as well. In 1997, the British government repaid Aberfan the £150,000 from the fund that went toward the tip removal. 

The NCB offered each of the families impacted by the disaster £50 as an opening payment, a sum that later rose to £500. The Charity Commission, which oversaw the disaster fund, once considered asking parents, “Exactly how close were you to your child?” before paying out – presumably, parents who were not close to their children would not receive compensation – but decided against that option. The “generous offer” of £500 was paid to the families in 1970.

Money would not cure the psychological scars in Aberfan, however. Survivor Jeff Edwards continues to struggle with survivor’s guilt, while families in Aberfan experienced a “strange bitterness between [those] who lost children and those who hadn’t; people just could not help it.” Post-traumatic stress disorder plagues the entire community and, while psychiatrists were initially brought in, “They didn’t really know how to deal with it and it wasn’t much help. There were sessions and we were offered different drugs.” 

Thirty-three years after the disaster, researcher Louise Morgan found that survivors “talked about the fear evoked at the sound of a lorry passing their house, or of an aircraft flying overhead. Intense memories are aroused by the slightest noise or smell. A number now have children the age they were. This seems to arouse new feelings.”

The Queen Made Repeated Visits To Aberfan In Support Of The Community

Queen Elizabeth II may have received criticism for delaying a trip to Aberfan in 1966, but she has made numerous trips to the Welsh town in support of its recovery. In 1973, she visited to attend the opening of a new community center and placed a wreath at a local memorial. While there, she called the community center “a symbol of the determination that out of the disaster should come a richer and fuller life.”

When she returned in 1997, she planted a tree in the Garden of Remembrance, again speaking to survivors and relatives of those who perished.

Another visit in 2012 saw the queen opening a new school, something that, according to Elaine Richards, was part of a promise Elizabeth had made decades earlier. Richards, who lost her daughter Sylvie in 1966, noted, “She kept her promise, she is a very gracious lady… Now we have children playing in the village again.”