
‘They call us the fatherless ones’: the trauma of families devastated by the infected blood scandal will last for generations


On the day of her uncle’s funeral in 1995, Jane’s life changed forever.* That was when she found out her uncle Edward, a person with haemophilia, had been infected with human immunodeficiency virus (HIV) from the treatment he was taking for his condition.

Adding to the family’s pain, the stigma that surrounded HIV and the disease it causes, Aids – because of its association with homosexuality and drug addiction – meant they kept the cause of Edward’s death to themselves. At the same time, they knew that Jane’s father, Roy, also had haemophilia and had been receiving the same treatment as his brother.

A rare genetic condition means that throughout their lives, people with haemophilia – of whom there are around 6,000 in the UK – must seek medical care when they bleed because one of their key blood clotting proteins, factor VIII or IX, is either partly or completely missing. In the 1970s and 80s, a new treatment to give people with haemophilia their missing protein using concentrated blood plasma was seen as potentially life-changing. In fact, it dealt many of them a death sentence.

Read more: Infected blood scandal – what you need to know

The factor VIII concentrate was supplied by US pharmaceutical companies. Donors were paid for their blood, and much of it came from communities at higher risk of carrying infectious disease, including drug addicts and people in prison.

Gradually, haemophilia communities on both sides of the Atlantic noticed some among them were getting sick from a mysterious new virus. The first death of a person with haemophilia from Aids occurred in the US state of Florida in January 1982. The following year, both the Lancet medical journal and the World Health Organization published recommendations that people with haemophilia should be warned of the new health risks they faced – which also included infection with hepatitis C, a potentially deadly virus that affects the liver. Yet no such warnings were given.

While Edward soon became ill with HIV, Jane’s father did not reveal his hepatitis C infection, even to his daughter, until she was 18. He later died from liver cancer. Jane recalls:

My dad died ten years ago now – it’s nearly his anniversary. When he died, I went back to the doctors and said: ‘Do you think the hepatitis has caused the issues with his liver?’ The room fell silent. I didn’t need an answer. Their body language, their silence, told me everything I needed to know.

Jane says her father’s mistrust of doctors and medical advice meant he avoided the factor VIII treatments unless he really needed them, and “in some respects that prolonged his life” by limiting the amount of infected concentrate he was subjected to. One of Jane’s earliest memories is of him refusing to go to hospital, despite intense pain from a bleed into his joint. But each of these bleeds caused new damage to Roy’s body, resulting in increasing pain and disability as his life went on.

This article is part of Conversation Insights
The Insights team generates long-form journalism derived from interdisciplinary research. The team is working with academics from different backgrounds who have been engaged in projects aimed at tackling societal and scientific challenges.

The societal stigma surrounding Aids meant many people with haemophilia lived with their infections in silence – assuming, that is, they were aware of their diagnosis. Another shocking aspect of this global contaminated blood scandal is that often, the victims weren’t being told the truth themselves.

During a recent conversation with her mother, Jane discovered that, for a long time, her father and uncle had not been told of their infections by doctors who by then knew about the problem of contaminated blood, leaving her family at risk of catching hepatitis C and her uncle at risk of passing on HIV. In her father’s case, it was only when the NHS notified him, in 2004, that factor VIII concentrate carried a very small risk of Creutzfeldt-Jakob disease (CJD) – a rare and fatal brain disease related to what is known in the UK as “mad cow disease” – that he was also informed of his hepatitis C infection. Jane recalls:

My dad was like: ‘Excuse me, what?’ It was the same for my uncle Edward. There was no formal notification [of his HIV diagnosis] – the doctors and nurses just suddenly started wearing a lot of blue gloves around him.

Jane’s own story encapsulates the multigenerational impact of the infected blood scandal, which I (Sally-Anne) have researched with colleagues at the University of Gloucestershire. Jane carries the haemophilia gene, which is passed from mother to son with a 50% chance, and one of her two sons has haemophilia. Jane recalls the moment she told her father Roy, who was already infected and unwell with hepatitis C, that she was having a son:

We bought a blue romper suit and I took it home and gave it to my dad. He opened the bag and just threw it back at me. He went: ‘No, I can’t deal with this.’ And that’s not okay – he should have been proud, excited.

When Jane’s son was born, it was difficult for the family to face up to the treatments for haemophilia that would be a regular part of his life. She recalls her father “holding our newborn child, begging me not to ever let him have these treatments”.

‘A criminal cover-up on an industrial scale’

The infection of people with haemophilia is just one aspect of the global contaminated blood scandal – which in the UK is regarded as the “worst treatment disaster in the history of the NHS”. In total, around 30,000 NHS patients were infected with HIV and hepatitis C between 1970 and 1991, either through contaminated blood products such as factor VIII and IX or blood transfusions during surgery, treatment and childbirth.

Recently Sam Roddick, daughter of Body Shop founder Anita Roddick, wrote in the Sunday Times about a “chain of decisions that were morally unlawful” which led to her mother contracting hepatitis C from a blood transfusion after giving birth to Sam in 1971. The blood used for transfusions, which is donated for free in the UK, was not routinely screened for HIV until 1986 and hepatitis C only five years after that.

One person still dies every four days in the UK as a result of having received contaminated blood. An estimated 26,800 people became infected with hepatitis C and 1,243 with HIV. Of those infected with HIV, 380 were children – more than half of whom have died. Following earlier inquiries by Lord Archer and the Scottish government (which was branded a “whitewash” by some of those affected), the UK’s infected blood public inquiry was finally announced by the then-UK prime minister, Theresa May, in July 2017. She called the scandal an “appalling tragedy which should simply never have happened” – adding:

Today will begin a journey which will be dedicated to getting to the truth of what happened and in delivering justice to everyone involved.

A few months earlier, in his final speech as an MP in April 2017, Labour’s former health secretary Andy Burnham had described the scandal as a “criminal cover-up on an industrial scale”, suggesting there might be a case for corporate manslaughter charges. Of people like Jane’s father and uncle with haemophilia, Burnham said:

The Department of Health, and the bodies for which it is responsible, have been grossly negligent of the safety of people in the haemophilia community over five decades.

Like those of so many family members, Jane’s life plans as a young woman were turned upside down by her father’s illnesses. One of hundreds of witnesses heard during the seven-year inquiry, Jane wants the long-awaited final report, which will be published on May 20, to recognise the suffering of all those affected by the scandal, explaining:

I don’t think there’s been any real recognition for the families and what they’ve been through. People and families in particular have been destroyed by this. I was at university trying to be a teacher but dropped out, much to my university’s dismay. I wanted to be at home to stay with dad. There’s a generation of us that have lost our families – they call us ‘the fatherless ones’.

Many of those affected by the scandal blame the UK government and NHS trusts, which they claim knew about the potential infection risk but did not share that information with those taking the new treatment.

Deaths, loss, and continued denial

In January 1982, one of the UK’s leading experts in haemophilia, Arthur Bloom, co-wrote an infamous letter to haemophilia centres throughout the country, telling them that it was very important to ascertain whether a new American blood product already being given to people with haemophilia in the UK showed reduced levels of hepatitis C. “As far as we know,” he wrote, “the products have been subjected to a heat treatment process”, adding:

Although initial production batches may have been tested for infectivity by injecting them into chimpanzees, it is unlikely that the manufacturers will be able to guarantee this form of quality control for all future batches.

This method of producing factor VIII protein involved taking large amounts of blood (up to 40,000 units) from many different people and reducing this to a concentrate that could be easily self-injected at home. Bloom suggested “the most clearcut way” of testing the infectivity of the new heat-treated product was on patients requiring treatment who had not been previously exposed to large-pool concentrates – including children.

One of the children treated by Bloom himself at the University Hospital of Wales was Colin Smith, who had haemophilia and weighed just 13 pounds when he died of Aids in 1990 at the age of seven. He was a year old when he was given the factor VIII treatment, and his HIV status was confirmed at two-and-a-half. The stigma of HIV meant the family were shunned by many in their community, including having the words “Aids dead” painted on the side of their house in six-foot high letters. As Colin’s mother, Janet Smith, recently told BBC Wales:

We were known as the Aids family … We’d have phone calls at 12, one o'clock in the morning, saying: ‘How can you let him sleep with his brothers? He should be locked up, he should be put on an island’… He was three.

The same BBC investigation found evidence that Bloom had ignored internal NHS guidelines, written by his own department, that discouraged the use of the imported factor VIII treatment on children because of the risk of infection. Bloom was clearly aware of the risks when he began treating Colin in the autumn of 1983. “This wasn’t an accident,” Colin’s father said. “It could have been avoided.”

None of the young patients, known as “previously untreated patients”, or their parents knew at the time that they were part of a nationwide experiment. Documents subsequently released reveal that the UK government funded some of these studies – including one of pupils at Treloar’s College, a specialist school in Hampshire with an NHS haemophilia unit on site. Of the 122 pupils with haemophilia attending the school between 1974 and 1987, 75 are reported to date to have died as a result of HIV and hepatitis C infections.

By 1984 – just over two years after the first death from Aids in the UK – government experts were aware that people receiving American factor VIII blood concentrate were at risk of HIV infection. Yet despite the mounting evidence, denials and silence continued well into the 1990s.

Trevor Graham, one of the hundreds of contributors to the infected blood inquiry, spoke to us about his father, who had haemophilia and died in 1991 when Graham was only 13. “We had no idea at the time he had died of Aids,” Graham explains. “We thought he died of a brain haemorrhage, as that was what the doctors treating dad at the Manchester Royal Infirmary told my mother.”

Yet for the four years before his death, Graham’s father had been unable to work and sought the support of the Macfarlane Trust, a discretionary grant-making trust that was set up and funded by the then-Department of Health to “alleviate the financial needs of those haemophiliacs infected with HIV through contaminated NHS blood products”, and also their families. Graham says:

It is heartbreaking to read the letters my dad wrote requesting assistance, one of which states that he was concerned about Christmas presents for myself and my sister. In that letter, he stated he was HIV positive and couldn’t work as a result of his infection.

Despite there being no reference to HIV on their father’s death certificate, Graham says rumours soon spread around their school and local community. Once again, the legacy of this infection continues to affect following generations:

My sister and I were bullied at school. People said that our dad was gay and that he died of Aids. Mum became agoraphobic when I was 13 and was advised to see a psychiatrist, but in her grief she refused. I was suffering from hidden anxiety as a young teenager and developed a stutter. The anxiety and bouts of depression have never left me since my dad passed away. Even 30 years later, I still struggle with my mental health.

A monster arrives

“The monster arrived as a wolf in sheep’s clothing,” writes Elaine DePrince in her moving memoir about the contaminated blood scandal in the US, Cry Bloody Murder. The monster was factor VIII concentrate created from blood infected with HIV and hepatitis C. Three of her sons had haemophilia; all three would die slow, painful deaths due to Aids, having been infected by the treatment that was meant to help them lead normal lives:

When Teddy died, he was the last of our three boys with hemophilia and Aids to leave us. He was the last of our three little boys, our three musketeers … He was 24 years old, and it seemed like he had lived forever with Aids.

In the book, DePrince, whose family were living in a suburb of Philadelphia, describes an earlier conversation with her husband when a warning label finally appeared on vials of factor VIII concentrate. She pointed out there was no need to worry, as all three of their sons with haemophilia were already infected with HIV.

As their youngest son Cubby’s condition worsened, he wrote a list to ease his concerns about other children getting Aids, at a time when it was untreatable, entitled “64 reasons why you do not want to get AIDS”. These included:

If your liver gets too big, you have to sit half-lying down and half-sitting up. Then it’s hard to paint your model airplanes because the paint drips on your stomach.

The battle to gain justice took DePrince from writing letters to campaigning for a change in the law and writing a book to explain the reality of the contaminated blood scandal and her family’s suffering from it. She concludes:

I cannot repress my sorrow, my pain, and my rage … The FDA [US Food & Drug Administration] failed my children. The blood-banking industry failed them. Government agencies failed them. The law failed them.

Jonathan is a haematologist in the US who comes from a family of men with haemophilia. When he was around seven years old, in 1989, both his uncles were infected with HIV. One died in 1992 and the other shortly afterwards. “Our family and the haemophilia community were ravaged – we lost an entire generation. I had to watch my uncles deteriorate over the years.”

Jonathan, who also has haemophilia, grew up in a rural suburb in Illinois. He reflects on how that made getting treatment all the harder for his uncles:

It turns out that not only was there the contaminated supply that ravaged an entire generation of people with haemophilia and other severe bleeding disorders, but there wasn’t even equal access to care in the US at that time. Growing up in the Midwest, we didn’t have the same HIV therapies available on the east and west coasts of the US, where HIV research was being done. Some of the medical innovations at that time really did not penetrate the heartland of the US like it did on the coasts. So, I just had to watch my uncles deteriorate.

Jonathan himself was “only” infected with hepatitis C from his treatment. He says “that actually made me feel guilty – why was I spared [from HIV and Aids]? You know, everyone else is dying. Why should I be alive?”

The experience drove him to become a doctor in haematology, in order to try to make the experience better for other families like his:

People have been left to suffer. I grew up not knowing if I was going to live. The sad thing now, being a physician, is that HIV is such a manageable disease now.

The fight for justice

Across the world, many people have devoted their lives to fighting for justice for all those affected by the contaminated blood scandal. In the UK, groups such as TaintedBlood, Birchgrove Group, Factor 8, BloodLoss Families, Contaminated Blood Campaign, Contaminated Whole Blood UK and many others have continued the brave battles of the early whistleblowers and campaigners.

Jason Evans’ father Jonathan, who had haemophilia, was infected with HIV and hepatitis C and died in 1993 aged 31, when Evans was four. Evans has been campaigning for justice for his father and others for more than a decade, using freedom of information requests to unearth documents relating to the scandal. In one shocking memo from 1985, a UK government official discussed the financial implications of the fact that many people with haemophilia who were infected with HIV would soon die:

Of course, the maintenance of the life of a haemophiliac is itself expensive, and I am very much afraid that those who are already doomed will generate savings which more than cover the cost of testing blood donations.

Evans, the founder and director of the campaign group Factor 8, is leading a legal action against the UK government on behalf of more than 500 people. The group has been granted permission to seek damages in the High Court, but the case is currently on hold pending the inquiry’s final report on May 20.

Evans has expressed concern that ministers are “seeking to water down” the inquiry’s strong recommendations from the interim reports. He recently told the Guardian:

What I want from the inquiry is for it finally to be on the official record that what happened was entirely preventable and was motivated by unethical practices. For decades, the line from government was that this was an unavoidable accident that no one could possibly have foreseen – that no one did anything wrong.

In November 2022, “interim” compensation payments of £100,000 each were made to around 4,000 infected people or their bereaved partners in the UK (on top of an “ex gratia payment” by the government in 1990 of £20,000 or £25,000, depending on how badly a patient’s body had been damaged by their infection). But this has left many others affected by the scandal, including those who have lost their children or parents, without any compensation – along with those whose death left nobody behind to claim.

However, a recent amendment to the Victims and Prisoners Bill added a requirement for the UK government to set up a compensation scheme within three months of the bill’s passing on May 1. On May 5, The Times reported that ministers were preparing a compensation package of at least £10 billion for contaminated blood victims, with the details to be announced after the public inquiry’s report is released.

Two court cases are in progress in the UK: the one led by Evans, and another against Treloar College brought by 36 former students, who claim the school breached its duty of care by experimenting on them and giving the treatment without discussing the risks with them or their parents. In 2023, in testimony to the inquiry, the college’s former headteacher, Alec Macpherson, admitted that doctors at the school were “experimenting with the use of factor VIII”.

Elsewhere, criminal proceedings were brought against government officials and executives in pharmaceutical companies as long ago as the 1990s, with French and Japanese officials being given prison sentences. In 1997, Bayer and the other three manufacturers of the factor VIII concentrate paid out a total of US$660 million (around £1 billion in today’s prices) to the estimated 6,000 people with haemophilia who were infected in the US.

There is also the potential for criminal charges or other consequences for those involved in the UK scandal. It is possible that those identified as responsible may be charged with gross negligence manslaughter, and, in the case of collective fault of an organisation, corporate manslaughter charges could be brought. Individuals who supplied the contaminated blood could be prosecuted for grievous bodily harm.

Campaigners often use the phrase “justice delayed is justice denied” – not least because a person infected with contaminated blood still dies every four days in the UK. But the effects of this medical scandal will be felt for years and generations to come – and whatever the outcome of the inquiry, campaigners will continue to fight for justice. As Evans explained when he was nominated for an award in 2021:

I think something that fuelled our renewed campaign was a new energy, particularly from those whose parents had died. We were grown up now and we were angry. I think that energy spread to the older campaigners who had been let down by the government time and time again.

This complex, seven-year inquiry was forced to delay its final report for five months to allow the many people and organisations referenced sufficient time to respond. Some victims have found out things they did not know about their treatment. Others have called for national memorials for the victims in each of the UK’s nations – including one specifically for the children infected at Treloar College.

The inquiry has affected people in different ways. Some have felt compelled to attend every sitting. Harrowing testimony has been heard throughout – not least when Colin and Janet Smith spoke about their son Colin, the youngest person to have been infected in the UK. His father told the inquiry:

There’s no way a child should have to die the way Colin did. It wasn’t pleasant. It still affects us now. But it’s not just our son – there’s lots of children who have had to go through that … I would cope with death, but not with the death of my son. I still have trouble today; the fact that he’s in a grave on his own. The guilt will never go away.

*Some names in this article are pseudonyms, created to protect the identity of our interviewees.



Mass production of ornamentation and its recent decline | MetaFilter


I am grateful that my house was built in 1876, because the roofline is very ornate, and the interior of the house has all sorts of plaster crown moulding. Do I love plaster from a home maintenance perspective? I do not. But the moulding is gorgeous. There are also some killer ceiling medallions.
posted by grumpybear69 at 12:20 PM on May 17 [3 favorites]

Paint the goddamn things for fucks sake. Every single fuckin photo in there looked naked, where the fuck are the painters!! Why is nothing painted in the world anymore, and if it is, the blandest mono colour they can manage. There are hundreds of millions of painters roving around the world with nothing to paint on. Every grey concrete surface you see is a sign of failure, a blank canvas humanity has organized itself so pathetically comically poorly that the surface will never be painted, and if it is, the society will bizarrely pay to remove it, determined to have as boring and ugly as fuck a world as this wretched and wonderful species wants it to be. I'm sick of trying to appreciate the "natural" beauty of stone and cement and I'm sick of acting like "natural" means anything, especially in the context of an artificial construction, and covering them with paints that are equally as natural as everything a human ever uses, or whatever that irritating term was ever meant to mean. Sloppy aimless rant but the oppressive greys of this world really get me red in the face.
posted by GoblinHoney at 12:38 PM on May 17 [21 favorites]

It's a good article. Not sure if I agree with its conclusions, but thought-provoking and worth engaging with.
posted by biogeo at 1:40 PM on May 17 [2 favorites]

My dream is that sometime soon green ornamentation will become the new modern, new building design will incorporate creative ways for plants to ornament as many external surfaces as possible, and buildings without it will look naked and old-fashioned.
posted by trig at 2:02 PM on May 17 [9 favorites]

The Baha'i temple is quite beautiful. Part of what makes it so is that it is itself a kind of ornamentation, overlooking the lakeshore.
posted by HearHere at 2:09 PM on May 17 [3 favorites]

This is fantastic, and a beautiful website I had never seen before - thank you!
posted by superelastic at 2:35 PM on May 17 [1 favorite]

That’s a really good point, rebent. You can buy a fiberglass Corinthian column for your front porch for a couple hundred bucks, but there’s no cheap way to obtain a 20-foot floor-to-ceiling window.
posted by Just the one swan, actually at 3:39 PM on May 17 [1 favorite]

This article suffers from a sort of humanities version of engineers' disease. It's all about the details rather than the actual understanding of the underlying problem. From ancient times ornamentation has had several functions. The most basic architectural function is that when two building components meet, there will be some sort of a seam (I'm not sure I'm using the correct terminology in English, but I hope you get my point), and this seam was very hard to make perfect. This was not only an aesthetic issue, cracks are where the light gets in, but also where moisture, dirt, cold air and pests get in. So you would cover the seam with a profile that could in a way connect and close the components. This is why one would get the most ornamentation everywhere things met: around doors and windows, where the wall met the ceiling and the floor, and around the hooks that carried chandeliers or wall-mounted lamps. On the outside of the building, the critical points were where the building met the ground, and where the walls met the roof, and again around openings. For architects, it could be very interesting to use these details as a form of expression. The classical orders represented different human or godly properties, like strength or bounty.

The orders are very interesting. Vitruvius, writing in the 1st century BC, describes three. I think we have a couple more that are broadly seen as classical, and then there are of course plenty of others in other cultures. But the point of the orders is that they define a system. Imagine a building site in classical antiquity. The architects and clients and some other people were very learned people with lots of knowledge of international architecture that they found through travels. But the majority of workers were illiterate. There were no blueprints, and while there definitely were drawings and models on site, they weren't spread out all over the place. So there had to be a common language that could be conveyed to everyone on site: the orders. An architect and contractor could enter the site and tell everyone: we are going to build a Doric temple, with these basic measurements like this model, and then everyone would know what to do, because the system covered every aspect of the building: the general outline, the columns and their decorations and all the other details. Very cool.

Then on top of the system, there were the functions of symbolism, including showing one's wealth and/or purpose in life. This is where the decorations on the surfaces come in, including stained glass from the late Middle Ages onward. Sometimes the client would have a very strong desire to have narratives in the space, and downplay the spatial interest in favor of rich paintings or tapestries. The Sistine Chapel is pretty boring, spatially, and I am inclined to believe this was on purpose, because the Popes really wanted to send a message through the imagery. But the images could also take the form of carvings or stucco. All of these images had their own life, independently of the architecture, though the artists and artisans would most often work with the space in different ways. Also, the decorations weren't always figurative, since color and materials had meanings in themselves. In Islamic architecture, depictions of humans and animals are often not allowed, so the decoration may be a mix of calligraphy and geometric designs, both praising God.

All good. This hierarchy of a construction system (which obviously changed over time) and a meaningful decoration worked fine for at least 4000-ish years. Then during the 18th century it began to fall apart, mostly because of the beginnings of industrialization – though at first NOT because of the industrialization of building parts, but because of the new generations of wealthy people who felt less attached to the old orders. This is not a pun. There is a reason we use order to describe societal rigor as well as architectural systems. The people of the Enlightenment were not convinced that the old systems and moral narratives were appropriate ways of understanding the world, and they began to challenge the conventions, with oriental follies and decorations that had no other meaning than to delight the spectator. Classicism didn't disappear, but it became a style, alongside all the other historical and global styles.

Then during the 19th century, building components did become industrialized, and relatively cheap. Everyone could have all the ornaments, and they mostly did. But in that new context, the original purposes and meanings of the ornaments and decorations were almost entirely lost. Ornaments were just thrown randomly all over facades and interiors. There were some heroic attempts to return to order, for instance by Louis Sullivan in Chicago and Adolf Loos in Vienna. Contrary to how they are read today, they were both architects who fully mastered their order and ornamentation. People forget that the original purpose of the Bauhaus was to educate artisans to build future cathedrals. And there are still architects who work in that tradition. But mostly it was a vulgar mess and a lot of really bad construction. The reason we don't know so much about it is that a lot of 19th century buildings have been torn down because they were unsafe.

Young architects during the first decades of the 20th century dreamt of returning to the local vernacular architectures of the different regions: using local materials and methods, and letting the meaning grow out of the process and functions.

After WW1, some realized that things were completely different. The shapes of the old orders had grown organically out of timber and stone construction. What could it mean that construction in the future would be based on steel and concrete? What properties do these materials have that in their own way can form the basis of a new organic order? They knew that it was possible to make a cast-iron Corinthian column, but also that that column would lack the beauty and precision of a column carved in stone. They knew it was possible to cast a profile in concrete, but also that it would lack the luminance and delicacy of a plaster molding. On the other hand, they knew from engineering works that steel especially could accommodate a very high degree of precision in assembly, even with standard components. And that concrete could be shaped into organic forms that had never been seen before.

In these buildings, ornaments would have undermined the narrative and the order.

This is too long, but I need to write a little bit about the curtain wall. Most buildings now are built on the principles of the curtain wall, even if the walls are made of concrete and bricks. The main idea is to separate the load-bearing structure from the facade. This means there are as few places as possible where heat can be transferred from inside to outside and outside to inside, which saves money on AC and heating. The curtain wall can have any form of decoration you want, and sometimes it can even serve a purpose, in filtering sunlight or protecting privacy.

But contemporary architecture is struggling with the same problems as that of the ancients: there are so many seams everywhere, and they are problematic. Someone needs to do something. A lot of the perceived ugliness of contemporary construction is about the poor quality and all the issues that arise from unsolved problems. Modernism has become a style, just like classicism, and it has lost its original meaning. It's OK to hate it. But I don't think the article in the OP understands why.

posted by mumimor at 5:27 PM on May 17 [26 favorites]

While the WiP owner is Libertarian, I doubt Sam is a card-carrying New Urbanist (I have designed with the NU crowd); his tweet feed is just too diverse, he's not always skeptical... I nearly forgot I'm supposed to filter!

Oh fuck! He's a Tufton Street man. His Twitter bio lists his employer as @CPSThinkTank, the Centre for Policy Studies (CPS, founded 1975 by Sir Keith Joseph – the brains behind Thatcherism [wikipedia], along with Patrick Minford). That puts Stripe in the same orbit.

This means all of Works in Progress should be treated as astroturf. As a designer I find Works in Progress and its contents VERY attractive. I think it will suck a lot of people in.

CPS is very tightly aligned with the Christian fundamentalist US Council for National Policy [splcenter.org] (founded 1981). Link has a whole rogues gallery so CW applies.

Re the fall of building ornament: I suspect Samuel Hughes has deliberately omitted (as he is thorough - and pedantic) the deeper real financial reason (the article has an odd two-thread structure when I read it again). I put this on his tweet, but during my degree (and since) I've dug deeply into an argument by James Russell in a June 2003 Architectural Record article, Leading the Money. [I have a .pdf as it's extremely hard to find] Russell cites Chris Leinberger @ChrisLeinberger:

"The real difference between the prewar era and now, he contends, is that investors then expected to reap their rewards over a very long time - and did.".

Leinberger (who seems a very secular and anti-Trump person - which makes me feel better for New Urbanism, as opposed to the CPS) was doing interesting developments in Albuquerque at the time, based on treating buildings as nested tranches with different-age returns, in order to set a building up (like a pre-1930s one) where it would be worthwhile upgrading every 30+ years. And to invest more money into a higher-quality street-facing facade/frontage, and gain a longer, higher-level lease from this finer ornamentation.

posted by unearthed at 9:29 PM on May 17

The article makes a lot more sense in light of the political stuff. Hughes is anti-modernity, aesthetically and politically.
posted by vitia at 9:51 PM on May 17 [1 favorite]



The beauty of concrete - Works in Progress


One of the unifying features of architectural styles before the twentieth century is the presence of ornament. We speak of architectural elements as ornamental inasmuch as they are shaped by aesthetic considerations rather than structural or functional ones. Pilasters, column capitals, sculptural reliefs, finials, brickwork patterns, and window tracery are straightforward examples. Other elements like columns, cornices, brackets, and pinnacles often do have practical functions, but their form is so heavily determined by aesthetic considerations that it generally makes sense to count them as ornament too.

Ornament is amazingly pervasive across time and space. To the best of my knowledge, every premodern architectural culture normally applied ornament to high-status structures like temples, palaces, and public buildings. Although vernacular buildings like barns and cottages were sometimes unornamented, what is striking is how far down the prestige spectrum ornament reached: our ancestors ornamented bridges, power stations, factories, warehouses, sewage works, fortresses, and office blocks. From Chichen Itza to Bradford, from Kyiv to Lalibela, from Toronto to Tiruvannamalai, ornament was everywhere.

Since the Second World War, this has changed profoundly. For the first time in history, many high-status buildings have little or no ornament. Although a trained eye will recognize more ornamental features in modern architecture than laypeople do, as a broad generalization it is obviously true that we ornament major buildings far less than most architectural cultures did historically. This has been celebrated by some and lamented by others. But it is inarguable that it has greatly changed the face of all modern settlements. To the extent that we care about how our towns and cities look, it is of enormous importance.

The naive explanation for the decline of ornament is that the people commissioning and designing buildings stopped wanting it, influenced by modernist ideas in art and design. In the language of economists, this is a demand-side explanation: it has to do with how buyers and designers want buildings to be. The demand-side explanation comes in many variants and with many different emotional overlays. But some version of it is what most people, both pro-ornament and anti-ornament, naturally assume.

However, there is also a sophisticated explanation. The sophisticated explanation says that ornament declined because of the rising cost of labor. Ornament, it is said, is labor-intensive: it is made up of small, fiddly things that require far more bespoke attention than other architectural elements do. Until the nineteenth century, this was not a problem, because labor was cheap. But in the twentieth century, technology transformed this situation. Technology did not make us worse at, say, hand-carving stone ornament, but it made us much better at other things, including virtually all kinds of manufacturing and many kinds of services. So the opportunity cost of hand-carving ornament rose. This effect was famously described by the economist William J Baumol in the 1960s, and in economics it is known as Baumol’s cost disease.

To put this another way: since the labor of stone carvers was now far more productive if it was redirected to other activities, stone carvers could get higher wages by switching to other occupations, and could only be retained as stone carvers by raising their wages so much that stone carving became prohibitively expensive for most buyers. So although we didn’t get worse at stone carving, that wasn’t enough: we had to get better at it if it was to survive against stiffer competition from other productive activities. And so the labor-intensive ornament-rich styles faded away, to be replaced by sparser modern styles that could easily be produced with the help of modern technology. Styles suited to the age of handicrafts were superseded by the styles suited to the age of the machine. So, at least, goes the story.
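Baumol’s mechanism can be written out in a few lines. What follows is a minimal two-sector sketch in my own notation (not Baumol’s, and not anything from this article): competitive wages across the economy track productivity in the progressive sector, while the productivity of ornament-making stays flat, and each good’s price equals its unit labor cost.

\[
w_t = w_0 e^{gt}, \qquad
p^{\mathrm{ornament}}_t = \frac{w_t}{a_O} = \frac{w_0}{a_O}\, e^{gt}, \qquad
p^{\mathrm{goods}}_t = \frac{w_t}{a_M e^{gt}} = \frac{w_0}{a_M}
\]

Here \(g\) is productivity growth in manufacturing, and \(a_O\), \(a_M\) are the two sectors’ base productivities. The price of manufactured goods stays flat, while the relative price of ornament, \((a_M/a_O)\,e^{gt}\), grows without bound – even though nobody got any worse at carving.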

This is what economists might call a supply-side explanation: it says that desire for ornament may have remained constant, but that output fell anyway because it became costlier to supply. One of the attractive features of the supply-side explanation is that it makes the stylistic transformation of the twentieth century seem much less mysterious. We do not have to claim that – somehow, astonishingly – a young Swiss trained as a clockmaker and a small group of radical German artists managed to convince every government and every corporation on Earth to adopt a radically novel and often unpopular architectural style through sheer force of ideas. In fact, the theory goes, cultural change was downstream of fairly obvious technical and economic forces. Something more or less like modern architecture was the inevitable result of the development of modern technology.

I like the supply-side theory, and I think it is elegant and clever. But my argument here will be that it is largely wrong. It is just not true that twentieth-century technology made ornament more expensive: in fact, new methods of production made many kinds of ornament much cheaper than they had ever been. Absent changes in demand, technology would have changed the dominant methods and materials for producing ornament, and it would have had some effect on ornament’s design. But it would not have resulted in an overall decline. In fact, it would almost certainly have continued the nineteenth-century tendency toward the democratization of ornament, as it became affordable to a progressively wider market. Like furniture, clothes, pictures, shoes, holidays, carpets, and exotic fruit, ornament would have become abundantly available to ordinary people for the first time in history.

In other words, something like the naive demand-side theory has been true all along: to exaggerate a little, it really did happen that every government and every corporation on Earth was persuaded by the wild architectural theory of a Swiss clockmaker and a clique of German socialists, so that they started wanting something different from what they had wanted in all previous ages. It may well be said that this is mysterious. But the mystery is real, and if we want to understand reality, it is what we must face.

Manufacturing ornament before modernity

The supply-side theory has two parts: a story about how ornament was handcrafted before modernity, and a story about how this wasn’t compatible with rising labor costs. Strikingly, a part of the first story is untrue: far from relying on bespoke artisanal work, many premodern builders used certain kinds of mass production whenever they could. But overall, the supply-side story is still an accurate description of this period: although premodern builders used labor-saving methods where possible, their opportunities for doing so were limited by low populations, low incomes, and poor transport technology, and until modern times, making ornament really was pretty labor-intensive.

There are two main methods of making ornament: carving and casting.

Carving involves removing material until only the desired form remains; casting involves shaping a material into the desired form while it is soft and then hardening it. Not all architectural ornament is produced in these ways (for example, wrought ironwork and ornamental brickwork are not), but a surprisingly high proportion is, so I shall focus on these two methods here.

First, carving. From the Renaissance to the nineteenth century, the creation of carved ornament went through several stages in a method called indirect carving. First, a design for the ornament was hand drawn by an architect and modeled in clay by a specialist craftsman called an architectural modeler. Because clay models fall apart when they dry out, it might then be cast in plaster for durability. The design would then be laboriously transferred to a block of stone or wood using something called a pointing machine, a framework of needles calibrated to points on the model so that they show exactly how much of the stone or wood has to be drilled and chiseled away to replicate its form (search YouTube for ‘pointing machine’ to find many videos of these). This carving work was done by hand by a second group of skilled craftsmen. The actual designers would probably never touch either the model or the final product.

Even figure sculpture was produced using a version of this method: the sculptor would model the statue in clay, then craftsmen would transfer the design to stone, often via an intermediate plaster cast. The indirect carving of sculpture dates back to antiquity, and many of the most famous antique statues are Roman copies of Greek originals, including the Apollo Belvedere and the Venus de Medici. Indirect carving faded away in the Middle Ages but was revived in the Renaissance and improved steadily in the following centuries. Initially, indirect carving was used to get the figures roughly right, after which the sculptor would take over to execute the details. But by the later eighteenth century, pointing machines were so good that many sculptors did little work on the actual statue: sculpting was basically an art of modeling in clay, and carving was a sophisticated but largely mechanical process. Canova, Thorvaldsen, and Rodin all worked this way. The stone sculptures that adorn the centers of old European and American cities are mostly stone copies of plaster copies of long-lost clay originals.

Indirect carving enables a limited sort of mass production. It makes it possible to get far more out of one scarce factor of production, namely talented designers. This has some value with figure sculpture: there seem to have been carving factories in the Roman Empire mass-producing copies of the most admired statues. But it really comes into its own with other architectural ornament. The Palace of Westminster is covered with tens of thousands of square meters of extraordinarily ingenious and coherent ornament. This is not because Victorian London was awash with carver-sculptors of genius. It is because virtually every detail of the enormous building, down to the last molding profile, was designed by one man, the strange and brilliant Augustus Pugin. Pugin carved nothing, but he produced an immense flood of drawings, which were executed in stone and wood by numberless other hands. Indirect carving made Pugin many thousands of times more productive than he could have been otherwise.

The prevalence of indirect carving shows that premodern builders were keen to rationalize the production process where possible. But the sketch above also shows how labor-intensive carving remained. Premodern machinery had allowed a tiny number of elite architects to design a relatively huge amount of ornament. But the rest of the carving process was largely manual and bespoke as late as the nineteenth century, using much the same tools as the ancient Greeks, and requiring a huge workforce. Perhaps surprisingly, technology revolutionized the productivity of the creative artist long before it revolutionized any other part of the production chain.

Cast ornament shows the same pattern, with some limited mechanization accompanying persistent labor-intensiveness. Cast ornament is made of materials that are originally soft, or that can be made so temporarily through heating or mixing with water. Up to the nineteenth century, the principal materials for cast ornament were clay and plaster, while bronze was the preferred material for cast sculpture. The process of making cast ornament would begin in the same way as that of carved ornament, with drawings and often models. Molds would then be carved in wood or cast from the models in metal, plaster, or gelatine. The mold would then be used to shape the material. There are various ways of doing this, depending on the casting material and the complexity of the ornament.

Some kinds of mold are destroyed in the casting process, but most are reusable many times. And while some casting materials (e.g., bronze) are expensive, others (e.g., clay and plaster) are cheap once the infrastructure for producing them is in place. So once the initial investment in kilns and molds is made, large quantities of cast ornament can be produced at low marginal cost. This means that mass production of ornament has been theoretically possible since very early times.

Despite this, factory production of ornament did not become general practice until the nineteenth century. The reason for this is presumably that markets were so small that these economies of scale could not be realized. Today, much of the best cast ornament in Britain comes from a factory near Northampton run by a company called Haddonstone, whose products I return to below. Haddonstone has customers dispersed fairly evenly across Britain, and it also exports to Ireland, Continental Europe, the Middle East, and the United States. In a premodern economy, with fantastically high transport costs, its market would have been far smaller, perhaps indeed just the town of Northampton – and because premodern societies were extremely poor, Northampton would have been an even smaller market than it is now.  Instead of a potential market of millions of new buildings annually, its potential market could easily have been in single digits. It is highly improbable that the fixed costs of factory production would be worthwhile under these conditions.
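The scale argument here is ordinary average-cost arithmetic. As an illustration with invented numbers (mine, not Haddonstone’s): if a casting workshop has fixed costs \(F\) for kilns and molds and a marginal cost \(c\) per casting, then spreading \(F\) over \(N\) castings gives

\[
\mathrm{AC}(N) = \frac{F}{N} + c
\]

With, say, \(F = 10{,}000\), \(c = 5\), and a premodern town buying \(N = 4\) castings a year, each one effectively costs \(2{,}505\) – hopeless against a hand carver charging, say, \(50\) per piece. At \(N = 10{,}000\) castings, the average cost falls to \(6\). The technique was always scalable; what was missing was an \(N\) large enough to pay off the fixed costs.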

The upshot of this is that premodern cast ornament was seldom able to exploit its natural scalability. The cheap cast materials probably always tended to be cheaper than stone carving, but this advantage was not marked, and many premodern societies used carved stone for a wide range of public buildings. In many times and places, wood ornament, which is much easier to carve than stone, was used in common buildings. This suggests it was competitive against plaster and terracotta even at the most budget end of the premodern market for ornament.

In its essentials, the supply-side story is thus true of premodern ornament, even though the romantic idea that every piece of premodern ornament is an original work of art is largely inaccurate. Nearly all premodern ornament was mechanically copied in some way, and some premodern manufacturing methods could in theory have been scaled up to mass production. The claim that modern mechanically produced ornament is distinctively inauthentic or uncreative is highly dubious: mechanical copying has been widespread for many centuries. But premodern copying industries were themselves small-scale and labor intensive, and it is plausible that ornament was only widely used in these societies because labor was so affordable.

Manufacturing ornament in modernity

The supply-side story says that these labor-intensive industries failed to evolve in modernity, and so lost out to competition from industries that did. But the first claim here just isn’t true: in fact, the manufacture of ornament was revolutionized in the nineteenth and twentieth centuries. Three changes are worth drawing out.

First, inventive toolmakers mechanized the carving process. This is only a qualified truth in the case of stone carving. By the early twentieth century, sophisticated planing machines were capable of cutting simple moldings, column shafts, and so forth with little or no manual finishing work. However, more complex ornaments continued to be carved by hand. A planing machine works by gradually sanding down a block, wearing off material through abrasion until the desired profile is left. This means it is good for producing ornaments that consist essentially of a single profile extended in one dimension. But it cannot easily produce ornaments with undercutting (i.e., drooping projections), and it certainly cannot produce complex multidimensional ornaments like Corinthian capitals or Gothic pinnacles.

In fact, stonework is only finally being mechanized today. I recently visited what is probably the world’s most advanced factory for cutting stonework with a computer-controlled machine, Monumental Labs in New York City. Monumental Labs has constructed a robot that scans a model and then carves it from blocks of stone. The robot works about two to four times faster than a stone carver, and of course it works nonstop rather than for a single shift, meaning that its overall productivity is 6-12 times greater (two to four times the speed across roughly three times the working hours). It is capable of executing about 95 percent of the carving process, even for figure sculpture, where exact precision is particularly important. Unsurprisingly, Monumental Labs is quickly capturing market share from rivals who still do much of the work with pointing machines and hand carving. Over the next few years, they may succeed in finally mechanizing the process of stone carving. But this is only happening in the 2020s, after natural stone carving has undergone a long decline. So with respect to stonework, the supply-side story may have some validity.

In the case of woodwork, however, mechanization was extraordinarily successful. Two key innovations were steam-powered milling machines and lathes in the nineteenth century. A milling machine has spinning cutters shaped like the negative of the desired profile of the molding. When a beam of wood is passed through it, the cutters remove exactly the correct volume of wood, and an essentially finished ornament emerges on the other side, with many hours of manual carving work completed in seconds. A lathe works on a modification of the same principle: the piece of wood is spun, and the blade is held steady. It is used for things like balusters and columns. Lathes, unlike milling machines, had existed before the Industrial Revolution, but steam made them much more powerful.

In Europe, the effect of these advances was obscured by fire safety laws that tended to ban woodwork on the exterior of urban buildings. But such laws were generally absent in the United States, where there was thus an enormous proliferation of ornamental woodwork in the late nineteenth century, a process bound up with the popularity of what Americans call the ‘Queen Anne’ and Eastlake styles. The ban on exterior woodwork was also lifted in England in the 1890s, resulting in a revival of woodwork decoration that is so characteristic of Edwardian houses, and that makes many Edwardian neighborhoods so much more cheerful than their Victorian predecessors. Although these machines could not generate every kind of woodwork (unlike the astonishing computer-controlled machines, known as CNC machines, that have been developed since), their range was much wider than that of the corresponding machines for stone carving.

The second change revolutionizing ornament manufacture was that scientific advances improved the available materials. Improvements in metallurgy dramatically reduced the cost of cast iron in the early nineteenth century, and its use spread rapidly thereafter. New York City even went through a brief phase of making commercial buildings entirely from iron, many of which survive in SoHo. This proved to have practical problems like overheating, but adding cast iron ornament to masonry buildings became common in many places. Some cities, like Sydney and Melbourne, became especially known for their traditions of cast ironwork.

Another important material is cast stone. Cast stone is a kind of concrete, made by crushing stone, mixing the fragments (called aggregate) with a smaller quantity of cement as a binder, and then casting it in a mold. The crushed stone gives it an appearance resembling natural stone, an effect that is often augmented by mechanically tooling or etching the surface. Good cast stone is remarkably plausible: essentially no layperson would notice that it is not ‘real’, and even a specialist may struggle to tell if it is hoisted 80 feet up a facade. Simple molds are usually machine-carved in wood, and complex three-dimensional ones are themselves cast in gelatine or, today, silicone.

Although there were earlier concretes that bore some resemblance to stone, plausible cast stone seems to have emerged only in the last quarter of the nineteenth century. It became widely used in the United States in the early twentieth century, and many key public buildings in American cities made use of it. Because simple shapes had become easy to carve in stone mechanically, architects sometimes faced the bulk of the facade with natural stone and used cast stone only for the ornament.

While researching this article, I visited the factory of the cast stone manufacturer Haddonstone in Northampton. With the help of the classical architect Hugh Petter, Haddonstone has recently constructed molds based on the designs of the eighteenth-century architect James Gibbs. The molds are filled on a conveyor belt, left to dry overnight, and then opened up in minutes. So it is now possible to buy perfectly proportioned classical ornament, nearly indistinguishable from stone, that has – if the molds and the factory infrastructure are treated as a given – taken only minutes of labor to produce. This sort of capacity is only gradually reemerging, stimulated by the revival of classical architecture, but it was once widespread. Haddonstone is currently manufacturing cast stone ornament for Nansledan, the vernacular-style urban extension to Newquay supported by the King.

The third change was the enormous expansion of the available markets, and the economies of scale that this generated. In the nineteenth century the volume of construction increased tremendously, and transport networks were vastly improved.

It is well known that railways cut travel times a great deal, perhaps by four fifths relative to stagecoaches by the late nineteenth century. But even this understates the improvement in transport, because stagecoach speeds had themselves improved dramatically during the turnpike (toll road) building boom of the previous century, as had freight carriage via the canals.

A stagecoach fare between London and Brighton, 47 miles as the crow flies, varied between 276 and 144 pence in the early nineteenth century, equating to a per-mile-traveled cost of perhaps 3 to 6 pence. By the 1880s, first-class rail travel in Britain cost an average of about 0.15 pence per mile traveled. This suggests a fall in the per-mile cost of overland travel in the order of 95 percent, for a service that was also roughly five times as fast; freight costs fell on a similar scale. This meant that the markets available to manufacturers located anywhere with railway access grew far larger, favoring materials like stucco and terracotta whose per-unit costs dropped sharply when they were produced at scale. In the 1930s, just as ornament was starting to decline, transport costs were slashed again, this time by the development of modern trucking.
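To make the arithmetic explicit (a rough check, treating the 47-mile crow-flies figure as the journey length; the actual road distance was somewhat longer, which would lower the per-mile figures slightly):

\[
\frac{144\ \text{pence}}{47\ \text{miles}} \approx 3.1\ \text{p/mile}, \qquad \frac{276\ \text{pence}}{47\ \text{miles}} \approx 5.9\ \text{p/mile}, \qquad 1 - \frac{0.15\ \text{p/mile}}{3.1\ \text{p/mile}} \approx 0.95.
\]

Even against the cheapest coach fare, rail at 0.15 pence per mile represents a fall of roughly 95 percent.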

Manufacturers naturally took full advantage of this, developing an extensive system of factory production for these materials. For example, the market for architectural terracotta in the United States came to be dominated by just a few huge firms, each of which apparently commanded a near monopoly over thousands of miles. Almost the entire Pacific market was served by a firm called Gladding McBean, whose factory was in Lincoln, California; the Midwest was dominated by a Chicago firm confusingly called the Northwestern Terra Cotta Company; and the East Coast was dominated by a New Jersey firm called, more intuitively, the Atlantic Terra Cotta Company. This state of affairs would have been unthinkable just decades earlier, when freight could be carried overland only by carts and pack animals.

A less important but still significant factor was the emergence of extremely large individual buildings. Most early twentieth-century skyscrapers actually had a complete set of ornament modeled for them bespoke, but the buildings were so enormous that substantial economies of scale were still achieved. This is one reason why terracotta was such a popular material for skyscrapers in interwar America, and its ornamental use has become a striking part of American Art Deco’s visual identity.

The democratization of ornament

On the one hand, we have the increasing cost of labor; on the other, we have the fact that less labor was necessary per unit of ornament. Which effect was stronger? For the period from the start of the Industrial Revolution to the First World War, the answer should be obvious to anyone walking the streets of an old European city. The vernacular architecture of the seventeenth or eighteenth century tends to be simple, with complex ornament restricted to the homes of the rich and to public buildings. In the nineteenth-century districts, ornament proliferates: even the tenement blocks of the poor have richly decorated stucco facades.

The evidence of what was actually built is in fact overwhelming that the net effect between, say, 1830 and 1914 was mainly one of greater affordability. To be sure, the ornament of the middle and working classes was of stucco, terracotta, or wood, not stone, and it was cast or milled in stock patterns, not bespoke. These features occasioned much censoriousness and snobbery at the time. But we might also see them as bearing witness to the democratizing power of technology, which brought within reach of the people of Europe forms of beauty that had previously belonged only to those who ruled over them.

What about the period since 1914? Did the economic tide turn against the affordability of ornament? The evidence here is more complex. Over the course of the 1920s and 1930s ornament gradually vanished from the exteriors of many kinds of architecture, though at different rates in different countries and for different types of building. In the decades since, it has seen only limited and evanescent revivals. But we still have good evidence that this change was not really driven by growing unaffordability.

The reason is that there are some relatively low-budget corners of the market where ornament has remained fairly common. Virtually any like-for-like comparison of an elite building from 1900 and today will show a huge reduction in ornament. Indefinitely many comparisons are possible, but there is one on the previous page, between a British Government office from the Edwardian period and one from the early 2000s.

We could run the same sort of comparison for any two banks, corporate headquarters, parliaments, concert halls, universities, schools, art galleries, or architect-designed houses, and with occasional exceptions we would find the same pattern. But if we try to run it for mass-market housing, we get a more uncertain result. On the previous page are promotional images for mass-market British houses in the 1930s and today. What is striking is how similar they are. Both have carved brackets, molded bargeboards, faux leaded windows, paneled wooden doors, patterned hung tiles, and decorative brickwork. The modern houses have UPVC windows rather than wooden ones, and they are more likely to have garages. Otherwise, they haven’t really changed. The interiors of the modern homes mostly lack the molded cornices of the 1930s ones, but many of them still have molded skirting boards, fielded door panels, and molded door surrounds.

Browsing the website of any major British housebuilder will confirm that, although the quantity of ornament in mass-market housing probably has declined somewhat since the early 1900s, it has declined much less than that of any other build type. This pattern is even more visible in the United States. But this is exactly the opposite of what the supply-side theory would predict.

The supply-side theory says that ornament declined because it became prohibitively expensive, which suggests that it would vanish from budget housing first and gradually fade from elite building types later. In fact, budget housing is almost the only place we find it clinging on. 

The obvious explanation is that ornament survives in mass-market housing because the people buying new-build homes at this price point are less likely to be influenced by elite fashions than are the committees that commission government buildings or corporate headquarters. The explanation, in other words, is a matter of what people demand, not of what the industry is capable of supplying: ornament survives in the housing of the less affluent because they still want it.

An interesting special case here is the McMansion, the one really profusely ornamented type of housing that still gets built fairly often in some countries. McMansions are built for people who have achieved some level of affluence, but who stubbornly retain a non-elite love of ornamentation. They inspire passionate contempt in many sophisticated critics, to whom they afford a rare opportunity to flex cultural power without looking as though one is being nasty to poor people. McMansions illustrate how easily wealthy people and institutions could ornament their buildings if they wanted to. But, perhaps with that passionate contempt in mind, most of them no longer do.

According to the supply-side theory, the story of ornament in modernity is one of ancient crafts gradually dying out as they became economically obsolete. I have told a different story. In the nineteenth and early twentieth centuries, the production of ornament was revolutionized by technological innovation, and the quantity of labor required to produce ornament declined precipitously. Ornament became much more affordable and its use spread across society. An immense and sophisticated industry developed to manufacture, distribute, and install ornament. The great new cities of the nineteenth century were adorned with it. More ornament was produced than ever before.

We can imagine an alternative history in which demand for ornament remained constant across the twentieth century. Ornament would not have remained unchanged in these conditions. Natural stone would probably have continued to decline, though a revival might now be underway as robotic carving improves. Initially, natural stone would have been replaced by wood, glass, plaster, terracotta, and cast stone. As the century drew on, new materials like fiberglass and precast concrete might also have become important. Stock patterns would be ubiquitous for speculative housing and generic office buildings, but a good deal of bespoke work would still be done for high-end and public buildings. New suburban housing might not look all that different from how it looks today, but city centers would be unrecognizably altered: fantastically decorative places in which the ancient will to ornament was allied to unprecedented technical power.

This was not how it turned out. In the first half of the twentieth century, Western artistic culture was transformed by a complex family of movements that we call modernism, a trend that extends far beyond architecture into the literature of Joyce and Pound, the painting of Picasso and Matisse, and the music of Schoenberg and Stravinsky. Between the 1920s and the 1950s, modernist approaches to architecture were adopted for virtually all public buildings and many private ones. Most architectural modernists mistrusted ornament and largely excluded it from their designs. The immense and sophisticated industries that had served the architectural aspirations of the nineteenth century withered in full flower. The fascinating and mysterious story of how this happened cannot be told here. But it is a story of cultural choice, not of technological destiny. It was within our collective power to choose differently. It still is.


New Answers for Mars' Methane Mystery - Universe Today


Planetary scientists perk up whenever methane is mentioned. Methane is produced by living things on Earth, so it’s considered to be a potential biosignature elsewhere. In recent years, MSL Curiosity detected methane coming from the surface of Gale Crater on Mars. So far, nobody’s successfully explained where it’s coming from.

NASA scientists have some new ideas.

Ever since Curiosity landed on Mars in 2012, it’s been sensing methane. But the methane displays some odd characteristics: it only emerges at night, it fluctuates with the seasons, and sometimes it spikes to 40 times the regular level.

The ESA’s ExoMars Trace Gas Orbiter entered a science orbit around Mars in 2018, and scientists fully expected it to detect methane in the planet’s atmosphere. It didn’t, and methane has never been detected anywhere else on Mars’ surface.

If life is producing the methane, it appears to be restricted to the subsurface beneath Gale Crater.

There’s no convincing evidence that life exists on Mars. It may have existed in the past, and it’s possible that some extant life clings to a tenuous existence in subsurface brines. But we lack evidence, so life is basically ruled out as the methane source, especially since it would have to exist under Gale Crater and nowhere else.

Scientists have been trying to determine the source of methane, but so far, they haven’t come up with a specific answer. It has something to do with subsurface geological processes involving water, most likely.

“It’s a story with a lot of plot twists,” said Ashwin Vasavada, Curiosity’s project scientist at NASA’s Jet Propulsion Laboratory in Southern California, which leads Curiosity’s mission.

Alexander Pavlov is a planetary scientist at NASA’s Goddard Space Flight Center who leads a group of NASA scientists studying the Martian Methane Mystery. In recent research, they suggested that the methane is stored underground. They didn’t explain what produced it, but they showed that methane can be sealed underground by salt solidified in the Martian regolith.

They suggested that the methane could be released from its subsurface reservoir by the weight of the Curiosity rover itself: the rover could break the salt seal and release methane in puffs. That would fit with methane appearing only at Gale Crater, since Gale is one of only two regions where a rover is working, and the other rover, Perseverance in Jezero Crater, carries no methane detector. (Neither will the ESA’s Rosalind Franklin rover, which is scheduled to land on Mars in 2029.) But the rover’s weight alone doesn’t explain the seasonal and diurnal fluctuations.

The research group addressed those fluctuations by suggesting that seasonal and daily heating could also break the seal and release methane.

Their potential explanations stem from research Pavlov conducted in 2017. He grew bacteria called halophiles, which grow in salty conditions, in simulated Martian permafrost. The simulated soil was infused with salt, replicating conditions on much of Mars. The microbe growth was inconclusive, but the researchers noticed something else. As the salty ice sublimated, a layer of solidified salt remained, forming a crust.

“We didn’t think much of it at the moment,” Pavlov said.

But he remembered it when MSL Curiosity detected an unexplained burst of methane on Mars in 2019.

“That’s when it clicked in my mind,” Pavlov said. Then, he and a team of researchers began testing conditions that could form the hardened salt seals and then break them open.

Perchlorate is a chemical salt that’s widespread on Mars. Pavlov and his fellow researchers recreated different simulated Martian permafrosts with varying amounts of perchlorate. Inside a Mars simulation chamber, they subjected the samples to different temperatures and atmospheric pressures to see if they would form seals.

In their experiments, they used neon as a methane analog and injected it under the soil. Then, they measured the gas pressure below and above the soil. They found that the pressure was higher under the soil, meaning the gas was being trapped by the salty permafrost. Furthermore, they found that seals formed in samples containing as little as 5% or 10% perchlorate, and they formed within 3 to 13 days. Those are compelling results.

While 5-10% perchlorate doesn’t sound like much, it’s actually a higher concentration than in Gale Crater, where the methane has been detected. But perchlorate isn’t the only salt in Martian regolith. It also contains sulphates, another type of salt mineral. Pavlov says he and his team will test sulphates next for their ability to form a seal.

The Martian Methane Mystery is commanding a lot of attention. It’s a juicy mystery, and once it’s solved, our understanding of methane as a biosignature or false positive will be much improved. NASA’s 2022 Planetary Mission Senior Review recommended that the issue of methane production and destruction at Mars be investigated further.

The type of work that Pavlov and his colleagues are doing is important, but it’s being held back. Pavlov says that they need more consistent methane measurements. The problem is that Curiosity’s SAM (Sample Analysis at Mars) instrument, which senses the methane, is busy with other tasks. It only checks for methane a few times per year. It’s mostly occupied with drilling samples and testing them, a critical and time-consuming part of the rover’s mission.

“Methane experiments are resource intensive, so we have to be very strategic when we decide to do them,” said Goddard’s Charles Malespin, SAM’s principal investigator.

Curiosity’s mission wasn’t designed to measure methane fluctuations. In 2017, NASA said the SAM instrument had sampled the atmosphere only 10 times in 20 months. That sparse, irregular record leaves lots of unanswered questions.

Scientists think another mission is needed to advance their understanding of Martian methane. Rather than one sensor taking irregular methane readings from one location, we need multiple testing stations on the surface that regularly monitor the atmosphere. Nothing like it is in the works.

“Some of the methane work will have to be left to future surface spacecraft that are more focused on answering these specific questions,” Vasavada said.


Hubble Telescope Faces Threat From SpaceX and Other Companies’ Satellites - The New York Times

sarcozona comments: “SpaceX is not a net benefit.”

Honesty About Covid is Essential for Progress - John Snow Project


One of the criticisms often leveled at members of the Covid-cautious community is that they believe ‘everything is Covid.’ Critics say there is an element of alarmism or neurosis in the concerns this community has about COVID-19 because no pathogen could cause all the harms being laid at its door.

Unfortunately, the newest widely circulating pathogen in the human population uses a broadly expressed ACE2 receptor to infect cells [1], meaning it can damage almost any part of the body [2]. Prior to the COVID-19 pandemic, few people believed coronaviruses could linger in the body, but members of the John Snow Project outlined their concerns in 2021 because there was extensive evidence going back decades to suggest coronaviruses could persist [3,4]. These concerns have since been shown to be justified, with numerous studies now demonstrating prolonged viral persistence and immune activation [5-9].

The combination of a widely expressed receptor and persistent infection means the acute and long-term effects of SARS-CoV-2, the virus which causes COVID-19, can be unpredictable [10].

SARS-CoV-2 has also been shown to harm the immune system in various ways [11-14], many of which are common to other pathogens. This harm seems to have increased susceptibility to other pathogens such as dengue [15] and strep A [16,17].

We’ve previously written about government efforts to return to pre-2020 norms and how official messaging that we must all assess our own level of risk has been interpreted by most people to mean that it is safe to engage with the world in the same way one would have done in 2019 and that there will be no additional risk in doing so [18].

Most people have resumed pre-pandemic behaviours, but there has been an increase in general ill-health, which can be demonstrated in rising levels of long-term illness [19], disability [20,21], GP appointments [22], chronic absence among school pupils [23-27], rising absence among teachers [28], and worker shortages in a wide range of industries [29]. Many commentators theorize about the reasons for these phenomena, blaming a mysterious malaise among workers, indulgent or irresponsible parents, or post-lockdown laziness.

Aiding this apparent mystery is the rather bizarre way in which official figures are reported. A prominent Covid-cautious commentator pointed this out in a thread on X in relation to the UK Office for National Statistics figures on Long Covid [30]. The ONS analysis states, “The majority of people self-reporting long COVID experienced symptoms over two years previously,” but the way the data is presented skews the risk towards historic Covid-19 cases by using uneven time intervals, a practice which is in breach of UK government policy on how to present time series data [31]. However, when the data is presented as correctly as the raw data allows, the risk of developing Long Covid from a COVID-19 infection seems to remain relatively constant.
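As a minimal sketch of the binning problem (using invented numbers, not the ONS data): if the underlying risk were constant at, say, 100 new cases per month, raw counts binned into uneven intervals would make older infections look dominant, while normalizing each bin to cases per month would restore the flat picture.

# Hypothetical illustration only: a constant 100 new Long Covid cases per
# month, binned into uneven reporting intervals (not the actual ONS bins).
cases_per_month = 100
intervals = [("more than 2 years ago", 24),
             ("1 to 2 years ago", 12),
             ("6 to 12 months ago", 6),
             ("under 6 months ago", 6)]

for label, months in intervals:
    raw = cases_per_month * months  # what a raw-count chart displays
    rate = raw / months             # cases per month: flat, by construction
    print(f"{label}: raw count {raw}, per month {rate:.0f}")

On raw counts, the widest bin dwarfs the others even though nothing about the risk has changed; only the per-month figures reflect the constant underlying rate, which is the pattern the commentator found when re-plotting the data.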

Another criticism leveled at the Covid-cautious community is that members are overstating the risk of Long Covid. But high quality studies from all over the world point to the very real and significant risk of Long Covid [32-34], and there is now evidence to suggest the risk rises with each subsequent infection [35].

If anything, Long Covid prevalence is likely to be understated because of the dearth of public health information from official sources. There are still some people who are surprised they can be reinfected by SARS-CoV-2. There are others who know about the risk of reinfection but who falsely believe each subsequent infection will be milder. There are yet more who do not know each infection can carry a risk of long-term illness.

When we get into specifics, how many people know COVID-19 infection can cause headaches and migraines weeks or months later [36,37]? Or that it can cause fainting [38,39]? Nausea [40]? Heart attacks [41,42]? Cardiac complications in adults and children [43,44]? Embolisms [45]? ADHD-like symptoms [46,47]? Neurological issues [48,49]? How many people are suffering the long-term sequelae of COVID-19 infection but not drawing the causal link, instead ascribing their new conditions to bad luck or aging?

We’ve previously written about governments creating the space for antivaxx messaging to thrive by not correctly reporting the risks of COVID-19 infection [29], but there are greater threats. Every time a Covid-minimizer says, “There’s nothing to worry about, look at everybody else out there living their lives, just resume your old ways,” they undermine faith in public health measures, because their reassurance is based not on scientific evidence but on instinct, hope and, possibly, a vested interest in the status quo, having staked their professional credibility on the idea that infections are protective. Science and public health progress when we follow the evidence, not when we hold hunches and opinions in higher esteem than evidence.

The huge rise in dengue [50], coupled with the evidence that dengue virus uses SARS-CoV-2 antibodies to enhance infection [15] and the correlation of COVID-19 cases with dengue cases [51], suggests there is an interplay between the pathogens that hasn’t been fully understood. Whooping cough is surging in the UK, with cases up 3,800% on previous years [52-55], and adults who have been previously vaccinated or infected are now falling seriously ill. Similar surges have been seen in other countries, and while those who like to blame anything-but-Covid point the finger at lockdowns, which ended more than three years ago in most countries, sensible people would like to understand the interplay between COVID-19 infection and susceptibility to other pathogens.

It would only make sense to pursue ignorance if there were nothing that could be done about COVID-19, but we know that clean air policies can reduce the risk of all infections [56], be they bacterial, viral or fungal. The “just get on with it” messaging of those who want people to forget about COVID-19 is a celebration of the sort of ignorance that has slowed and stalled human progress throughout history.

If there is to be a business case for investment in engineering and architecture that improve human health, we need to properly understand the harms caused by COVID-19. Sweeping it under the rug, shouting down those with legitimate concerns, pretending the virus doesn’t exist, and massaging data to make things appear safe all work against this understanding.

It seems those with means have already decided their health will benefit from clean air [57], and advanced ventilation and filtration systems are the latest must-have addition to high-end properties [58], which suggests there is also an issue of equity involved in understanding COVID-19. The advancement of human knowledge has always empowered the general population, which is why it has often been resisted by those in power. Keep that in mind the next time someone says, “Stop worrying. Just get on with it.” They want your ignorance, and they incur no cost if you are harmed by being repeatedly infected with COVID-19 or any other pathogen surging in its wake.

For information on how you can protect yourself from COVID-19 infection, please click here.
