
Critical Atlantic current significantly more likely to collapse than thought | Oceans | The Guardian

1 Share

The critical Atlantic current system appears significantly more likely to collapse than previously thought after new research found that climate models predicting the biggest slowdown are the most realistic. Scientists called the new finding “very concerning” as a collapse would have catastrophic consequences for Europe, Africa and the Americas.

The Atlantic meridional overturning circulation (Amoc) is a major part of the global climate system and was already known to be at its weakest for 1,600 years as a result of the climate crisis. Scientists spotted warning signs of a tipping point in 2021 and know that the Amoc has collapsed in the Earth’s past.

Climate scientists use dozens of different computer models to assess the future climate. However, for the complex Amoc system, these produce widely varying results, ranging from some that indicate no further slowdown by 2100 to those suggesting a huge deceleration of about 65%, even when carbon emissions from fossil fuel burning are gradually cut to net zero.

The research combined real-world ocean observations with the models to determine which were most reliable, and this hugely reduced the spread of uncertainty. The researchers found an estimated slowdown of 42% to 58% by 2100, a level almost certain to end in collapse.

The Amoc brings sun-warmed tropical water to Europe and the Arctic, where it cools and sinks to form a deep return current. A collapse would shift the tropical rainfall belt on which many millions of people rely to grow their food, plunge western Europe into extreme cold winters and summer droughts, and add 50-100cm to already rising sea levels around the Atlantic.

Dr Valentin Portmann, at the Inria Centre de recherche Bordeaux Sud-Ouest in France and who led the new research, said: “We found that the Amoc is going to decline more than expected compared to the average of all climate models. This means we have an Amoc that is closer to a tipping point.”

Prof Stefan Rahmstorf, at the Potsdam Institute for Climate Impact Research in Germany, said: “This is an important and very concerning result. It shows that the ‘pessimistic’ models, which show a strong weakening of the Amoc by 2100, are, unfortunately, the realistic ones, in that they agree better with observational data.”

He added: “I now am increasingly worried that we may well pass that Amoc shutdown tipping point, where it becomes inevitable, in the middle of this century, which is quite close.”

Rahmstorf, who has studied the Amoc for 35 years, has said a collapse must be avoided “at all costs”. “I argued this when we thought the chance of an Amoc shutdown was maybe 5%, and even then we were saying that risk is too high, given the massive impacts. Now it looks like it’s more than 50%. The most dramatic and drastic climate changes we see in the last 100,000 years of Earth history have been when the Amoc switched to a different state.”

The Amoc is slowing because global heating is driving rapid rises in Arctic air temperatures, which means the ocean there cools more slowly. Warmer water is less dense and therefore sinks into the depths more slowly. The slowdown also allows more rainfall to accumulate in the salty surface waters, freshening them and making them less dense still, further slowing the sinking in a self-reinforcing feedback loop.

The Amoc system is highly complex and subject to random natural variations, making precise predictions impossible. However, a major weakening is now expected by scientists and that alone could have serious impacts in the decades to come.

The new research, published in the journal Science Advances, explored four different ways of using real-world observations to assess the models. They found a method called ridge regression, which had been little used in climate science before now, provided the best results.
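The paper's actual weighting scheme is more involved than this, but the core estimator it builds on is standard. Ridge regression adds an L2 penalty to least squares, which stabilizes the fit when predictors (here, think of outputs from many correlated climate models) are numerous and collinear. A minimal sketch on synthetic data (all numbers invented for illustration):

```python
import numpy as np

def ridge(X, y, lam):
    # Closed-form ridge estimate: (X'X + lam * I)^(-1) X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))          # 50 "observations", 10 "model outputs"
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.0, 0.5]       # only 3 predictors actually matter
y = X @ beta_true + rng.normal(scale=0.1, size=50)

beta_hat = ridge(X, y, lam=1.0)        # recovers beta_true, slightly shrunk
```

The penalty `lam` trades a little bias for much lower variance, which is exactly what one wants when deciding how much weight each of many noisy, correlated models deserves.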

The Amoc is difficult to model because it is governed by subtle differences in water density caused by salinity changes over the entire Atlantic. The reduction in uncertainty in the new analysis results from identifying the models that better reflect surface salinity in the south Atlantic, which scientists already knew was important. This makes the work “very credible”, said Rahmstorf.

Rahmstorf said Amoc slowdown in 2100 may be even greater than in the new, pessimistic assessment. This is because the computer models do not include the meltwater from the Greenland ice cap that is also freshening the ocean waters: “That is one additional factor that means the reality is probably still worse.”

Read the whole story
sarcozona
11 hours ago
reply
Epiphyte City
Share this story
Delete

Fascinating 1981 interview with Morris Kline, author of the classic book, Mathematics: The Loss of Certainty


From this 1981 interview:

So when did the loss of certainty begin? Where did we take a wrong turn?

It began around 1800, and it began with geometry. I usually like to quote Mark Twain about this. He said that man is the only animal that has the one true religion—several of them. And that is just what happened with geometry.

The geometry that came from the Greeks is usually called Euclidean geometry, after Euclid. But suddenly at the beginning of the 19th century other geometries were developed—non-Euclidean geometries. Who gets the credit for this is sometimes disputed among historians, but I would say Carl Friedrich Gauss. He was the man who said flatly that we can no longer be sure that Euclidean geometry describes the physical world correctly. The various geometries conflict, although one of them, according to thousands of years of tradition, should describe the truth. You can see the problem.

Can you give me an example of an alternative geometry?

Well, one can cite as an example the theorem of Euclidean geometry that the sum of the angles of a triangle is one hundred eighty degrees. In one of the non-Euclidean geometries, called hyperbolic geometry, the sum is less than one hundred eighty degrees; in another, called double-elliptic non-Euclidean geometry, the sum is always larger than one hundred eighty degrees. Yet all of these geometries are equally accurate insofar as man can measure the sums of angles of triangles.

I actually disagree with Kline on that point! Draw a (virtual) triangle connecting the North Pole to two points on the Equator that are 90 degrees apart in longitude, and each angle of that triangle will be 90 degrees. 90 + 90 + 90 = 270: we can measure that.
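This is easy to verify numerically. A small sketch representing the three vertices as unit vectors and measuring, at each vertex, the angle between the two great-circle arcs in the tangent plane:

```python
import numpy as np

def angle_at(a, b, c):
    # Angle of the spherical triangle at vertex a, between the great-circle
    # arcs a-b and a-c: project b and c into the tangent plane at a.
    tb = b - np.dot(a, b) * a
    tc = c - np.dot(a, c) * a
    cos_ang = np.dot(tb, tc) / (np.linalg.norm(tb) * np.linalg.norm(tc))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

north = np.array([0.0, 0.0, 1.0])  # North Pole
p1 = np.array([1.0, 0.0, 0.0])     # on the Equator, longitude 0
p2 = np.array([0.0, 1.0, 0.0])     # on the Equator, longitude 90 E

total = (angle_at(north, p1, p2)
         + angle_at(p1, north, p2)
         + angle_at(p2, north, p1))
print(total)  # sums to 270 degrees, not 180
```

Each of the three angles comes out to 90 degrees, as the geometry of the octant triangle suggests.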

The interview continues:

What did the mathematicians do when the bottom dropped out of geometry, so to speak?

Many mathematicians tried to rescue and maintain as truths the portion of mathematics built on arithmetic, which by 1850 was far more extensive and vital for science than the several geometries. Unfortunately, other shattering events were to follow. Arithmetic and algebra were the next to go by the board.

The best example of this I could give in a semi-popular book was the creation of what are called quaternions, in 1843, by the great mathematical physicist William Rowan Hamilton. Now in the algebra of quaternions, a kind of number known as a hyper-number, multiplication is not commutative. In other words, if I were talking quaternions, I could not say that three times four is the same as four times three. Other strange algebras were created, and it made people start to worry about the laws of ordinary arithmetic. (The one I just stated is known as the commutative law of multiplication). And if we can have perfectly good algebras in which the old familiar laws don’t work, then how do we know they work in the case of the real numbers? That’s where a mathematician named Hermann von Helmholtz stepped in and told us we don’t know it at all. They work in some situations, but not in all.

Are there any elementary examples of these sorts of algebras, where 2 + 2 = 6, or where 5 x 7 = 35, but 7 x 5 is only 34?

I can think of several. Take a quart of water at 40 degrees and mix it with another quart of water at 50 degrees. Do you get two quarts at 90 degrees? You do not. It’s more like 45 degrees. So you can’t just say I’m going to add 40 and 50 and automatically get 90. It depends on the physical situation.

Consider music, a simple musical tone with a unique frequency and amplitude, say one hundred cycles per second. Now suppose on top of that you impose another note at two hundred cycles per second. Do you get a note at three hundred cycles? Again you do not. You still hear a note of one hundred cycles, with the two-hundred-cycle tone sounding as its first harmonic. It is the lowest tone that determines the pitch—one hundred cycles. This is an important factor in the design of musical instruments.

Those are excellent examples. And recall that the laws of probability do not apply in real life (that is, quantum mechanics; see section 2 of this article).
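Hamilton's example is easy to reproduce directly. Here's a minimal sketch of the Hamilton product on quaternions represented as (w, x, y, z) tuples, showing that i times j gives k while j times i gives minus k:

```python
def qmul(p, q):
    # Hamilton product of quaternions represented as (w, x, y, z) tuples.
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw * qw - px * qx - py * qy - pz * qz,
            pw * qx + px * qw + py * qz - pz * qy,
            pw * qy - px * qz + py * qw + pz * qx,
            pw * qz + px * qy - py * qx + pz * qw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1), i.e. k
print(qmul(j, i))  # (0, 0, 0, -1), i.e. -k
```

So the commutative law of multiplication fails, yet quaternions are a perfectly consistent algebra—which is precisely Kline's point.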

This is interesting:

If mathematics has no underlying truth–if it is filled with contradictions and uncertainties–why does it work?

There is no definitive answer to that. It just works. The only test we have that mathematics is reliable–not certain, but reliable–is that one can apply its laws to physical problems and make predictions. If the predictions come through, then we can say that mathematics has some substantial basis, but not certainty. I think people can’t help being impressed by what mathematics achieves. Consider the problem of sending a spaceship to the moon and bringing it back. It is entirely mathematical. Of course, there is a tremendous amount of engineering involved in the production of the ship, but the entire plan for it is mathematical. We have a theory about the sun, the planets, and more distant heavenly bodies. We say that what makes them behave as they do is the force of gravity. But nobody knows whether there is such a thing as gravity. We have no physical understanding of it. The theory is mathematical–gravity is a scientific fiction.

The same could be said about electricity and magnetism, couldn’t it?

That’s exactly right. Everybody today knows what a radio is, and what a TV is, but nobody knows what a radio wave or a TV wave is. You can’t smell one or hear one or taste one. But we do have a wonderful mathematical theory developed in the nineteenth century by the mathematical physicist James Clerk Maxwell. The evidence for this wonderful theory is the performance of our radio and TV sets. So we have to accept the fact that mathematics works, or else abandon our radios and our TV sets.

I pretty much agree but I’d put it slightly differently. There are various mathematical theories that don’t work, and because of that we don’t use them to design radios and TV sets. To put it another way, it’s not quite right to say that “mathematics” works; rather, some branches of mathematics work. For example, you could think of various goofy variants of logic and probability as mathematics, but nobody’s using them to build ships. Or, for another example, physicists don’t use classical (“Boltzmann”) probability in quantum problems, because . . . it doesn’t work. The mathematics that works, that’s the stuff that works. Any bit of mathematics works until it doesn’t, which is the point where people try to push it beyond its bounds of applicability.

I enjoyed this bit:

Are most mathematicians since the loss of certainty now working on these physical problems?

No, they aren’t. Most of the mathematics created today–maybe ninety percent of it–is a waste of time. That is an opinion, but one that authorities who are far more creative and far better known share with me.

Can you give us an example of mathematics you consider a waste of time?

Some problems now being considered in the theory of numbers, for example, are a waste of time. Take pairs of primes, called double primes. These are prime numbers in a sequence, eleven and thirteen, for example. No even numbers greater than two, of course, are primes. Are there an infinite number of these pairs? Are there triple primes? Endless papers are written about these subjects. Who cares?

That’s how I feel too! That said, I understand that the sorts of insights required to solve this sort of number theory problem are cognitively similar to the sorts of insights required to solve what I would consider to be interesting and important problems in mathematics and statistics. So, even though I agree on “Who cares?”, I don’t think that research in this area is “a waste of time,” any more than it’s a waste of time if you’re an athlete to do cross-training.

The interview continues:

It makes mathematics sound a lot like playing chess or bridge. Exciting, beautiful, challenging; the same words apply to all three kinds of activity.

That’s right. I’m glad you suggested it because it makes the point sharper. People enjoy playing chess. Some people even devote their lives to it. But no matter how ingenious a man is at playing chess or bridge, it isn’t going to change this world one iota. Now mathematicians may probe deeper problems, but it is the same thing.

Again, I kinda feel that Kline is missing the point. For one thing, the effort spent to build programs that can win at chess and go has led to general improvements in machine learning and AI. For better or worse, the study of chess has changed the world, and by more than one iota.

Overall, I’m a big fan of Kline and I like a lot of what’s in that interview, which is one reason it’s interesting to see where we disagree.


Sexual Selection Associated With an Aggressive Male Phenotype Reduces Population Size and Hinders Population Recovery After Heat Stress

Using experiments on soil mites, we show that sexual selection associated with an armed and aggressive male phenotype can reduce population size and stability, which lowers their resilience against acute heat stress. These effects linked with armed and aggressive phenotypes underline the importance of sexual selection in mediating population dynamics and resilience to environmental change.

ABSTRACT: Population recovery following environmental stress is known to depend on demographic structure, life-history and evolutionary dynamics. However, it is unclear how traits shaped by sexual selection affect population dynamics and recovery. We examined this by manipulating presence/absence of males expressing either a non-aggressive ‘scrambler’ phenotype or an aggressive and lethally armed ‘fighter’ phenotype in soil mite populations of different size. We experimentally altered the male phenotype in populations, subjected them to heat stress, and analysed their population dynamics and recovery. We show that populations with fighter males exhibited (i) reduced population size and stability, (ii) greater decline in response to heat stress in larger populations, (iii) higher rate of growth and (iv) incomplete population recovery. Such reduced population stability and recovery linked with armed and aggressive phenotypes underlines the importance of sexual selection in mediating population dynamics and resilience to environmental change, with implications for managing natural populations.
sarcozona: Mites and men aren’t so different

In ML, everyone’s Humpty Dumpty


This post is from Bob.

I used to work in natural language semantics, and the following dialogue from Lewis Carroll’s Through the Looking Glass, and What Alice Found There was the most common pull-quote to see at the beginning of a thesis.

“When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.”

“The question is,” said Alice, “whether you can make words mean so many different things.”

“The question is,” said Humpty Dumpty, “which is to be master — that’s all.”

Humpty Dumpty came to mind recently after a spate of discussions with ML folks about inference (i.e., what they call “learning”).

What in the world does “empirical Bayes” mean?

Empirical Bayes came up on Wednesday with some ML folks I was talking to, then I ran into Dave Blei this morning, who told me he’s giving a sequence of talks on empirical Bayes over the next few weeks (at Columbia and at the University of Chicago). I asked him what “empirical Bayes” meant to him, because it seems to be used very fluidly in ML. He gave me a new usage, saying he used it for any model that uses data to fit parameters of a prior, including just plain old hierarchical modeling fit with sampling. Dave gave the example of ARD in Gaussian processes (aka hierarchical models).

I would only use the term in the way described in the Wikipedia entry for “Empirical Bayes”, namely

… empirical Bayes may be viewed as an approximation to a fully Bayesian treatment of a hierarchical model wherein the parameters at the highest level of the hierarchy are set to their most likely values, instead of being integrated out.

I pinged Mark Goldstein, one of our top-notch ML postdocs, and he pretty much reeled off the Wikipedia definition. So there seems to be a lot of variation in how this is used.
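For concreteness, here's a minimal sketch of the Wikipedia-style usage in a toy normal-normal hierarchical model (all numbers invented): the hyperparameters of the prior are set to their marginal-likelihood maximizers rather than being integrated out, and those plug-in values drive the shrinkage of the group estimates.

```python
import numpy as np

# Toy normal-normal model: y_i ~ N(theta_i, sigma2), theta_i ~ N(mu, tau2).
# Empirical Bayes plugs in the marginal-likelihood maximizers for (mu, tau2)
# instead of integrating over them.
rng = np.random.default_rng(0)
sigma2 = 1.0
theta = rng.normal(5.0, 2.0, size=200)     # true group effects
y = rng.normal(theta, np.sqrt(sigma2))     # one noisy observation per group

# Marginally, y_i ~ N(mu, sigma2 + tau2), so the MLEs are closed-form.
mu_hat = y.mean()
tau2_hat = max(y.var() - sigma2, 0.0)

# Plug-in posterior mean: shrink each y_i toward mu_hat.
shrink = tau2_hat / (tau2_hat + sigma2)
theta_hat = mu_hat + shrink * (y - mu_hat)
```

A fully Bayesian treatment would put a prior on (mu, tau2) and average over the posterior; the empirical Bayes shortcut above ignores the uncertainty in those point estimates, which is exactly the gap the Wikipedia definition describes.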

Robbins on Empirical Bayes

Dave also pointed me to the following video by Herbert Robbins (yes, that Robbins, who was so far ahead of the computational statistics curve that he introduced stochastic gradient descent and multi-armed bandits in the early 1950s).

Terminology drift in ML

I’ve been a bit shocked at how many technical terms have drifted in meaning in ML. I’m not talking about people making honest mistakes or clueless mistakes, I’m talking about true drift in meaning where the ML folks will stand by their definitions.

Likelihood: I’ve seen “likelihood” used for what I’d call the data generating distribution and sometimes just as a synonym for density. Aki has this one covered in his recent post, A data model is not just a likelihood. In stats, the likelihood is defined as the function L(theta) = p(y_obs | theta)—that is, it’s a function of theta for some fixed observed data.
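That statistical definition is easy to make concrete: hold the data fixed and vary the parameter (toy data, normal model with known unit scale):

```python
import math

# The likelihood is a function of the parameter with the data held fixed:
# L(theta) = p(y_obs | theta). Toy data, normal model with sd = 1.
y_obs = [4.9, 5.2, 4.7]

def likelihood(theta):
    return math.prod(
        math.exp(-0.5 * (y - theta) ** 2) / math.sqrt(2 * math.pi)
        for y in y_obs
    )

# Same data, different parameter values: the likelihood ranks them.
print(likelihood(5.0) > likelihood(0.0))  # True
```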

Causal: With the advent of LLMs, anything with an autoregressive structure is now being described as “causal” in a “past causes the future” sense. This is even being extended to arbitrary directed graphical models, which are now being called “causal” even when there’s no explicit causality being modeled. That is, you can now describe a simple regression fit to observational data as “causal” with no extra work.

Estimation: Estimation is almost always called “learning” in ML.

Parameters: These are usually called “weights” for neural networks, which I think now make up 99.9% of all work in ML. But if you tell ML folks that neural networks are parametric models, they’ll most often deny it. A statistician would confusingly call a neural network a “non-parametric model” and then tell you that means it has a lot of parameters.

Inference: In our diffusion model reading group, the ML postdocs tell me that “inference” means what I would call “posterior predictive sampling”. For example, generating output to a query from an LLM would be called “inference.”

Bias: In statistics, this usually means the expected difference between an estimator and the quantity it estimates. In ML, it’s heavily overloaded. It can be used to name just about any kind of error measure (e.g., errors in Matt Hoffman’s sampling papers). ML folks also use the term “bias” to mean the intercept in a regression (no, I’m not kidding).

Prior: This is often called an “inductive bias” in ML circles, which can include aspects of data generating distributions as well as priors.

In Bayesian statistics, a prior is the marginal distribution over parameters. In ML and informal presentations of “Bayesian statistics,” it’s just any marginal that gets plugged into Bayes’s rule. For instance, the prevalence of a disease p(disease = +) is called the “prior” when I evaluate the positive predictive value p(disease = + | test = +) given the test’s sensitivity p(test = + | disease = +). In a k-way classification, “prior” means the marginal distribution over categories.
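The disease-testing calculation takes only a few lines of Bayes's rule (the rates below are invented for illustration):

```python
# Positive predictive value from Bayes's rule, using the disease
# prevalence as the "prior". All rates are made-up illustrative numbers.
prevalence = 0.01    # p(disease = +)
sensitivity = 0.95   # p(test = + | disease = +)
specificity = 0.90   # p(test = - | disease = -)

# Marginal probability of a positive test: true positives + false positives.
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes's rule: p(disease = + | test = +)
ppv = sensitivity * prevalence / p_pos
print(round(ppv, 3))  # 0.088
```

Note how the low "prior" (1% prevalence) dominates: even a fairly accurate test yields under a 9% chance of disease given a positive result.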

Bayesian: I think in ML this term is used very broadly for any situation in which there’s a prior not strictly stated as a penalty function for penalized learning (i.e., regression). I think Andrew’s down with this definition as he also uses Bayesian for anything that looks vaguely Bayesian no matter how inference is performed. For example, Empirical Bayes is just Bayes to Andrew, as is using a Laplace approximation or even a simple maximum likelihood estimate (just think of it as a one-point posterior summary!).

Regression: Not quite on topic, but I think of neural networks as just a GPU-friendly form of non-linear regression.

Uncertainty quantification: This is the primary subject of statistics, though I think the term “uncertainty quantification” is much more prevalent in engineering/signal processing than in ML. There are even journals of that title that look sort of like statistics journals.

I’d find reading ML papers easier if there were less meaning drift from well-established terminology. I’m not saying the ML folks should be up to date with Gelman’s idiomatic statistical lexicon (the concepts are fun, and it can be useful for talking to Andrew and people in his circle like the blog readers, but I wouldn’t recommend using these terms in papers without explanation).

I’m sure there are many more terms that have been coined or have drifted one way or another that I’m forgetting.


Canada's emissions reductions slowed in 2024, federal data shows

The latest annual account of greenhouse gas emissions shows Canada's emissions reductions slowed in 2024 to almost nothing.
sarcozona: Thanks carney

The Prescient Parable of ‘Radioactive Emergency’ | The Tyee


Amid all the post-apocalyptic dreck offered on Netflix, a new Brazilian miniseries stands out as a truly frightening horror story. It’s made more so because it’s based on a real incident that affected real people almost 40 years ago, in 1987. Far from being dated, Radioactive Emergency is a parable for our own times.

Watch the trailer for Radioactive Emergency, a 2026 Netflix miniseries that closely follows the worst radiation disaster not caused by a nuclear reactor.

Following historical events closely, the series begins in a working-class, mostly Black neighbourhood in the city of Goiânia, capital of the Brazilian state of Goiás.

Two young men dig a heavy metal tank out of the rubble of an abandoned radiotherapy clinic. They take it to a local junkyard, where the owner bargains good-naturedly with them (they need money for new soccer boots) and finally buys the tank.

That starts a sequence of disasters. From the abandoned radiotherapy clinic, a few grams of cesium-137, a radioactive isotope, move to the junkyard, then across the city — and then all the way to São Paulo, one of the biggest cities in South America.

By chance, a young physicist, Márcio (Johnny Massaro), and his wife are visiting his father in Goiânia. One morning, the next-door neighbour, a doctor, calls Márcio and invites him to look at some perplexing new cases in the local hospital. That leads in turn to a quietly frantic search across the city, tracking the path of the cesium-137 and the people, places and objects it’s contaminated.

Márcio and Dr. Orenstein (Paulo Gorgulho), the head of the national agency for nuclear radiation, push the search while trying to explain to the radiation victims, health-care workers, and politicians that they’re dealing with a threat never seen before.

Nothing like Chernobyl

The Chernobyl Power Plant in the Soviet Union, later Ukraine, disastrously exploded just the year before and everyone’s heard something about it. But that was an enormous nuclear reactor complex, and this radioactive isotope in Brazil is just a handful of powder that glows blue in the dark.

The longer the search goes on, the more contamination the searchers find. The canister containing the cesium-137 sits in the living room of the junkyard owner for days before his wife takes it to a hospital — where it ends up sitting on a chair in a courtyard.

Márcio, Orenstein and their technical helpers understand how serious the problem is, but no one wants to be told they have to leave their homes and live in tents in the municipal stadium until they can be tested for radioactive contamination.

The state governor is alarmed at what the contamination will do to the local economy and his own reputation. It’s not explicit, but Brazilian viewers would know very well that the 21-year rule of a military junta had ended just two years before. The contamination makes civilian government look incompetent.

The contamination also behaves very much like an infectious disease — COVID-19, for example. As Márcio and his colleagues trace radiation across the city, they also frighten everyone they meet.

We, the viewers, are already frightened to the edge of horror. We watch people laughing around a kitchen table as they play with the powder. They carry the canister onto a crowded bus. We watch a little girl eat a snack with powder on her hands.

As in a nightmare, we see awful things but we can do nothing.

Brazilian actor Johnny Massaro stars in Radioactive Emergency. He plays Márcio, a young physicist who teams up with a local doctor to track down the mysterious source of radiation poisoning in the city of Goiânia, Brazil. The show is based on real events that occurred in 1987. Still from Radioactive Emergency trailer.

‘Calma! Calma!’

In a feat of detective work, Márcio finds the bus that carried the canister to the hospital, climbs aboard with his Geiger counter, and learns the bus is very much contaminated, and so are all the passengers.

They panic when he tells them they must be screened at the stadium. All Márcio can do is shout “Calma! Calma!” In other words, take it easy. He is talking to himself as much as to the passengers.

The series pivots around a key issue: the experts don’t know how to talk to the people, and can’t even agree among themselves. A Soviet radiation expert, brought in to help, only quarrels with the Brazilians.

Meanwhile, healthy relatives of those exposed to radiation find themselves forbidden to go home or to work. They are stigmatized and shunned by friends and neighbours.

Problems like this have occurred in every recent serious outbreak of an emerging disease.

In the West Africa Ebola outbreak, young men in one town threw rocks at health-care workers, thinking they were bringing the disease.

In another case, villagers murdered eight health-care workers, journalists and government officials who had come to warn them about Ebola.

Ordinary people in Liberia, Guinea and Sierra Leone had no particular reason to trust their governments or foreign experts; they associated hospitals with death, and preferred seeking help from local healers (who died treating Ebola cases, and whose elaborate, crowded funerals spread the disease even further).

One of the radiation cases in Goiânia is hospitalized but escapes — again reminding us of more recent Ebola outbreaks — and is nearly shot before being returned. When the local hospital can’t help the most serious cases, Dr. Orenstein pulls strings to get them transferred to a naval hospital in Rio de Janeiro — but the admiral in charge won’t provide enough beds for all, so the local doctors face an ethical dilemma in choosing who can go.

The disaster also exposes sloppy governance and lack of regulation. Embarrassing facts emerge from a hearing. The radiotherapy clinic was treating cancer patients for free, and had run out of money.

Quarrels arose about the clinic’s lease and money owing to local businesses, and somehow the cesium-137 canister was never recorded as being part of the clinic’s equipment.

And no one had established any guidelines for dealing with radioactive contamination on such a scale.

In a nightmarish scene from Radioactive Emergency, people are taken to the local stadium for radiation testing. Still from Radioactive Emergency trailer.

A critique of Bolsonaro

It’s likely that Radioactive Emergency is a critique of the Brazilian response to COVID-19 under the reactionary rule of then-president Jair Bolsonaro. He consistently minimized the pandemic, which caused an estimated 38 million COVID-19 cases and over 700,000 deaths.


The series succeeds because it mostly avoids the easy thrills of melodrama. The working-class people who are exposed to cesium-137 are simply trying to make a living in a rundown community like those I saw in 1950s Mexico when I was a boy.

The doctors and experts feel trapped between their knowledge and their ignorance. The politicians are appalled at what they must do: isolate thousands of people, hospitalize hundreds, demolish homes — and help to create policies to ensure nothing like this ever happens again.

By understating the disaster, the series lets us draw our own conclusions and compare our own communities with Goiânia.

One of its most powerful moments, near the end, shows a woman who has lost her child to radiation poisoning. Returning to her old house to watch it being demolished, she stands very still, her face a blank. We know what she is feeling, because we feel it ourselves.

‘Radioactive Emergency’ is now streaming on Netflix.  [Tyee]
