What is the cost of lies?

Quotes from the character Valery Legasov in HBO’s miniseries Chernobyl (2019). An apt reminder for our current times: lies have a price.

Episode 1

What is the cost of lies? It’s not that we’ll mistake them for the truth. The real danger is that if we hear enough lies, then we no longer recognize the truth at all. What can we do then? What else is left but to abandon even the hope of truth and content ourselves instead with stories? In these stories, it doesn’t matter who the heroes are. All we want to know is: “Who is to blame?” In this story, it was Anatoly Dyatlov. He was the best choice. An arrogant, unpleasant man, he ran the room that night, he gave the orders… and no friends. Or at least, not important ones. And now Dyatlov will spend the next ten years in a prison labor camp. Of course, that sentence is doubly unfair. There were far greater criminals than him at work. And as for what Dyatlov did do, the man doesn’t deserve prison. He deserves death.

Episode 5

I’ve already trod on dangerous ground. We’re on dangerous ground right now, because of our secrets and our lies. They’re practically what define us. When the truth offends, we—we lie and lie until we can no longer remember it is even there. But it is…still there. Every lie we tell incurs a debt to the truth. Sooner or later, that debt is paid. That is how…an RBMK reactor core explodes: Lies.


Swahili on the Road

I enjoyed reading about how Swahili has spread across Africa — especially learning about the efforts of Mwalimu Julius Nyerere. I hadn’t realized how Swahili ended up in South Africa. Nyerere supported the liberation of other African countries both in words and in deeds.

Tanzania became a staunch supporter of the liberation movements developing in southern Africa, donating land in its central Dodoma region as a training camp for the armed wing of South Africa’s African National Congress. Nyerere broke diplomatic relations with Britain in 1965 over the latter’s handling of Rhodesia’s Unilateral Declaration of Independence, and offered Dar es Salaam as headquarters for the African Liberation Committee, an organ of the OAU that channelled support for independence movements around the continent. Because of this, thousands of people from southern Africa learned Swahili during stays in Tanzania, and many more looked upon the language as that of a steady ally.

I also loved his response when challenged about whether African countries were ready for independence:

“If you come into my house and steal my jacket, don’t then ask me whether I am ready for my jacket. The jacket was mine, you had no right at all to take it from me … I may not look as smart in it as you look in it, but it’s mine.”

via HN


Every augmentation is also an amputation

In Remarks on AI from NZ, Neal Stephenson captures something I’ve been grappling with as I learn to program: a sense that leveraging AI for certain tasks can rob us of the opportunity to learn.

Speaking of the effects of technology on individuals and society as a whole, Marshall McLuhan wrote that every augmentation is also an amputation. […] Today, quite suddenly, billions of people have access to AI systems that provide augmentations, and inflict amputations, far more substantial than anything McLuhan could have imagined. This is the main thing I worry about currently as far as AI is concerned. I follow conversations among professional educators who all report the same phenomenon, which is that their students use ChatGPT for everything, and in consequence learn nothing.

The post also reflects on other forms of intelligence from a perspective that was new to me.

via Simon Willison


Calculators & Writing

In Calculators & Writing, Chris Coyier points to something that’s been lingering in the back of my mind when using AI for certain tasks. My partner actually brought this up to me a while ago, and at the time I thought they were wrong – but I’ve come around. The quote below gets to the heart of it:

… with writing, humans are thinking, feeling, and communicating. Computers do not do these things.

Using AI to write means you’re robbed of the thinking and feeling it takes to write, which is (most?) of the value. Communicating, I suppose, you’re doing either way, you’re just communicating something you didn’t think about or feel if your writing is generated, which is fucked.

Despite remarkable progress in AI, some tasks still demand our effortful engagement. Learning, for example, requires one “to make eye contact with the idea” because the value lies in the effort, not just the outcome.



ASI existential risk

ASI existential risk: reconsidering alignment as a goal.

A lot of smart people are worried that future AI systems could become dangerously misaligned with human values. The concern is that as these systems gain power, their goals might drift from ours, leaving humans vulnerable. Researchers in the AI safety world are working to prevent this by aligning systems with our values. At the same time, there’s another group of researchers, equally brilliant, who find the idea of a rogue AI implausible and mostly dismiss the risks.

Michael Nielsen’s essay presents a nuanced thesis: artificial superintelligence (ASI) may pose an existential threat, but not in the way current AI safety research tends to frame it. He argues that alignment research may inadvertently accelerate the path to such powerful systems; and that while those efforts might help mitigate certain dangers, they also obscure or neglect other existential risks.

At the heart of his argument is this striking line: “Deep understanding of reality is intrinsically dual use.” Nielsen explores how this dynamic plays out as science and technology advance over time. The essay is accessible, thought-provoking, and by far the most insightful piece I’ve read on the potential dangers of frontier AI systems. Highly recommended.