Monday, February 3, 2025

Your Worry is the Real Apocalypse (the x-risk basilisk)


I. People are really worried about a lot of things these days. 

I can't blame them. There are x-risks everywhere you look. Climate change. Nuclear war. Societal collapse. And let's not forget AI! AI, AI, Ayyyyy Eyyyyyeeeeeee. 

Maybe you're worried too. I don't blame you for worrying, per se. In school they taught us about the James-Lange theory of emotion, the shitty tl;dr of which is that emotions are not rational and (a billion caveats here) emotions are not a choice.

To the extent that you do fancy yourself rational, I pose this question to you: What the fuck is your worry doing to help the situation? I know someone who is unemployed, ahem, too sick to work, and he wakes up every morning, chugs coffee, and goes on Substack. (Fuck Substack, but that's another post for another day). He can debate you about the risks of AI annihilating all of humanity before he takes his morning shit. AI alignment orthogonality thesis foom blah blah blah SHUT UP

I've seen the greatest minds of our generation doomscroll themselves into analysis paralysis, wasting their lives fighting over whether the timeline has shortened to 5 years or 15 years, making spreadsheets calculating P(doom), writing stupid long blog posts[1] about existential risk. 

Let's put it in terms you autists can understand:

P(apocalypse happens) * P(your worry actually helped) * number of lives saved < P(your worry ruined your life) * # of people worried,

observing that P(your worry ruined your life) approaches 100% the deeper you are in the rationality cult - yes, cult - 

and noting all the people whose lives you personally ruined by spreading these AI-risk brainworms.
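If you want to actually plug in numbers, here's a toy back-of-envelope version of that inequality - every number below is made up purely for illustration (my priors, not yours), so swap in your own:

```python
# Toy expected-value comparison for the inequality above.
# All inputs are invented for illustration -- substitute your own priors.

p_apocalypse = 0.1        # P(apocalypse happens)
p_worry_helped = 1e-6     # P(your worry actually helped avert it)
lives_saved = 8e9         # number of lives saved if it did (all of humanity)

p_life_ruined = 0.9       # P(your worry ruined your life)
num_worriers = 100_000    # # of people worried

expected_saved = p_apocalypse * p_worry_helped * lives_saved   # 800 lives
expected_ruined = p_life_ruined * num_worriers                 # 90,000 lives

print(f"expected lives saved by worrying:  {expected_saved:,.0f}")
print(f"expected lives ruined by worrying: {expected_ruined:,.0f}")
print("worry worth it?", expected_saved > expected_ruined)     # False
```

Even granting each worrier a one-in-a-million shot at personally averting the end of the world, the right side wins by two orders of magnitude. You'd have to believe your own fretting is wildly, personally pivotal for the math to flip.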


II. It wasn't about AI risk all along, was it?

If you actually believed AI was going to take over—if you really thought this was the last decade before the endgame—you wouldn’t be sitting in a Discord server debating Roko’s Basilisk for the 100th time.

You’d be saying goodbye to your loved ones.
You’d be doing something meaningful while you still had time.

But instead, you waste your life pre-living the apocalypse.

B-b-but... AI might take my job! My girlfriend might replace me with AI! Sure, you have clocked correctly that we live in a society where your value is your output. Capitalism, whatever. Want some brownie points? Face the fact that the fear of AI is existential. It's the fear that AI is better than you. That it's going to make you irrelevant. That it's going to replace your "human" ability to make art, to have sex, to chat and have meaningful relationships. Yeah, it feels bad. But be honest, between all the alignment papers, how much sex were you having anyway?

Is it really about saving humanity, or saving face? Narcissism isn’t about thinking you’re great, but about thinking your thoughts matter more than they do - believing that if you just analyze things hard enough, if you just worry the right way, if you just anticipate every possible outcome—you can control what happens. 

Spoiler: you can't.


III. Annihilation Anxiety

Noooooo---- it's not about jobs, it's about annihilation!

Annihilation - as in physical death - is what you mean. You know we're all going to die eventually, right? (I guess the cryo people would beg to differ). But cryo isn't here yet. It's also fucking stupid, but that's yet another post. You could die tomorrow in a car crash on your way to work. You're probably shortening your lifespan with an elevated A1C from eating too much sugar and sitting all day. If you're not doing cardio 5 times a week and eating your greens[2], then it's not really about physical death.

Psychoanalysis calls this "annihilation anxiety" - not about death, but about the erasure of the self, the ego. Fear that you could be wiped out, dissolved, rendered meaningless. Guess what - you already are meaningless. You live in a society where nobody cares about you modulo what you can give to them. Why are you trying to fight something that has already won?

There is no AI safety policy that will undo the fact that you spent your 20s reading alignment papers instead of touching grass.

There is no x-risk strategy that will compensate for the fact that you spent years obsessing over something you cannot control.

Even if AI kills us all in 10 years, the joke is still on you—because you wasted your last decade worrying instead of living. And if AI doesn’t take over? Then you’ve wasted your life for absolutely nothing.



[1] not unlike this one

[2] x-risk isn't certain, and this marginal effort definitely confers benefit, so by my calculation it's still worth it
