
> But only metaphorically the equivalent, as the maximum downside is much worse than that.

Maybe I'm a glass-half-full sort of guy, but everyone dying because we failed to reverse man-made climate change doesn't seem strictly better than everyone dying due to a rogue AI.



Everyone dying from a rogue AI would be stupid and embarrassing: we used resources that would've been better spent fighting climate change, only to be killed by a hallucinating paperclip maximizer built from those very resources.

Stupid squared: we die because we gave the AI the order to reverse climate change xD.


Given the assumption that climate change would kill literally everyone, I would agree.

But also: I think it's extremely unlikely for climate change to do that, even if it's extreme enough to lead to socioeconomic collapse and an all-out nuclear war.

Also also, I think there are plenty of "not literally everyone" risks from AI that would prevent us from ever reaching the "really literally everyone" scenarios.

So I kinda agree with you anyway — the doomers think I'm unreasonably optimistic, the e/acc types think I'm unreasonably pessimistic.




