
Meh. Add it to the pile. The world-ending risks we could be worried about at this point are piling up, and AI exterminating us is far from the top of the list, especially when AI may be critical to solving many of the problems that are.

Wrong about nuclear proliferation and MAD game theory? Human extinction. Wrong about plasticizers and other endocrine disruptors, leading to a Children of Men scenario? Human extinction. Wrong about the risk of asteroid impact? Human extinction. Climate change? Human extinction. Gain of function zombie virus? Human extinction. Malignant AGI? ehh... whatever, we get it.

It's like the risk of driving: yeah, it's one of the leading causes of death, but what are we going to do, stay inside our suburban bubbles all our lives, too afraid to cross a stroad? Except with AI, this is all still completely theoretical.



I think almost none of the scenarios you've named, outside of the asteroid and the AGI, would result in complete human extinction. Potentially a very bad MAD breakdown could also lead to that, but the research here is legitimately mixed.


You disagreed with me, but at least you acknowledged there was risk, even though we could disagree about the odds or the potential impact. Yet folks like Yann LeCun ridiculed anyone who thought there was a risk AI could endanger us or harm our way of life. What do we know about experts who are always confident (usually on TV) about things that haven't happened yet?


Yes, and none of those (including AI) are even human extinction events.

- Nuclear war: Northern Hemisphere is pretty fucked. But life goes on elsewhere.

- Plasticisers: We have enough science to pretty much do what we like with fertility these days. So it's catastrophic but not extinction.

- Climate Change: Life gets hard, but we can build livable habitats in space... pretty sure we can manage a harsh earth climate. Not extinction.

- Deadly virus: Wouldn't be the first time, and we're still here.

- Asteroid impact: Again, ALL human life globally? Somehow birds survived the meteor that killed the dinosaurs; I'm sure we'd find a way.

- Completely made-up evil AI: Well, we'd torch the sky, be turned into batteries, but then be freed by Keanu Reeves... or a time-traveling John Connor. (Sounds like I'm being ridiculous, but ask a stupid question...)


You're taking these things too lightly. It's true that most of these are unlikely to kill all humans directly, but with most of them, civilizational collapse is definitely on the table, and that can ultimately lead to human extinction.

For example: yes, we could probably build livable habitats in space (though we don't really have proof of that). But how many, for how many people, and what kind of external support systems would they require? These questions put stresses on society that could prevent space habitats from working out in the long term.


No, all of those things are bad. But they aren't end-of-human-species events. And GPT-4 isn't even close to being on the list.


> - Asteroid impact: Again, ALL human life globally? Somehow birds survived the meteor that killed the dinosaurs; I'm sure we'd find a way.

I agree with many of these but we'd plausibly be toast in this scenario.


Yeah, I agree that one is the most devastating. Plausibly toast.



