Okay, I'll bite. I'm not particularly well-versed in AI issues, but this article is of the end-is-nigh variety and HN tends to be a technologically optimistic community, so I'm hoping someone can debunk this and give us reason to be optimistic rather than terrified of our future as human batteries in The Matrix.
Especially given how we reproduce without bounds, consume all the resources available, and rather than dying we simply find other resources to consume to maintain our steadily growing population.
For the hundred thousandth time, a generation of humans will be confronted with the necessity of incorporating into their society a group of beings which they created and which they love, hate, fear, trust, and, most of all, barely understand. For a million years, that group was "their own biological children," but over the next few decades, that group might also come to include "their mental children: AIs."
In other words, we're damn good at this. We'll make it. :-)
not really, because AI isn't really our species. and look at our track record with the other beings we've created or raised: we take care of animals, nurture them, and still end up supporting the deaths of 150 billion of them each year
plenty of times we had the chance to welcome other members of what science calls "homo something" (neanderthals and the like) into our family, and instead we exterminated them for looking weird and behaving differently from us - just because we had a few more superior characteristics
if we couldn't respect our fellow evolutionary brothers, how can we respect something that is a purely material creation?
it took centuries for slavery to be abolished, and there are still parts of the world where it's accepted
it'll take a lot for us to shift from chatting with that cleaning-lady robot every single day to seeing it as our equal in spirit, one that deserves freedom of movement and the right to avoid pain (physical or psychological)
people will consider AIs to be pure machines, while accepting their own biological machine as a miracle. both machines give spark to intelligence, but one will be worth more than the other
damn, we need water, food, shelter, companionship, and procreation, and we understand the world around us, otherwise we wouldn't survive. the same is true for almost every other non-human animal, and we have no problem with their massive deaths
i really hope our morals and mentality get the same exponential shift that technology is getting. past evidence seems to point to bad outcomes, but I guess we're just on the knee of the curve :D
I do not think we will get AI in the near future. What we have at the moment are more like 'Intelligence Amplification' tools for humans to use and direct.
But there are 3 concerns:
(1) Maybe, although the chance is small, the outcome is so bad that probability x payoff is large enough that we ought to worry and think about it?
(2) If jobs lost to AI are not replaced by other work (hard to say), then we get unemployment and social problems.
(3) The current deep learning breakthroughs in image recognition, speech recognition, etc., make it much easier to process all the surveillance data that is being gathered. When surveillance tools, and drones as well, can be controlled by small numbers of humans, you should be worried.
Historically, governments have usually required the support of a fair fraction of their populace in order to stay in power. Ordering soldiers to shoot their fellow citizens has always been risky for governments. Soon that might not be the case.
In the past, a nation's power depended on its level of technology, its capital equipment, and the number and skills of its population. There was an incentive to have a skilled, well fed, and content populace.
Maybe a large part of the populace will no longer be 'needed'?
I guess you could counter points 2 & 3 by saying "Yes, but our democratic institutions are strong and our politicians are caring and intelligent - our societies will deal with these changes."
For myself, (3) scares me. You should be afraid of ending up like the Scottish Highlanders turfed out of their homes by Chiefs who replaced them with sheep, or like the cart-horses that were replaced by the internal combustion engine (and shot). There is no need to fear an AI taking over; it is humans you need to be afraid of.
Anyone?