It's interesting how we can frame "potentially automating tasks" in the most sinister conceivable way. The same argument applies to essentially all technology, like a computer.
> The same argument applies to essentially all technology, like a computer.
Why yes, it does.
Even setting aside most of the AI hype: yes, automation is in fact quite sinister if you do not go out of your way to deal with the downsides. Putting people out of a job is bad, actually.
Yes. The Industrial Revolution was a great boon to humanity that drastically improved quality of living and wealth. It also created horrific torment nexuses like mechanical looms into which we sent small children to get maimed.
And we absolutely could've had the former without the latter; child labour laws handily proved it was possible, and they should have been implemented far sooner.
In addition, the Industrial Revolution led to societal upheaval that took more than a century to sort out, if you agree it's ever been sorted out at all.
So, if it is true we're on the cusp of an AI Revolution, AGI, the Singularity, or anything like that, then there's precedent to worry. It could destroy our lives and livelihoods on a timescale of decades, even if the whole world really would be better off overall in a century or two.
I'm not suggesting child labor laws are bad; I'm saying automation is good and not sinister. Automation inherently reduces labor, which can mean someone no longer needs to work a job that is now automated. That we want to protect people from suffering doesn't mean we should be suspicious of all new technology just because we can imagine a way someone might lose a job.
It's not really interesting; it's exactly what should be expected. We've seen how corporations act, and that history and our prior experiences shape our perceptions and expectations accordingly.