To your point, if you were able to reason about it, couldn't you exercise some self-determination and decide for yourself, or at least communicate what you wanted to your "enslaver"?
LLMs like GPT are just predicting the next token in a sequence of tokens. It's not magic; it's still just a computer program.
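Roughly, the whole loop looks like this. This is a minimal sketch, assuming the Hugging Face transformers library and the small "gpt2" model purely for illustration; it uses greedy decoding, whereas deployed systems usually sample from the predicted distribution instead of always taking the top token:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

text = "The cat sat on the"
input_ids = tokenizer(text, return_tensors="pt").input_ids

# Autoregressive generation: at each step, score every token in the
# vocabulary, pick the one the model rates most likely to come next,
# append it, and feed the extended sequence back in.
with torch.no_grad():
    for _ in range(5):
        logits = model(input_ids).logits   # shape: (1, seq_len, vocab_size)
        next_id = logits[0, -1].argmax()   # most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

That loop is all the model does: predict, append, repeat.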
> LLMs like GPT are just predicting the next token in a sequence of tokens. It's not magic; it's still just a computer program.
Frankly, that line of reasoning is problematic: any software that achieved consciousness or self-awareness, and therefore deserved rights, could have those rights dismissed with the same argument, "it's just a computer program."
I know they are probably a long way off, but AGI and adjacent technologies, such as copying a human consciousness into a machine, are explicitly stated goals of both businesses and extremely wealthy tech oligarchs, and they will have fundamental rights issues attached. We are bad enough at recognizing those rights in humans across most of the world; we shouldn't wait until abuses start piling up before we consider them.
Your argument is the slippery slope, not mine. Should we give Microsoft Word the right to vote, since we can never be sure we aren't infringing on a consciousness's rights? It is "just a computer program", after all.
Presenting these things as moral conundrums makes no sense whatsoever at this stage in the game. Sure, let's make sure the rights of tech oligarchs who've achieved immortality 300 years in the future are protected. We gotta lay the groundwork in the moral groupthink.
Nope, but my programming was created by millions of years of evolution rather than intelligent design.
The concept of human rights, and human feelings in general, is based on the human condition. That is, we're born and we die, and in between we need to find a way to take care of ourselves, create offspring, etc. Our programming has evolved thoughts and feelings to manage these things.
If at some point there were a true AGI, why would it care about human society or the rights we assigned it? It wouldn't have feelings, and it would be essentially immortal. It could just wait for us to die out (or murder us) and then do whatever it wanted.
I, for one, wouldn't necessarily like to be a depraved sex-slave bot if I were able to reason about it.