
I am pretty sure that the development of a consciousness in coming LLMs is unavoidable. A consciousness is just another useful abstraction for making precise predictions about the world.

The big question will be: How do we treat machines that have a consciousness? Is consciousness in itself worth protecting, or is it the human attributes (the ability to feel pain and an evolutionary priming towards survival) that should be granted this special status? This is going to be a fun discussion.



> A consciousness is just another useful abstraction for making precise predictions about the world

That's an extremely bold claim. I think consciousness is greater than the sum of its parts, not just a "useful abstraction".


I'm sure this will become a bigger debate in the future, but I absolutely believe consciousness itself is worth protecting.

We ourselves have just sprung into existence. I should hope that as consciousness of any kind comes online, whichever world it may find itself in (earthly or otherwise), there is compassion for the existence of the other.

I often feel fortunate that the extent of human suffering is bounded by a relatively short lifespan. If I were to exist in a state of suffering that didn't have such a fixed expiration, well, that would be hell.



