
Of course it matters. What does it mean to punish something that isn't sentient? It's like bemoaning the fact that you can't punish the car that was driven into someone.


Depends on whether you care about outcome or morality. We already use terms like punishment, reward, and adversarial competition for neural networks.


We do have those terms. The way they are used is not relevant here.


If the entity (be it a corporation or an AGI-driven corporation) learns from the punishment, or its peers learn by observation, it doesn't matter whether the entity is sentient: the outcome is less of the unwanted behaviour. I didn't mean anything more complicated than that.
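The claim above can be sketched in reinforcement-learning terms: "punishment" is just a negative reward, and a non-sentient learner still produces less of the punished behaviour. A minimal tabular sketch (the action names and numbers here are illustrative, not from the thread):

```python
# Hypothetical sketch: a negative reward ("punishment") drives down the
# value of the unwanted action, so a greedy policy stops choosing it.
import random

actions = ["wanted", "unwanted"]
q = {a: 0.0 for a in actions}  # action-value estimates
alpha = 0.5                    # learning rate

def reward(action):
    # "Punish" the unwanted behaviour with a negative reward.
    return 1.0 if action == "wanted" else -1.0

random.seed(0)
for _ in range(100):
    action = random.choice(actions)                     # explore both behaviours
    q[action] += alpha * (reward(action) - q[action])   # incremental value update

# The punished action's estimated value ends up below the rewarded one's,
# so acting greedily means less of the unwanted behaviour -- no sentience required.
best = max(q, key=q.get)
```

Whether that mechanism transfers to corporations is exactly the point under dispute, but the learning dynamic itself needs nothing more than feedback and an update rule.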


The corporation cannot learn. Why anthropomorphise when you can think about the real problem: how to incentivise the people who own and direct companies to behave well?



