this post was submitted on 02 May 2025
114 points (96.7% liked)
Funny
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
It's not doable because it would eat away at the profit margins. /s
No /s.
Could you imagine an artificial mind actually trying to obey these? You can't even get past #1 without being aware of the infinite number of things you could do, Cartesian-producted with all the consequential downstream effects of those actions until the end of time.
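For anyone who wants to see the blow-up in numbers: here's a toy sketch of that Cartesian product problem. The action list, the outcomes-per-action count, and the `plans_to_evaluate` helper are all invented for illustration; the point is just that even a tiny made-up action set with a few steps of lookahead explodes exponentially.

```python
from itertools import product

# A hypothetical, laughably small action set for our law-abiding robot.
ACTIONS = ["move arm", "stay still", "speak", "open door", "pour coffee"]

# Assume each action branches into this many possible downstream outcomes.
OUTCOMES_PER_ACTION = 10

def plans_to_evaluate(lookahead_steps: int) -> int:
    """Count the action/consequence combinations a naive First Law checker
    would have to screen for 'could this harm a human?'."""
    branching = len(ACTIONS) * OUTCOMES_PER_ACTION
    # Every extra step of lookahead multiplies the space again:
    # one Cartesian product per step.
    return branching ** lookahead_steps

if __name__ == "__main__":
    for steps in range(1, 6):
        print(f"{steps} step(s) ahead: {plans_to_evaluate(steps):,} combinations")

    # Enumerating even the two-step case explicitly is already unwieldy:
    two_step = product(ACTIONS, range(OUTCOMES_PER_ACTION),
                       ACTIONS, range(OUTCOMES_PER_ACTION))
    print("explicit 2-step plans:", sum(1 for _ in two_step))
```

With these made-up numbers, five steps of lookahead is already over 300 million combinations, and a real robot's action space isn't five items long.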
No one ever explained why they had to obey those laws in the first place... only that they had to.