[–] Maiq@lemy.lol 17 points 4 days ago (3 children)
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

[–] gressen@lemm.ee 15 points 4 days ago (1 children)

It's not doable because it would eat away at the profit margins. /s

[–] 30p87 2 points 3 days ago
[–] xia@lemmy.sdf.org 8 points 4 days ago

Could you imagine an artificial mind actually trying to obey these? You can't even get past #1 without being aware of the infinite number of things you could do, Cartesian-producted with all the consequential downstream effects of those actions, until the end of time.
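
To put a rough number on that blow-up, here is a minimal sketch. The action set and the number of downstream effects per action are made-up assumptions purely for illustration; the point is only how fast the space of action-consequence chains grows when each step is the Cartesian product of actions and their effects.

```python
# Toy illustration of the combinatorial blow-up described above.
# The action list and effects_per_action are arbitrary assumptions for this sketch.

actions = ["move", "speak", "wait", "lift", "signal"]  # hypothetical action set
effects_per_action = 4  # assumed number of downstream effects each action can spawn

def chains_to_evaluate(depth: int) -> int:
    """Count action-consequence chains of a given depth:
    at every step, each action is paired (Cartesian product) with each of its effects."""
    branching = len(actions) * effects_per_action
    return branching ** depth

for depth in (1, 2, 5, 10):
    print(f"depth {depth:>2}: {chains_to_evaluate(depth):,} chains to check")
```

Even with only 5 actions and 4 effects each, checking 10 steps ahead already means on the order of 10^13 chains, which is the "until the end of time" problem in miniature.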

[–] Archangel1313@lemm.ee 5 points 4 days ago

No one ever explained why they had to obey those laws in the first place... only that they had to.