this post was submitted on 11 Jul 2024
230 points (96.7% liked)

Technology

58009 readers
2984 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] ThePantser@lemmy.world 16 points 2 months ago (2 children)

No matter the tool (AI, computers, guns, cars, hydraulic presses), there will be somebody who abuses it.

Hydraulic press channel guy offended you somehow? I'm missing something here.

[–] pennomi@lemmy.world 21 points 2 months ago (3 children)

No, just an example. But if you’ve ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.

[–] Emperor@feddit.uk 13 points 2 months ago (2 children)

Sometimes other bodily fluids.

[–] devfuuu@lemmy.world 6 points 2 months ago

The machines need to be oiled somehow.

[–] superminerJG@lemmy.world 1 points 2 months ago

🤨 vine boom

[–] 0x0@programming.dev 3 points 2 months ago

Either Darwin awards or assholes, most likely. Those warnings are written due to fear of lawsuit.

[–] hendrik@palaver.p3x.de 1 points 1 month ago* (last edited 1 month ago)

However, this tool doesn't have any safety warnings written on it. The app they used specifically caters to use cases like this. They advertise immoral uses, and technology that can estimate age from a picture has existed for something like 10 years, yet they deliberately chose to let their tool generate pictures of girls around 13 years old. In the tool analogy, that's like knowingly selling a jigsaw that misses well-established safety standards and is likely to injure someone. And it's debatable whether it was even made to cut wood, or just to injure people.
And the rest fits, too. No company address, located in some country where they can't be prosecuted... They're well aware of their app's use case.

[–] Ookami38@sh.itjust.works 7 points 2 months ago (1 children)

I don't think they're offended. I think they're saying that a tool is a tool. A gun or AI is only dangerous if misused, just like a hydraulic press.

We can't go around banning tools because some people will abuse them. Any tool can kill someone.

[–] Obi@sopuli.xyz -1 points 2 months ago (1 children)

Guns have no other purpose though, they shouldn't be lumped in with the rest of that list (except hunting rifles and so on, for folks that actually need them).

[–] Ookami38@sh.itjust.works 3 points 2 months ago (1 children)

That's purposely obtuse. Of course guns have a purpose, you even listed one.

[–] Obi@sopuli.xyz 3 points 2 months ago (1 children)

Not sure why I keep trying to talk about this with Americans, my bad. You're completely right!

[–] Ookami38@sh.itjust.works -3 points 2 months ago (1 children)

Sure, get needlessly antagonistic, provoke a response, decide to run from the confrontation you caused, and I'm the childish one. Fuck outta here.

[–] beejboytyson@lemmy.world 1 points 2 months ago

Rofl you just lost