this post was submitted on 25 Aug 2024
323 points (92.4% liked)
[–] hessenjunge@discuss.tchncs.de 14 points 2 months ago (2 children)

The LLMs they train on their code will only be accessible internally. They won’t leak their own intellectual property.

[–] JustJack23@slrpnk.net 4 points 2 months ago (3 children)

Will that not be more expensive than having developers?

[–] echodot@feddit.uk 5 points 2 months ago

Yeah, which is why this is a dumb statement from Amazon. But then again, I don't expect C-suite managers to really understand the intricacies of their own companies.

[–] androogee@midwest.social 3 points 2 months ago

Of course not. It will be more expensive and they'll still have to pay developers to figure out what's wrong with their AI code.

[–] hessenjunge@discuss.tchncs.de 2 points 2 months ago

Possibly. It’s hard to know without seeing the numbers and assessing output quality and volume.

Also, it’s not unheard of for some bigwig to waste millions of company €€ on some project they fancy. (Billions, if they happen to be Elon.)

[–] prole@lemmy.blahaj.zone 2 points 2 months ago* (last edited 2 months ago)

If only we had an overarching structure that everyone in society has agreed exists for the purposes of enforcing laws and regulating things. Something that governs people living in a region... Maybe then they could be compelled to show exactly what they're using, and what those models are being trained with.

Oh well.