this post was submitted on 15 Jul 2024
62 points (97.0% liked)

Technology
[–] conciselyverbose@sh.itjust.works 10 points 1 month ago (1 child)

TLDR: he thinks the techniques are fine and you can just brute force them for the foreseeable future.

[–] Zrybew@lemmy.world 1 point 1 month ago (2 children)

Yeah... why does it sound dumb on so many levels?

[–] Voroxpete@sh.itjust.works 4 points 1 month ago (1 child)

Because he's a salesman, and he's selling you bullshit.

What the experts are now saying is that the LLM approach to AI looks like it requires exponentially larger amounts of training data (and compute) to achieve merely linear growth in capability. Each next-generation model will cost ten times as much to train as the last, and the generation after that ten times as much again.
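A rough sketch of what that scaling claim implies. The 10x-per-generation multiplier is taken from the comment above; the $100M baseline cost and the "capability" units are made up purely for illustration:

```python
# Toy illustration: cost grows 10x per generation (per the comment above),
# while capability only grows linearly. Baseline cost and capability units
# are hypothetical numbers chosen for the example.
BASE_COST = 100e6  # assume generation 1 costs $100M to train (made up)

def training_cost(generation: int) -> float:
    """Cost of training a given generation, growing 10x each step."""
    return BASE_COST * 10 ** (generation - 1)

for gen in range(1, 5):
    # Capability is modeled as simply equal to the generation number.
    print(f"gen {gen}: cost ${training_cost(gen):>15,.0f}, capability {gen}")
```

By generation 4 the training cost in this sketch has grown 1000x while the modeled capability has only quadrupled, which is the core of the "giant con" argument the comment is making.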

The whole thing is a giant con. Kevin is just trying to keep investor confidence floating for a little longer.

And the harder the sell, the worse the product.

lol, I honestly needed to open the article just to parse the title. That's why I posted it.

But I'm definitely of the belief that you need a hell of a lot more architecture than they currently have to go meaningfully further. Humans are a lot more complicated than a big pile of neurons.