this post was submitted on 27 Dec 2024
365 points (95.1% liked)
Technology
You're right, of course. Though I'm always a bit unsure about exactly that. We also don't attribute intelligence to books. Take an encyclopedia, or Wikipedia: it has a lot of knowledge stored, yet it isn't intelligent. That makes me believe intelligence has something to do with being able to apply knowledge, to do something with it. And outputting text is just one very limited form of interacting with the world.
And since we're using humans as the benchmark for the "general" part of AGI... Humans have several senses and can interact with their environment in lots of ways, and most of that isn't done by drawing or communicating with words. That makes me wonder: where exactly is the boundary between an encyclopedia and an intelligent entity? Is intelligence a useful metric if we exclude the ability to do anything useful with it? And how much do we exclude by not factoring in the rest of the environment/world?
And is there a difference between being book-smart and being intelligent? Because LLMs certainly get all of their information second-hand and filtered in some way. They can't actually see the world, smell it, touch it, or manipulate something and observe the consequences. They only get a textual description of what someone did, put into words in some book or text on the internet. Is that a minor or a major limitation, and do we know for sure it doesn't matter?
(Plus, I think we need to get "hallucinations" under control. That's not strictly a question of "intelligence" either, but it cuts into actual usefulness if the intelligence isn't reliably there.)