[–] ILikeBoobies@lemmy.ca 8 points 4 days ago (2 children)

I gave it a math problem to illustrate this and it got it wrong

If it can’t do that, imagine adding nuance.

[–] kopasz7@sh.itjust.works 10 points 4 days ago (1 children)

Well, math is not really a language problem, so it's understandable LLMs struggle with it more.

[–] ILikeBoobies@lemmy.ca 10 points 4 days ago (1 children)

But it means it’s not “thinking” the way the public perceives AI.

[–] kopasz7@sh.itjust.works 5 points 4 days ago (2 children)

Hmm, yeah, AI never really did think. I can't argue with that.

It's really strange, if I mentally zoom out a bit, that we now have machines that are better at language-based reasoning than logic-based reasoning (like math or coding).

[–] Hoimo@ani.social 1 points 3 days ago

Not really true, though. Computers are still better at math. They're even pretty good at coding, if you count compiling high-level code into assembly as coding.

But in this case we built a language machine to respond to language with more language. Of course it's not going to do great at other stuff.

[–] GrammarPolice@lemmy.world -1 points 4 days ago* (last edited 4 days ago)

YMMV, I guess. I've given it many difficult calculus problems to work through with me, and it went well.