this post was submitted on 12 Jul 2024
122 points (100.0% liked)

TechTakes

sailor_sega_saturn@awful.systems 32 points 2 months ago (last edited 2 months ago)

Sloppy LLM programming? Never!

In completely unrelated news, I've been staring at this spinner icon for the past five minutes after asking an LLM to output nothing at all:

[screenshot: a loading spinner, still going]

self@awful.systems 22 points 2 months ago

same energy as “your request could not be processed due to the following error: Success”

earthquake@lemm.ee 19 points 2 months ago

What are the chances that the front end was not programmed to handle the LLM returning an empty string?
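The failure mode being guessed at here is easy to write by accident. Here's a minimal TypeScript sketch of that guess, with every name hypothetical; nothing below is Gemini's actual code:

```typescript
// All names here are illustrative stand-ins, not Gemini's front end.
const spinner = { visible: false };
function showSpinner(): void { spinner.visible = true; }
function hideSpinner(): void { spinner.visible = false; }
function appendMessage(text: string): void { console.log("assistant:", text); }

// Stand-in for the real API call; imagine it resolves to "" here.
async function fetchCompletion(_prompt: string): Promise<string> {
  return "";
}

async function renderReply(prompt: string): Promise<void> {
  showSpinner();
  const reply = await fetchCompletion(prompt);

  // Bug: "" is falsy in JavaScript, so an empty completion skips this
  // branch and the spinner never goes away.
  if (reply) {
    appendMessage(reply);
    hideSpinner();
  }
  // Fix: any string, even "", is a finished reply:
  //   if (typeof reply === "string") { appendMessage(reply); hideSpinner(); }
}
```

A spinner that runs for five minutes is exactly what the buggy branch produces when the model hands back an empty string.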

sailor_sega_saturn@awful.systems 16 points 2 months ago

Quite likely, yeah. There's no way they don't have a timeout on the backend.
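The kind of server-side timeout being assumed would look something like this sketch; the endpoint, the names, and the 30-second budget are all made up for illustration:

```typescript
// Everything here is hypothetical; no claim about the real backend.
async function completeWithTimeout(
  prompt: string,
  timeoutMs = 30_000, // assumed budget, not a known setting
): Promise<string> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    // Imaginary model endpoint; the request aborts when the timer fires.
    const res = await fetch("https://llm.example/v1/complete", {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ prompt }),
      signal: controller.signal,
    });
    return await res.text();
  } finally {
    clearTimeout(timer);
  }
}
```

Even with this in place, the front end still has to handle the abort; an uncaught rejection leaves the same spinner up forever.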

dgerard@awful.systems 10 points 2 months ago

boooo Gemini now replies "I'm just a language model, so I can't help you with that."

froztbyte@awful.systems 9 points 2 months ago

"what would a reply with no text look like?" or similar?

dgerard@awful.systems 8 points 2 months ago

> what would a reply with no text look like?

nah it just described what an empty reply might look like in a messaging app

they seem to have done quite well at making Gemini do mundane responses

froztbyte@awful.systems 8 points 2 months ago

that's a hilarious response (from it). I can perfectly understand how it got there, which makes it even more laughable