this post was submitted on 19 Nov 2024
1039 points (97.6% liked)

People Twitter

[–] UnderpantsWeevil@lemmy.world 1 points 2 days ago* (last edited 2 days ago) (2 children)

Cool, not really what I asked. Then command ‘write an implementation of bogo sort in python 3.’

… and then it does that.

Alright, but... it did the thing. That's something older search engines couldn't reliably do. The output is wonky and the conversational style is misleading. But it's not materially worse than sifting through wrong answers on StackExchange or digging through a stack of physical textbooks looking for a Python 3 bogo sort IRL.
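
For reference, "the thing" in question is tiny; a minimal Python 3 bogo sort along these lines (an illustrative sketch, not the chatbot's actual output) looks something like this:

```python
import random

def is_sorted(items):
    # True if every element is <= the one that follows it.
    return all(items[i] <= items[i + 1] for i in range(len(items) - 1))

def bogo_sort(items):
    # Shuffle the list repeatedly until it happens to come out sorted.
    while not is_sorted(items):
        random.shuffle(items)
    return items

print(bogo_sort([3, 1, 4, 1, 5]))  # eventually prints [1, 1, 3, 4, 5]
```

It just shuffles and re-checks until it gets lucky, which is exactly why bogo sort only ever appears as a toy example.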

I agree AI has annoying flaws and flubs. And it does appear we're spending vast resources doing what a marginal improvement to Google five years ago could have done better. But this is better than previous implementations of search, because it gives you discrete applicable answers rather than a collection of dubiously associated web links.

[–] Zeppo@sh.itjust.works 1 points 1 day ago (1 children)

I don’t feel like off-the-cuff summaries by AI can replace web sites and detailed articles written by knowledgeable humans. Maybe if you’re looking for a basic summary of a topic.

[–] UnderpantsWeevil@lemmy.world 1 points 1 day ago

I don’t feel like off-the-cuff summaries by AI can replace web sites and detailed articles written by knowledgeable humans

No. But that's not what a typical search result returns.

There's also no guarantee the "detailed articles" you get back are well-informed or correct. Lots of top search results are just ad copy or similar propaganda. YouTube, in particular, is rife with long-winded bullshitters.

What you're looking for is a well-edited, trustworthy encyclopedia, not a search engine.

[–] sp3tr4l@lemmy.zip 6 points 2 days ago* (last edited 2 days ago) (1 children)

But this is better than previous implementations of search, because it gives you discrete applicable answers rather than a collection of dubiously associated web links.

Except for when you ask it to determine whether a thing exists by describing its properties, and it says no such thing exists while providing a discrete response explaining in detail how there are things that have some, but not all, of those properties...

... And then when you ask it specifically about a thing you already know about that has all those properties, it tells you about how it does exist and describes it in detail.

What is the point of a 'conversational search engine' if it cannot help you find information unless you already know about said information?!

The whole, entire point of the conversational format is to trick people into thinking they are talking to an expert, an archivist with encyclopaedic knowledge, who will give them accurate answers.

Yet it gatekeeps information it does have access to, simply by omitting it.

The format of providing a bunch of likely related links to a query is much more reminiscent of doing actual research: there is no impression that you will immediately find what you want, and it is clear this is a tool to aid you in your research process.

This is only an improvement if you want to further unteach people how to do actual research and critical thinking.

[–] UnderpantsWeevil@lemmy.world 0 points 2 days ago (1 children)

Except for when you ask it to determine if a thing exists by describing its properties

Basic search can't answer that either. You're describing a task neither system is well equipped to accomplish.

[–] sp3tr4l@lemmy.zip 3 points 2 days ago* (last edited 2 days ago)

With basic search, it is extremely obvious that that feature does not exist.

With conversational search, the search itself gaslights you into believing it has this feature: it understands how to syntactically parse the question, and then confidently gives you a wrong answer.

I would much rather buy a car that cannot fly, knowing it cannot fly, than a car that literally talks to you and tells you it can fly, and sometimes manages to glide a bit, but also randomly nose dives into the ground whilst airborne.