I mean going ahead and being wrong usually does save a lot of time, but now we can outsource the making shit up based on vibes alone part.
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
Please let this be a bit of performance art.
i have unfortunate news about volunteer sci fi con chairs
Using just the author's name as input feels deliberately bad. Given how much promptfondlers generally emphasize the importance of prompting it right, it's hard to imagine them going deliberately minimalist with the prompt.
It's also such a bad description, since from their own post, the bot+LLM they were using was almost certainly feeding itself data found by a search engine.
That's like saying: no, I didn't give the amoral PI any private information, I merely gave them a name to investigate!
EDIT: Also lol at this part of the original disclaimer:
An expert in LLMs who has been working in the field since the 1990s reviewed our process.
That disclaimer feels like parody, given that LLMs have existed for under a decade and have only been popular for a few years. It's like it's mocking all the job ads that ask for 10+ years of experience in a programming language or library that has literally only existed for 7 years.
Oh, this just begs for a snarky Torment Nexus reply.