ebu

joined 8 months ago
[–] ebu@awful.systems 1 points 4 months ago (23 children)

a thought on this specifically:

Google Cloud Chief Evangelist Richard Seroter said he believes the desire to use tools like Gemini for Google Workspace is pushing organizations to do the type of data management work they might have been sluggish about in the past.

“If you don’t have your data house in order, AI is going to be less valuable than it would be if it was,” he said.

we're right back to "you're holding it wrong" again, i see

i'm definitely imagining Google re-whipping up their "Big Data" sales pitches in response to Gemini being borked or useless. "oh, see your problem is that you haven't modernized and empowered yourself by dumping all your databases into a (our) cloud native synergistic Data Sea, available for only $1.99/GB"

[–] ebu@awful.systems 1 points 5 months ago

The point is that even if the chances of [extinction by AGI] are extremely slim

the chances are zero. i don't buy into the idea that the "probability" of some made-up cataclysmic event is worth treating as any other number just because technically you can't guarantee that a unicorn won't fart AGI into existence, which in turn starts converting our bodies into office equipment

It's kind of like with the Trinity nuclear test. Scientists were almost 100% confident that it won't cause a chain reaction that sets the entire atmosphere on fire

if you had done just a little bit of googling instead of repeating something you heard off of Oppenheimer, you would know this was basically never put forward as a serious possibility (archive link)

which is actually a fitting parallel for "AGI", now that i think about it

EDIT: Alright, well this community was a mistake..

if you're going to walk in here and diarrhea AGI Great Filter sci-fi nonsense onto the floor, don't be surprised if no one decides to take you seriously

...okay it's bad form but i had to peek at your bio

Sharing my honest beliefs, welcoming constructive debates, and embracing the potential for evolving viewpoints. Independent thinker navigating through conversations without allegiance to any particular side.

seriously do all y'all like. come out of a factory or something

[–] ebu@awful.systems 12 points 5 months ago* (last edited 5 months ago)

You're implicitly accepting that eventually AI will be better than you once it gets "good enough". [...] Only no, that's not how it's likely to go.

wait hold on. hold on for just a moment, and this is important:

Only no, that's not how it's likely to go.

i regret to inform you that thinking there's even a possibility of an LLM being better than people is actively buying into the sci-fi narrative

well, except maybe generating bullshit at breakneck speeds. so as long as we aren't living in a society based on bullshit we should be goo--... oh fuck

[–] ebu@awful.systems 25 points 5 months ago

good longpost, i approve

honestly i wouldn't be surprised if some AI companies were cheating at AI metrics with little classically-programmed find-and-replace programs. if for no other reason than i think the idea of some programmer somewhere being paid to browse twitter on behalf of OpenAI and manually program exceptions for "how many months does it take 9 women to make 1 baby" is hilarious
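
(to be clear, this is a joke and pure speculation -- the kind of patch layer i'm picturing is roughly the sketch below, where every name and pattern is made up and no vendor has been shown to do anything like it:)

```python
# hypothetical hardcoded-exception layer sitting in front of a model.
# nothing here is any real company's code; it's just the joke, written out.
import re

CANNED_ANSWERS = {
    # pattern added by hand whenever a gotcha question goes viral on twitter
    r"how many months .* 9 women .* 1 baby":
        "Nine months. Pregnancy cannot be parallelized across multiple people.",
}

def answer(prompt: str, model) -> str:
    for pattern, canned in CANNED_ANSWERS.items():
        if re.search(pattern, prompt, flags=re.IGNORECASE):
            return canned          # skip the model entirely for known gotchas
    return model.generate(prompt)  # everything else goes to the actual LLM
```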

[–] ebu@awful.systems 1 points 5 months ago

long awaited and much needed. i bestow upon you both the highest honor i can award: a place in my bookmarks bar

[–] ebu@awful.systems 0 points 5 months ago

data scientists can have little an AI doomerism, as a treat

[–] ebu@awful.systems 2 points 5 months ago* (last edited 5 months ago) (3 children)

never read this one before. neat story, even if it's not much more than The Lorax with a psychedelic flavor.

unprompted personal review (spoilers)

it makes sense that the point-of-view character is insulated / isolated from the harm they're doing. my main gripe is that in doing so, the actual problems of the hypothetical psychedelic healthcare industry (manufactured addiction, orientalism and psychedelic colonization, inequality of access, in addition to all of the vile stuff the real healthcare industry already does) wind up barely stated or only implied. i was waiting for the other shoe to drop; for Learie to, say, receive a letter from the family of a patient who died on the bed unattended, a result of stretching too few staff too thin over too many patients, et cetera. something that would pop the bubble that she built around herself and tie the themes of the story together.

instead it feels like she built the bubble and stays in the bubble. she's sad her cool business idea outgrew her, that the fifty million dollars she got as a severance package doesn't fill the hole in her heart that she used to fill by helping people directly. which is neat and all, but, like. what about all the uninsured and poor Black people who never got to even try to see if psychedelics could help? what about the Native Americans who watched their spiritual medicine, which they were (and still are) punished heavily for using, get used to make Learie's millions, for which they will never see a penny? what about your overworked staff, Learie!?

from a persuasive and political perspective, to me it seems the non-sequitur ending leaves the entire story up for ideological grabs. think it sounds like capitalism is bad? sure, go for it. think the problem is that we need to do capitalism, But Better™? sure, go for it! hell, that's basically the author's own conclusion:

But what we really need are psychedelic models for business - business that defines new standards for integrity, equity and ethics; business reimagined with a technicolor glow.

sorry, but a can of glow-in-the-dark paint over the same old exploitative business practices is not a solution. it's just more marketing. where is this even going?

If you feel called to share a message with the world, consider taking the course to work with David, and gain structure, fellowship with changemakers, and accountability to breathe life into your story.

a $3,000 value course for only $999! what a steal!! order now, seats are first-come first-serve!

[–] ebu@awful.systems 1 points 6 months ago (3 children)

48th percentile is basically "average lawyer".

good thing all of law is just answering multiple-choice tests

I don't need a Supreme Court lawyer to argue my parking ticket.

because judges looooove reading AI garbage and will definitely be willing to work with someone who is just repeatedly stuffing legal-sounding keywords into google docs and mashing "generate"

And if you train the LLM with specific case law and use RAG can get much better.

"guys our keyword-stuffing techniques aren't working, we need a system to stuff EVEN MORE KEYWORDS into the keyword reassembler"

In a worst case scenario if my local lawyer can use AI to generate a letter

oh i would love to read those court documents

and just quickly go through it to make sure it didn't hallucinate

wow, negative time saved! okay so your lawyer has to read and parse several paragraphs of statistical word salad, scrap 80+% of it because it's legalese-flavored gobbledygook, and then try to write around and reformat the remaining 20% into something that's syntactically and legally coherent -- you know, the thing their profession is literally on the line for. good idea

what promptfondlers continuously seem to fail to understand is that verification is the hard step. literally anyone on the planet can write a legal letter if they don't care about its quality or the ramifications of sending it to a judge in their criminal defense trial. part of being a lawyer is being able to tell actual legal arguments from bullshit, and when you hire an attorney, that is the skill you are paying for. not how many paragraphs of bullshit they can spit out per minute

they can process more clients, offer faster service and cheaper prices. Maybe not a revolution but still a win.

"but the line is going up!! see?! sure we're constantly losing cases and/or getting them thrown out because we're spamming documents full of nonsense at the court clerk, but we're doing it so quickly!!"

[–] ebu@awful.systems 0 points 6 months ago (5 children)

[...W]hen examining only those who passed the exam (i.e. licensed or license-pending attorneys), GPT-4’s performance is estimated to drop to 48th percentile overall, and 15th percentile on essays.

officially Not The Worst™, so clearly AI is going to take over law and governments any day now

also. what the hell is going on in that other reply thread. just a parade of people incorrecting each other going "LLMs don't work like [bad analogy], they work like [even worse analogy]". did we hit too many buzzwords?
