this post was submitted on 20 Oct 2024
66 points (95.8% liked)

Futurology

[–] SinAdjetivos@beehaw.org 3 points 1 month ago* (last edited 1 month ago)

I think you're conflating formal and informal logic. Programmers are excellent at defining a formal logic system which the computer follows, but the computer itself isn't particularly "logical".

What you describe as:

> Action A is legal. Action B isn't. Doing X + Y + Z constitutes action A and so on.

Is a particularly nasty form of reasoning called abstract reasoning. Biological brains are very good at that! Computers, a lot less so...
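To make the distinction concrete, here's a minimal sketch (names and rules hypothetical): once a human has already formalized "X + Y + Z constitutes action A," checking it is trivial for a computer. The hard abstraction step happened in the programmer's head, not in the machine.

```python
# The formal-logic part is easy: hand-written rules, mechanically checked.
LEGAL_ACTIONS = {"A"}

# Hand-written abstraction: which combinations of facts count as which action.
ACTION_DEFINITIONS = {
    frozenset({"X", "Y", "Z"}): "A",
    frozenset({"X", "W"}): "B",
}

def classify(facts):
    """Map a set of observed facts to a named action, if any definition matches."""
    return ACTION_DEFINITIONS.get(frozenset(facts))

def is_legal(facts):
    action = classify(facts)
    return action in LEGAL_ACTIONS if action else None  # None = rules don't cover it

print(is_legal({"X", "Y", "Z"}))  # True  -- matches action A
print(is_legal({"X", "W"}))       # False -- matches action B
print(is_legal({"X", "Q"}))       # None  -- no rule; a human must abstract a new one
```

The `None` case is the whole problem: deciding what an uncovered combination of facts *constitutes* is the abstract-reasoning step computers are bad at.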

[Using a test designed to measure that](https://arxiv.org/abs/1911.01547), humans average ~80% accuracy. The current best algorithm (last I checked...) has 31% accuracy. [LLMs can get up to ~17% accuracy](https://arxiv.org/pdf/2403.11793) (with the addition of some prompt engineering and other fancy tricks). So they are technically capable... just really bad at it...
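For anyone unfamiliar with that benchmark (ARC): each task gives a few input/output grid pairs and asks the solver to infer the transformation and apply it to a new grid. A toy example in its spirit (the grids and the rule here are made up, and the "solver" is a hand-picked rule, not a learner — writing a program that *discovers* the rule is exactly what's hard):

```python
# Demonstration pairs: infer the transformation from these.
train_pairs = [
    ([[0, 1], [1, 0]], [[1, 0], [0, 1]]),  # every cell flipped
    ([[1, 1], [0, 0]], [[0, 0], [1, 1]]),
]

def flip(grid):
    """Candidate rule: invert every cell (0 <-> 1)."""
    return [[1 - cell for cell in row] for row in grid]

# Verify the candidate rule against the demonstrations...
assert all(flip(inp) == out for inp, out in train_pairs)

# ...then apply it to the held-out test input, as a human solver would.
print(flip([[0, 0], [0, 1]]))  # [[1, 1], [1, 0]]
```

Humans spot "flip every cell" instantly; getting an algorithm to propose and verify that abstraction from two examples is what the ~31% vs ~80% gap measures.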

Now, law is marketed as a very logical profession, but modern (at least Western) law is more akin to combative theater. The law as written serves as the base worldbuilding, with case law serving as additional canon. The goal of law is to put on a performance that convinces the audience (typically the judge, jury, and opposing counsel) that it is far more logical and internally consistent than it actually is.

That is essentially what LLMs are designed to do: take some giant corpus of text and return some permutation of it that maximizes "believability" given the input prompt. And they can do so with a shocking amount of internal logic and creativity. So it shouldn't be shocking that they're capable of passing bar exams, but that should not be conflated with them being rational, logical, fair, just, or accurate.
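A crude sketch of that "maximize believability" loop (a toy bigram model over a made-up corpus, nothing like a real LLM's learned neural distribution, but the same shape of objective): produce text that looks like the training corpus, not text that is *true*.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for "some giant corpus of text".
corpus = "the court finds the defendant guilty the court adjourns".split()

# Count which word tends to follow which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length=4):
    """Greedily emit the likeliest continuation at each step."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))  # fluent-looking legal-ish text, optimized only for plausibility
```

Nothing in that loop checks whether the output is rational, fair, or accurate; it only checks whether it resembles the corpus. Scale that idea up enormously and you get something that can pass a bar exam for exactly the same reason.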

And neither should the law. Friendly reminder to fuck the police and the corrupt legal system they enforce.