joering2 6 days ago

The AI tools should be helping more than hurting. But take my example: I am in a 3-year-long litigation with my soon-to-be ex-wife. She recently fired her attorneys and for 2 weeks used ChatGPT to write very well-worded, very strong, and very logically appealing motions, practically destroying my attorney on multiple occasions; he had to work overtime, costing me an extra $80,000 in litigation costs. And finally, once we got in front of the judge, the ex could not string two logical sentences together. The paper can defend itself on its face, but it also turned out that not a single citation she used had anything to do with the case at hand, which ChatGPT is known for in legal circles. She admitted using the tool and only got a verbal reprimand. The judge said the majority of that "work" was legal and that she could not stop her from exercising her First Amendment rights; even if it was written by AI, she still had to form the questions, edit the responses, etc. And I wasn't able to recover a single dime, since on their face her motions did make sense, although the judge denied the majority of her ridiculous pleadings.

It's really frightening! It's like handing over the smartest brain possible to someone who is dumb, but also giving them a very simple GUI that they can actually operate, and they can ask good enough questions/prompts to get smart answers. Once the public at large figures this out, I can only imagine courts being flooded with all kinds of absurd pleadings. Being a judge in the near future will most likely be the least wanted job.

Vegenoid 2 days ago

Up next: the judges use LLMs to evaluate arguments.