AI Legal Cases Trashed in Courts
AI parking fine appeals under fire as London adjudicators uncover fake case law. Pexels

If you are using AI like ChatGPT as your legal advisor, you might want to be careful. As AI becomes ever more integrated into everyday life, motorists are turning to the technology for help with an age-old nuisance: parking fines.

Drivers are using these generative AI tools to draft appeals against penalty charge notices. The appeal letters these systems produce sound polished and persuasive, and many motorists have welcomed them as a way of levelling the playing field against councils and private parking operators.

But a new annual report from the chief adjudicator of London's traffic appeals service has cast this trend in a troubling light, revealing that some AI-generated defences have been thrown out over fabricated case law and so-called 'phantom evidence'. The findings expose both the promise and the pitfalls of applying AI to legal challenges that affect millions of drivers each year.

Using AI as a Lawyer

Many of us have used AI to decipher, and even draft replies to, complicated legalese. AI has made significant inroads into legal and administrative tasks that once required specialist training. Tools such as DoNotPay offer automated assistance to people contesting everything from parking fines to small claims.

According to its founder, the platform has helped contest hundreds of thousands of parking tickets in cities including London and New York, although independent verification of those figures is limited. And DoNotPay is just one of many such apps; plenty of people turn to ChatGPT for the same purpose.

However, the chief adjudicator's latest report documents worrying examples of drivers submitting AI-generated appeal documents that cite non-existent legal cases and make arguments that cannot be substantiated. These 'hallucinations' are a recognised shortcoming of generative AI models: they can produce text that reads like accurate information but is in fact invented or misrepresented, especially when the system has not been verified against up-to-date legal databases.


How AI Appeals Landed On the Adjudicator's Desk

According to the report, drivers are increasingly using generative AI tools to draft appeal submissions, often after reading online claims that such systems can help beat parking fines quickly and cheaply. With nearly 9.5 million penalty charge notices issued across London in the most recent reporting year, a rise of more than 13%, it is little surprise that many motorists are seeking a cheaper alternative to a lawyer.

The number of appeals has also risen sharply. Almost 43,200 appeals were determined during the year, an increase of nearly 24%. Of those, around 45% went in the motorist's favour, and more than half of those wins occurred because councils did not contest the appeal at all, usually due to staff shortages or administrative backlogs.

Naturally, this environment has made AI tools attractive. They can generate confident, legal-sounding arguments in seconds, often quoting previous tribunal decisions or legal principles. However, the chief adjudicator, Anthony Chan, said some submissions had gone far beyond enthusiastic self-representation and crossed into outright fiction. He said,

'In one case, this chap threatened to sue us in the High Court. He quoted this case to say I'm wrong,' adding, 'I thought: "This can't be right." I asked him to give me a copy of the case. I never heard from him again.'

In another striking case, a driver relied on seven supposedly relevant legal precedents. Five of them did not exist; the remaining two were real but were wrongly described and did not support the argument being made. When asked to provide copies of the cases, the appellant was unable to do so. The adjudicator described the material as 'phantom evidence', adding that it appeared to be the product of AI rather than deliberate deception. So if you do use AI for legal matters, make sure you fact-check it thoroughly.