Last week, DoNotPay CEO Joshua Browder announced that the company's AI chatbot would represent a defendant in a U.S. court, which would have marked the first use of artificial intelligence for that purpose. Now the experiment has been called off, with Browder saying he received threats of prosecution from multiple state bar associations.
"Bad news: after receiving threats from State Bar prosecutors, it seems likely they will put me in jail for 6 months if I follow through with bringing a robot lawyer into a physical courtroom," Browder tweeted on Thursday. "DoNotPay is postponing our court case and sticking to consumer rights."
The plan had been to use DoNotPay's AI in a speeding case scheduled to be heard on Feb. 22. The chatbot would run on a smartphone, listening to what was being said in court before providing instructions to the anonymous defendant via an earpiece.
However, numerous state bar prosecutors did not respond well to DoNotPay's proposed stunt, writing to Browder to warn him that it could break the law. Specifically, Browder could be charged with the unauthorized practice of law, a crime punishable by up to six months in jail in some states.
In light of this, Browder opted to pull the plug on the whole experiment rather than risk jail time.
"Even if it wouldn't happen, the threat of criminal charges was enough to give it up," Browder told NPR.
It's probably for the best. DoNotPay's legal chatbot was developed using OpenAI’s ChatGPT which, while no doubt sophisticated for an AI chatbot, still has significant flaws. Relying on it for anything of importance isn't the best idea at this stage.
This brush with the wrong side of the law also appears to have prompted DoNotPay to reassess its products. Previously the company offered computer-generated legal documents for a wide variety of issues, covering everything from child support payments to annulling a marriage. Now Browder has announced that DoNotPay will only handle consumer rights cases going forward, removing all other services "effective immediately."
"Unlike courtroom drama, [consumer rights] cases can be handled online, are simple and are underserved," Browder tweeted. "I have realized that non-consumer rights legal products (e.g defamation demand letters, divorce agreements and others), which have very little usage, are a distraction."
The CEO also stated that employees are currently working 18-hour days to improve DoNotPay's user experience, which doesn't seem like something to boast about.
Though DoNotPay's experiment would have applied AI to a new area, it wouldn't have been the first use of artificial intelligence in a U.S. courtroom. States such as New York and California have previously used the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) tool to assess whether someone is likely to reoffend, taking that assessment into account when determining bail.
Unfortunately, even this AI software is flawed. A 2016 investigation by ProPublica found that COMPAS was more likely to falsely flag Black defendants as higher risk, while also falsely rating white defendants as lower risk.
Artificial intelligence may seem like an exciting technology with many useful purposes. But some things are still best left to actual humans.
from Mashable https://ift.tt/Kuphkag