Shocking: AI To Replace Judges in US Courts?


Artificial intelligence already influences your life – whether it’s your Netflix viewing preferences, your suitability for a mortgage or your compatibility with a prospective employer.

But are you ready to allow AI to determine your guilt or innocence in a court of law?


Source: AP Photo/Dake Kang

AI in the court

Believe it or not, in Cleveland and a growing number of other local and state courts, such as in Arizona, Kentucky and Alaska, judges are now guided by computer algorithms.

These tools help decide whether criminal defendants can return to everyday life or must remain locked up awaiting trial.

How do AI-judges work?

Currently, courts use specialized algorithms to determine a defendant’s “risk”, which ranges from the probability that an individual will commit another crime to the likelihood that a defendant will appear for his or her court date.

These algorithmic outputs also inform decisions about bail, sentencing and parole. Each tool aims to improve on the accuracy of human decision-making, allowing for a better allocation of finite resources.

First, the algorithms scour large sets of courthouse data, searching for associations that predict how individual defendants might behave.

Defendants who receive low scores are recommended for release under court supervision.
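Commercial tools like Compas are proprietary, so their real inputs and weights are not public. As a rough illustration of how such a risk score could work, here is a minimal sketch in Python; every feature name, weight and threshold below is a made-up assumption, not the actual model.

```python
import math

# Hypothetical risk-assessment sketch. Real tools are proprietary;
# these feature names, weights and cut-offs are invented for illustration.
WEIGHTS = {
    "prior_arrests": 0.35,          # more priors -> higher score
    "age_at_first_arrest": -0.04,   # older at first arrest -> lower score
    "failed_to_appear_before": 0.9, # past no-shows -> higher score
    "employed": -0.5,               # employment -> lower score
}
BIAS = -1.2

def risk_probability(defendant: dict) -> float:
    """Combine weighted questionnaire answers into a probability (0..1)."""
    z = BIAS + sum(w * defendant.get(name, 0) for name, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))  # logistic function

def risk_band(p: float) -> str:
    """Map the probability onto the low/medium/high bands a judge would see."""
    if p < 0.3:
        return "low"
    if p < 0.6:
        return "medium"
    return "high"

defendant = {"prior_arrests": 3, "age_at_first_arrest": 19,
             "failed_to_appear_before": 1, "employed": 0}
p = risk_probability(defendant)
print(risk_band(p), round(p, 2))
```

The point of the sketch is structural: a handful of weighted inputs are squashed into a single score, and a fixed threshold turns that score into the “low risk” label that triggers a release recommendation. Everything contentious – which inputs are used and how they are weighted – is hidden inside the vendor’s choices.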

Who’s responsible?

Impressed? That’s not all, folks. Here’s the main problem with these technologies.

Typically, government agencies do not write their own algorithms; they buy them from private businesses.

This often means only the owners and developers can see how the software makes decisions.

Currently, there is no federal law that sets standards or requires the inspection of these tools, the way the FDA does with new drugs.

Sure, the engineer could explain how the neural network was designed, what inputs were entered, and what outputs were created in a specific case.

But the main problem is that nobody can explain exactly HOW the software arrives at a decision: the decision-making process itself remains opaque.


Algorithms can be racist, too

A ProPublica report on Compas found that black defendants in Broward County, Florida “were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism”.

Recent work by Joanna Bryson, professor of computer science at the University of Bath, highlights that even the most “sophisticated” AIs can inherit the racial and gender biases of those who create them.

Two sides of the same coin

Experts say the use of these risk assessments may be the biggest shift in courtroom decision-making since American judges began accepting social science and other expert evidence more than a century ago. Christopher Griffin, a research director at Harvard Law School’s Access to Justice Lab, calls the new digital tools “the next step in that revolution.”

Critics, however, worry that such algorithms might end up supplanting judges’ own judgment, and possibly even perpetuate biases in ostensibly neutral form.

“This is not something where you put in a ticket, push a button and it tells you what bail to give somebody,” said Judge Ronald Adrine, who presides over the Cleveland Municipal Court. Instead, he says, the algorithmic score is just one among several factors for judges to consider.

There’s also a risk that the algorithms will make judging more automatic and rote over time — and that, instead of eliminating bias, could perpetuate it under the mask of data-driven objectivity.

Research has shown that when people receive specific advisory guidelines, they tend to follow them in lieu of their own judgment, said Bernard Harcourt, a law and political science professor at Columbia.


Nexter’s predictions for 2019-2022:

  • “It will become ‘the norm’ within years for algorithms to examine evidence, because they are more reliable, faster and cheaper than humans,” said David Green, director of the Serious Fraud Office.
  • For now, nobody can say what criteria are included in the algorithm or how it might be extended later. Most likely, it will also analyse your social life, behaviour, likelihood of re-offending, age, race, gender identity, browsing habits and your surroundings.
  • In the future, artificial intelligence will be able to predict crimes before they are committed by analysing social media platforms like Facebook (which already detects suicidal behaviour).
  • More and more states and, later, countries will introduce AI-assisted judging in their courts.
  • The system will become fully autonomous, with human intervention only in critical cases.
