The criminal justice system has always relied on the wishes of the victims, the remorse of the defendant and the judgment of the court when it comes to determining an appropriate sentence for a crime. Or at least it always has in the past. Now technology is threatening to strip the human element from the sentencing formula and instead rely on artificial intelligence algorithms to determine the right course of action. But many defense lawyers in La Jolla warn that this is a dangerous possibility and that we must resist the urge to let AI drive our court systems.
While full reliance on algorithms is not yet here, courts across the country already look at data from these programs to determine a defendant's risk and how it should affect potential bail, sentencing and parole. The software performing these calculations is created by third-party businesses that keep their algorithms secret so they can't be stolen, but that also means the way the software makes decisions cannot be studied and evaluated for fairness by either the government or oversight groups. Both civil rights advocates and defense lawyers in Solana Beach believe this lack of oversight is worrisome.
In fact, while race is not directly a factor in the sentencing algorithms, ProPublica found that the results given by these programs can show a distinct racial bias: Black defendants are more likely to be wrongly labeled high risk, while white defendants are more likely to be labeled low risk. Unfortunately, in the most notable case involving such software, the Wisconsin Supreme Court determined that the program was transparent enough to support fair sentencing because the state used the algorithm properly, because programs like this must be continually rewritten to keep up with the times and because some of the factors going into the algorithm are publicly known.
But by leaving the factors that make up such an algorithm a secret, a judge can never fully know what the decision is based on, and a Mira Mesa defense attorney will be unable to completely defend clients against the algorithm. Without a full understanding of these programs, it's possible their outcomes are partially based on assumptions the courts have already outlawed as civil rights violations. Given the lack of oversight on these tools, defense attorneys everywhere are urging their local, state and federal legislators to bar these programs from use in court until there are sound evaluation methods to determine the legality and fairness of these algorithms.
For the time being, the San Diego County Probation Department does use a computer-generated risk assessment for felony sentencing. Fortunately, most judges also rely on the arguments of counsel in determining an appropriate sentence.
If you have any questions about these systems or want to know how they may affect you in an upcoming case, please call (760) 643-4050 or (858) 486-3024 to schedule a free initial consultation with top Carmel Valley criminal attorney Peter M. Liss.
Creative Commons Image by Wies van Erp