AI in the Criminal Legal System: Can We Reduce Bias by Including Race?

Patrick K. Lin
Jun 16, 2022

This article was originally published in my LinkedIn newsletter, Tech Support.

Affirmative action as we know it today was first introduced during the Kennedy administration. In 1961, President John F. Kennedy’s Executive Order 10925 created a committee with a vague mandate intended to communicate the administration’s commitment to fairness in employment. The order instructed federal contractors to take “affirmative action” to ensure that the companies the federal government did business with did not discriminate on the basis of race. Because the committee lacked any real enforcement mechanisms at the time, “affirmative action” signaled to companies that they should not discriminate, but not much else.

This committee eventually became the Equal Employment Opportunity Commission. The EEOC was created following the Civil Rights Act of 1964 for the purpose of enforcing Title VII, which prohibits discrimination in employment on the basis of race, color, religion, sex, or national origin.

At its core, the purpose of affirmative action is to begin to offset a group’s historical disadvantages by adjusting a decision-making process to be more favorable toward that group. Generally, this adjustment is achieved by incorporating that group identity as a positive factor in the decision-making process, then rebalancing the weight of the other factors. Affirmative action is an acknowledgement of human and institutional biases. After all, the history of affirmative action is intertwined with the history of American race relations, just as the history of American race relations is intertwined with the history of America.

Can we achieve racial equality by adopting race-blind processes?

In 1981, President Ronald Reagan’s Labor Department commissioned a report on increases in hiring among Black people. It found that from 1974 to 1980, the rate of minority employment in businesses that contracted with the federal government, and were therefore subject to affirmative action requirements, rose by 20 percent. In businesses that did not contract with the government, the rate rose by 12 percent.

The finding was so contrary to what the Reagan administration had been saying about affirmative action that the Labor Department hired an external consulting firm to check its own work. When the firm came back saying the methodology and conclusions were valid, Reagan refused to release the report, allowing politicians to go on telling the public that affirmative action didn’t work.

But the fact remains: affirmative action worked. It turns out racial diversity doesn’t happen on its own when institutions aren’t required to pursue it. Who knew?

Affirmative action requires institutions to have some measure of demographic consciousness. To address historical disadvantages and biases, a system must actually classify members of society, recognize the disparate experiences of each group, and adjust accordingly.

Yet there is the notion that justice is supposed to be “blind.” Can we achieve racial equality by adopting race-blind processes? An algorithm built to predict recidivism among incarcerated populations is at the center of that debate.

Introducing… PATTERN

A risk assessment tool took center stage in the First Step Act, which Congress passed in 2018 with overwhelming bipartisan support. In addition to offering life skills classes and shortening some criminal sentences, the law rewards people incarcerated in federal prisons with early release if they participate in programs designed to reduce their risk of reoffending. People potentially eligible for early release are identified using the Prisoner Assessment Tool Targeting Estimated Risk and Needs, or PATTERN, which estimates an incarcerated person’s risk of committing a crime upon release.

Risk assessment tools are designed to predict a criminal defendant's risk for future misconduct. These predictions inform high-stakes judicial decisions, such as whether to incarcerate an individual before their trial.

Risk assessment tools are common in many states, where they are used to make decisions about parole and probation supervision, pretrial release, sentencing, and more. But PATTERN marks the first time the federal criminal legal system has used an algorithm with such high stakes.

Although the First Step Act has been lauded as a step toward criminal justice reform that provides a clear path to reducing the population of low-risk nonviolent offenders while preserving public safety, it is not without problems. According to a review of PATTERN published by the Department of Justice in December 2021, PATTERN overpredicts the risk of recidivism for people of color relative to white people.

How does PATTERN work?

The PATTERN algorithm scores individuals based on different factors that have been shown to predict recidivism, such as criminal history, level of education, disciplinary incidents while incarcerated, and whether the incarcerated individual has completed any programs aimed at reducing recidivism, among others. The algorithm predicts both general and violent recidivism. It also does not take an individual’s race into account when calculating risk scores.

If you are interested in seeing how each risk factor raises or lowers a person’s risk score and whether they qualify for early release, you can check out the interactive version of PATTERN that the Urban Institute developed.

The score PATTERN spits out categorizes an individual as high-, medium-, or low-risk. Only individuals considered low-risk are eligible for early release.
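
To make the mechanics concrete, here is a minimal sketch of how a points-based risk tool of this general kind works. The factor names, point values, and cutoffs below are invented for illustration; they are not PATTERN’s actual items or weights, which the DOJ publishes separately.

```python
# Illustrative points-based risk scoring, in the general style of tools like
# PATTERN. The factors, point values, and cutoffs are invented for this
# example and are NOT the real PATTERN items or weights.

# Hypothetical point values: each factor adds (or subtracts) points.
FACTOR_POINTS = {
    "prior_convictions": 4,              # points per prior conviction
    "infractions_while_incarcerated": 6, # points per disciplinary incident
    "no_high_school_diploma": 3,
    "completed_recidivism_program": -5,  # completed programming lowers the score
}

# Hypothetical cutoffs mapping a total score to a risk category.
CUTOFFS = [(10, "low"), (20, "medium")]  # anything above the last cutoff is "high"

def risk_category(person: dict) -> str:
    """Sum the points for each factor present, then bucket the total score.
    Note that race is not an input, mirroring how PATTERN is described above."""
    score = sum(points * person.get(factor, 0)
                for factor, points in FACTOR_POINTS.items())
    for threshold, label in CUTOFFS:
        if score <= threshold:
            return label
    return "high"

# Example: two prior convictions and one completed program -> 2*4 - 5 = 3 points.
print(risk_category({"prior_convictions": 2, "completed_recidivism_program": 1}))
# prints "low" under these made-up numbers
```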

When the DOJ compared PATTERN’s predictions with the actual outcomes of formerly incarcerated individuals, it found that the algorithm’s errors tended to disadvantage people of color (a sketch of how such a gap can be measured follows the list below). Compared to white individuals, PATTERN overpredicted general recidivism among:

  • Black men by 2 to 3 percent;
  • Black women by 6 to 7 percent;
  • Hispanic individuals by 2 to 6 percent; and
  • Asian men by 7 to 8 percent.

A tool intended to address some of the longstanding disparities in the criminal legal system could perpetuate existing racial disparities. Given that Black Americans are already incarcerated at nearly five times the rate of white Americans, these DOJ findings show that PATTERN is not yet ready for use.

Can we reduce bias by including race?

Although a term like “risk assessment tool” sounds scientific and technical, it is really just a series of policy decisions. A policy may take a race-blind approach and rely on other factors, but those factors often end up acting as proxies for race. Criminal history, for example, is not a reliable race-neutral indicator because law enforcement has a history of overpolicing many communities of color. Similarly, other factors such as education level and stable housing intersect with race and ethnicity as well.

As a result, the predictive value of a piece of information about a person will depend on other information about them. For instance, the relationship between employment status and re-offending may be more pronounced in some racial groups compared to others. An algorithm that can take these differences into account will be more accurate.
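
As an illustration of that statistical point, the sketch below fits two toy models on synthetic data: one that ignores group membership and one that adds an employment-by-group interaction term, so the strength of the employment effect can differ across groups. The data, group labels, and libraries (numpy and scikit-learn) are my own choices for the example; this is not PATTERN’s actual model.

```python
# Sketch: letting a predictor's strength vary by group via an interaction term.
# Data are synthetic and the model is a toy; this is not PATTERN's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, size=n)      # two synthetic groups, coded 0 and 1
employed = rng.integers(0, 2, size=n)   # employment status, coded 0 and 1

# Simulate an outcome where employment reduces reoffending more in group 1.
logit = -0.5 - 0.3 * employed - 0.9 * employed * group
reoffend = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Model A ignores group; Model B adds group and an employment x group interaction.
X_blind = employed.reshape(-1, 1)
X_aware = np.column_stack([employed, group, employed * group])

model_blind = LogisticRegression().fit(X_blind, reoffend)
model_aware = LogisticRegression().fit(X_aware, reoffend)

print("group-blind coefficient for employment:", model_blind.coef_[0])
print("group-aware coefficients (employment, group, interaction):", model_aware.coef_[0])
```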

However, accounting for these differences would require developers of tools like PATTERN to include each incarcerated person’s race in the algorithm, which raises legal concerns. Treating individuals differently on the basis of race in legal decision-making generally violates the Fourteenth Amendment of the U.S. Constitution, which guarantees equal protection under the law.

Yet the law allows the government to use racial categories in certain circumstances, like collecting demographic data on the census and describing criminal suspects. Deborah Hellman, a law professor at the University of Virginia, argues that designing algorithms that are sensitive to the ways race intersects with different data points may not be so different. More specifically, an algorithm can use race to determine which other factors are most affected by an individual’s race.

The inclusion of race in tools like PATTERN can also improve outcomes for incarcerated people of color without making outcomes worse for white incarcerated people. That’s because earning credits toward early release is not a zero-sum game: one person’s eligibility for early release does not affect anyone else’s. This makes changing the algorithm to include race very different from programs like affirmative action, particularly in the employment and education contexts. When institutions make hiring and admissions decisions, spots are limited, so hiring or admitting one individual means fewer spots for everyone else.

Still, data- and statistics-driven approaches by definition rely on statistical generalizations rather than individual circumstances. This failure to treat incarcerated people as individuals raises issues of due process and fairness, making it difficult to wholeheartedly support algorithms like PATTERN, which still have a long way to go and may never fully belong in the criminal legal system.

Nonetheless, the problems PATTERN raises are relevant and significant. As our institutions continue to automate more and more high-stakes decisions, from sentencing and housing to health care, mortgage approvals, and even child welfare, we need to recognize that taking race out of the equation does not necessarily result in racial equality. In fact, acknowledging the ways institutions have treated individuals differently on the basis of race is essential to achieving results that are more fair and more just.

Patrick K. Lin

Patrick K. Lin is a New York City-based author focused on researching technology law and policy, artificial intelligence, surveillance, and data privacy.