"Opus Dei Cum Pecunia Alienum Efficemus"



Age, housing, and mental health are the most widely used variables in tools around the U.S. Age is particularly emphasized. For example, youth appears as three different factors in the pretrial assessment tool New Jersey is using: “age at current arrest,” “current violent offense,” and “20 years old or younger.” The pretrial interview in New Jersey also considers past adult and juvenile offenses, if any, from a person’s criminal history. A person 22 or younger accused of new violent criminal activity receives two points for their age and an additional three points if they have a juvenile record. This individual’s raw score of 5 places them in a high-risk bracket, resulting in non-release during pretrial. A juvenile flagged for New Violent Criminal Activity (NVCA) will automatically be placed in a higher risk category than a defendant aged 22 or older accused of a violent offense.
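The additive point arithmetic described above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration, not the actual PSA: the real tool uses more factors, different weights, and a jurisdiction-specific decision framework; the threshold of 5 here is taken from the example in the text.

```python
def toy_pretrial_score(age: int, has_juvenile_record: bool) -> int:
    """Toy illustration of additive point scoring (NOT the actual PSA)."""
    score = 0
    if age <= 22:            # youth factor from the example above
        score += 2
    if has_juvenile_record:  # juvenile record factor from the example above
        score += 3
    return score

# A 22-year-old with a juvenile record scores 2 + 3 = 5,
# landing in the hypothetical "high risk" bracket (score >= 5).
print(toy_pretrial_score(22, True))   # 5
print(toy_pretrial_score(30, False))  # 0
```

The point of the sketch is that the score is mechanical: two facts about a person, neither of which involves any new conduct, are enough to cross the high-risk threshold.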

Our research found at least one factor associated with age in use in 50 different counties. These factors are phrased as “age at current arrest,” “current violent offense,” and “20 years old or younger.” Examples include the Public Safety Assessment (PSA), which assigns a higher risk score to people 20 years of age or younger; the Colorado Pretrial Assessment Tool (CPAT), which assigns a higher risk score to people 24 or younger; and the Ohio Risk Assessment System Pretrial Assessment Tool (ORAS-PAT), which assigns higher risk scores to people 33 or younger. Risk assessment tools are conflating age with perceived heightened risk. Beyond age itself, young people are also less likely to have advanced degrees, hold steady jobs, or own a home, personal data that would place someone in a higher risk category and that are used in many courtrooms to decide pretrial release. One mistake in a person’s youth will follow them indefinitely when this error is embedded into an algorithm and their score. A young person, when examined under these parameters, is inherently considered higher risk even without ever encountering a police officer.


Why should a 21-year-old be considered less risky than someone who is 20 years and 11 months old? Why is one zip code considered riskier than a different zip code one block over? Approximately 30% of the tools in our study use “housing” as a variable. Tools refer to these variables as “Owning or Renting One's Residence,” “Contributing to Residential Payments,” and other similar terminology. Residential factors used in pretrial risk assessments include zip code, residence, and home ownership. Certain neighborhoods have been assigned risk levels since the 1930s: the Home Owners’ Loan Corporation (HOLC) drew maps assigning risk levels to different neighborhoods in cities across the country between 1935 and 1940. Using home ownership as a factor echoes policies designed decades earlier to restrict internal migration and home ownership in the U.S. Including housing factors codifies decades of unjust redlining practices that were in place long before most of us were born.

Clear Definitions

There is a lack of individual consent and public oversight over tools’ parameters, including the definition of recidivism, the level of accuracy, and how final scores are shared with judges and others.


Interviewees around the country described differing pretrial risk assessment standards. These standards vary from state to state, or from county to county within the same state. In some cases, developers work in conjunction with government officials to develop an automated decision-making tool. Local governments may also purchase one from a corporation or receive a tool at no cost directly from a foundation.

Pretrial risk assessment tools claim to estimate the statistical chance of a person committing a new crime or missing a court date by measuring “recidivism.” Several jurisdictions we interviewed define recidivism as “any failure to appear,” “new criminal filing,” or “arrest for another crime,” among others. Pretrial risk assessment tools are not measuring recidivism; these tools are using pre-existing data to forecast potential arrest. Tools may label a person “high risk” based on an ill-defined outcome to predict an indefinite time in someone’s future. It is, therefore, crucial to understand who controls the tools and what their institutional interests are.

Jurisdictions interviewed also described recidivism timeframes of six months, three years, or a lifetime. There is no national consensus on how to define recidivism or its timeframe. Jurisdictions have their own rules and practices for defining recidivism and for how judges or magistrates use the results in the courtroom during bail proceedings.

Algorithms have not replaced humans and human bias in courtrooms or in the policies and practices of law enforcement. Ultimately, individuals are judged both by a biased machine and by a legal system steering the machine. Kentucky was one of the first states to require the use of a risk assessment during bail decisions. A slight increase in pretrial release occurred in the beginning, but decarceration subsided over time as judges returned to their previous habits. Cook County in Illinois implemented General Order 18.8A, a statute that ensures no one is incarcerated because they are unable to pay a monetary bond. A courtwatching report found judges have consistently failed to follow this mandate effectively.

Needs assessment or Risk assessment?

People with mental health conditions, substance use disorders, and juvenile records are not receiving services, but are incarcerated at high rates.

Questions regarding housing stability imply that individuals without homes are less likely to appear in court or are a risk to public safety. Studies have shown that safe and stable housing could potentially reduce recidivism for people post-incarceration. Individuals should not be penalized for their housing status; a tool like the Ohio Risk Assessment System-Pretrial Assessment Tool (ORAS-PAT) should measure someone’s needs during pretrial, not inform carceral decisions.

Mental health is among the four most common socioeconomic variables across all tools, along with housing situation, employment, and age. Tools phrase these variables as “Past or Current Mental Health Treatment,” “residential stability,” and “Unemployed at time of arrest.” Automated decision-making tools are using health and socioeconomic factors to sort individuals into high-risk categories even though what should be provided is treatment or hospitalization, not incarceration.


Independent audits and evaluations of tools do not occur, and reports are not publicly available.


Pretrial services agencies, police departments, public safety records, and demographic statistics on jail and prison populations are often not presented on governmental agency websites. Our researchers gathered information primarily from public reports, academic research, and periodicals; a large bulk of it was buried in tables, long documents, and technical terminology. In many cases, the data simply wasn’t available: where was the data on race, gender, and class disparities in pretrial incarceration? Where was the reliable data on the impact of pretrial risk assessment tools on bail hearings? Information about risk assessment tools is extremely difficult to access. This information either did not exist or was not available, undermining public scrutiny and oversight.

PRA tools rely on pre-existing police, court, and prison records. These records are often unavailable or inaccessible to the public. The public is unable to review these records for missing, omitted, or inaccurate information, or to measure community impact, e.g., the demographics of pretrial jail populations, electronic monitoring net-widening, or money bail assignments, among others.

A majority of jurisdictions have not conducted an independent audit or made an evaluation report publicly available. Evaluating or auditing a tool would require, at minimum, the source code and access to input data. Specialized technical skills in statistics or computer science are essential for such an audit; understanding the direct impact of these algorithms on our daily lives, however, does not require advanced statistical analysis or computer science knowledge. Several tool developers have published reports on tools prior to implementation or after a pilot stage. Few jurisdictions have followed up with subsequent evaluations.

Racial Bias

A majority of the tools we studied and found from secondary research do not explicitly use race as a variable.

However, information about a person’s socioeconomic background comprises factors that may embody racial and class bias, regardless of the intentions of those who created them. A majority of interviewees in our study replied that their pretrial service agencies do not collect statistics on the racial makeup of their jail populations or make data on pretrial jail populations available for public consumption. Some interviewees described future plans to mitigate racial biases within their jail populations. Given missing, omitted, or inaccurate data, it is difficult to know for certain whether algorithms are increasing or decreasing pretrial jail populations or changing their demographics.


Professor Richard Berk at the University of Pennsylvania is a co-author of an academic report stating, “...an overall conclusion will be that you can’t have it all. Rhetoric to the contrary, challenging tradeoffs are required between different kinds of fairness and between fairness and accuracy.”
Richard Berk’s study and others have shown that the fairness of the data used within an algorithm and the fairness of the outcomes it generates are impossible to define through a mathematical lens alone. Mathematicians can lay out the range of possible, mutually exclusive definitions of fairness, but mathematics cannot decide which version of fairness should be implemented without guidance from government or society. There is currently no single definition of fairness for the system as a whole that policy can set and mathematics can simultaneously achieve.
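One of these tradeoffs can be made concrete with a small numerical sketch. If a risk score has the same precision (positive predictive value) and the same recall for two groups, but the groups have different base rates of the measured outcome, the false positive rates for the two groups must differ. The function below derives the implied false positive rate from fixed recall and precision; the specific numbers (70% recall, 70% precision, base rates of 50% and 20%) are illustrative, not drawn from any actual tool.

```python
def false_positive_rate(base_rate: float, recall: float = 0.7,
                        precision: float = 0.7) -> float:
    """False positive rate implied by fixed recall and precision.

    Derivation: TP = recall * P, and FP = TP * (1 - precision) / precision,
    so FPR = FP / N = recall * (1 - precision) / precision * (P / N),
    where P / N is the odds form of the base rate.
    """
    odds = base_rate / (1 - base_rate)
    return recall * (1 - precision) / precision * odds

fpr_high = false_positive_rate(0.5)  # group with a 50% base rate
fpr_low = false_positive_rate(0.2)   # group with a 20% base rate
print(round(fpr_high, 3), round(fpr_low, 3))  # 0.3 0.075
# With identical recall and precision for both groups, the group with
# the higher base rate necessarily faces a higher false positive rate.
```

Equalizing the false positive rates instead would force the precision or recall to diverge between groups, which is the "you can't have it all" tradeoff the quoted report describes.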