Equitable or not?

Assessment

Presentation

Computers

9th - 12th Grade

Easy

Created by

Frances Tee

Used 5+ times

9 Slides • 3 Questions

1

Equitable or not?

Following up on Equity in AI

Slide image

2

Following up on Equitable Practice

In the movie Coded Bias, the documentary highlighted scenarios where algorithms were used inequitably and harmed people's privacy and livelihoods. We will review a few of those areas today.

3

Risk Assessment for Probation

A risk assessment that is used to inform a sentencing decision ought to predict whether the person being sentenced is going to commit a new crime at some point after the end of a period of supervision. And that decision should be based on whether people with similar profiles went on to commit crimes in the past. But while estimated data about such activities are available, those estimates are incomplete.

4

Risk Assessment Algorithms

Most risk assessment tools combine court and demographic records with some sort of questionnaire administered by a court official, such as a pretrial services officer in a bail context, or a prison social worker in a parole determination. Some tools, such as the Public Safety Assessment created by the Arnold Foundation (now known as Arnold Ventures), omit the questionnaire and use only the static data. The tools consider such things as criminal history, job status, level of education, and family information, giving each a numerical weight that makes the factor more or less important in the calculation of the final score.
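The weighted-scoring idea described above can be sketched in a few lines of Python. The factor names, weights, and cutoffs below are invented purely for illustration; real tools such as the Public Safety Assessment use their own factors and scales.

```python
# A minimal sketch of a weighted risk score. All factors and
# weights here are made up for illustration -- they are NOT the
# values used by any real risk assessment tool.

FACTOR_WEIGHTS = {
    "prior_convictions": 2.0,  # criminal history weighs heavily
    "unemployed": 1.5,         # job status
    "no_diploma": 1.0,         # level of education
    "age_under_25": 0.5,       # demographic record
}

def risk_score(profile):
    """Sum the weights of every factor present in the profile."""
    return sum(w for factor, w in FACTOR_WEIGHTS.items() if profile.get(factor))

def risk_level(score, low_cut=1.5, high_cut=3.0):
    """Bucket the numeric score into the label a judge might see."""
    if score < low_cut:
        return "low"
    return "medium" if score < high_cut else "high"

profile = {"prior_convictions": True, "unemployed": True}
score = risk_score(profile)      # 2.0 + 1.5 = 3.5
print(score, risk_level(score))  # 3.5 high
```

Notice that the factors and their weights are chosen by the tool's designers; changing a single weight can move the same person from "medium" to "high" risk.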

5

Poll

Do you think the use of these types of tools to determine whether someone is likely to commit another crime is equitable?

Yes

No

Depends

6

Open Ended

Explain why you chose your answer to the previous question.

7

Slide image

8

source: https://epic.org/ai/criminal-justice/index.html

Automated decision-making tools are used widely and opaquely, both directly in the criminal justice system and in ways that feed the criminal justice cycle in the U.S.


Affected people are often unable to know what tools the jurisdictions they live in use because of trade secret carveouts in Open Government laws, as well as similar roadblocks in evidentiary and discovery rules.

9

Criminalizing Algorithms

Criminalizing algorithms include algorithms used in housing, credit determinations, healthcare, hiring, schooling, and more. Many of these have been shown to make recommendations and decisions that negatively affect marginalized communities, encoding systemic racism and contributing to entry into the criminal justice system. All of the other tools discussed here are affected by the results and data points produced by these criminalizing algorithms.

10

How does algorithmic bias arise?

Algorithmic bias may arise through a lack of suitable training data, or as a result of inappropriate system design or configuration.

11

Poll

A system that helps a bank decide whether or not to grant loans would typically be trained using a large data set of the bank’s previous loan decisions (and other relevant data to which the bank has access).

The system can compare a new loan applicant’s financial history, employment history and demographic information with corresponding information from previous applicants. From this, it tries to predict whether the new applicant will be able to repay the loan.

Equitable

Not Equitable

12

Not Equitable

One way in which algorithmic bias could arise in this situation is through unconscious biases of the loan managers who made past decisions about mortgage applications.

If customers from minority groups were unfairly denied loans in the past, the AI will learn to rate these groups' general repayment ability as lower than it actually is.

Young people, people of color, single women, people with disabilities and blue-collar workers are just some examples of groups that may be disadvantaged.
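This feedback loop can be shown with a tiny example. The data below is invented for illustration: the two groups have identical finances, but group B was denied loans more often in the past, and a "model" that simply learns the historical approval rate reproduces that bias.

```python
# A toy sketch of how biased historical decisions become biased
# predictions. All data is invented: both groups have identical
# incomes, but group B was unfairly denied more often.

historical_loans = [
    # (group, income, approved) -- past loan-manager decisions
    ("A", 50_000, True), ("A", 50_000, True),
    ("A", 50_000, True), ("A", 50_000, False),
    ("B", 50_000, True), ("B", 50_000, False),
    ("B", 50_000, False), ("B", 50_000, False),
]

def approval_rate(group):
    """Fraction of past applicants in this group who were approved."""
    decisions = [ok for g, _, ok in historical_loans if g == group]
    return sum(decisions) / len(decisions)

def predict_approval(group):
    """A 'model' that simply learns the historical approval rate."""
    return approval_rate(group) >= 0.5

# Identical applicants, different predictions:
print(predict_approval("A"))  # True  (75% historical approval)
print(predict_approval("B"))  # False (25% historical approval)
```

A real lending model would use many more features, but the mechanism is the same: if the training labels encode past discrimination, the model treats that discrimination as a pattern to learn.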
