AP CSP · Unit 5.3 · Computing & Society

Is Your AI Racist?

(Maybe. Let's find out in 5 minutes.)

5 sections · 3 challenges · AP exam ready
0:00 – 1:00 · Warm-Up

Raise your hand if you've ever been recommended something online - a video, a song, an ad.


Here's the twist...

That algorithm recommending your videos? It's not neutral. It was built by humans, trained on human data, and it reflects human bias — whether anyone meant it to or not. Here are two real examples of algorithms causing real harm:

  • Amazon built a hiring AI that learned to downrank resumes containing the word "women's" (like "women's chess club") because it was trained on 10 years of mostly male hires.
  • Facial recognition used by US police departments misidentified Black faces up to 35 times more often than white faces, leading to wrongful arrests.
Hot take: A biased AI isn't evil. It's just a mirror reflecting biased humans and biased data.

So the question isn't "is the computer evil?" It's "where did the bias come from?"

1:00 – 2:00 · Core Concept

Computing Bias, Defined

Computing bias = when a system produces unfair or skewed results because of flawed data or flawed design.

Bias has three culprits. Every single time:

  • Data - who & what was included
  • Design - how the system was built
  • Humans - the decisions behind both
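To see the first culprit in action, here is a minimal sketch in Python. Everything in it is invented for illustration (the resumes, the words, the scoring rule); it is not Amazon's real system. A "model" that simply scores resume words by past hiring outcomes learns bias straight from skewed history: the code is neutral, the data is not.

```python
# Hypothetical sketch with made-up data, not a real hiring system.
# The "training" just counts which words appeared on hired vs. rejected
# resumes, so any skew in the history becomes a skew in the scores.

def train_word_scores(past_resumes):
    """+1 to each word on a hired resume, -1 to each word on a rejected one."""
    scores = {}
    for words, hired in past_resumes:
        for word in words:
            scores[word] = scores.get(word, 0) + (1 if hired else -1)
    return scores

def screen(resume_words, scores):
    """Sum the learned word scores; higher means 'more like past hires'."""
    return sum(scores.get(word, 0) for word in resume_words)

# Invented history where "women's" only ever appears on rejected resumes.
history = [
    (["chess", "python"], True),
    (["robotics", "python"], True),
    (["women's", "chess"], False),
    (["women's", "robotics"], False),
    (["women's", "debate"], False),
]
scores = train_word_scores(history)
print(screen(["women's", "python"], scores))  # -1: penalized for "women's"
print(screen(["chess", "python"], scores))    #  2: matches past hires
```

Nobody wrote `if "women's": reject` anywhere. The penalty emerged entirely from the data culprit, which is exactly the point of this section.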
2:00 – 3:00 · Team Challenge · +25 pts

Honors Class AI Scenario

A school uses an AI to decide who gets into the honors class. The AI was trained on 10 years of past data - but historically, most honors students were athletes and STEM students.

What's happening here?

Split the room, left side vs. right side, and vote: is the bias coming from the data, the design, or the humans behind both?
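For teachers who want to make the scenario concrete, here is one hedged way the honors-class AI might work, with entirely invented numbers. If the system just learns past admit rates per student group from the 10 years of history, it rubber-stamps the old skew.

```python
# Hypothetical sketch with invented numbers, not a real admissions system.
# An AI that learns "who got in before" simply replays historical patterns.

def admit_rates(history):
    """Fraction of past applicants admitted, per student group."""
    counts = {}
    for group, admitted in history:
        admits, total = counts.get(group, (0, 0))
        counts[group] = (admits + (1 if admitted else 0), total + 1)
    return {g: admits / total for g, (admits, total) in counts.items()}

def predict_admit(group, rates, threshold=0.5):
    """Admit whoever resembles past admits; unknown groups default to 0."""
    return rates.get(group, 0.0) >= threshold

# Made-up decade of data: athletes and STEM students dominated past admits.
history = ([("athlete", True)] * 8 + [("athlete", False)] * 2 +
           [("stem", True)] * 9 + [("stem", False)] * 1 +
           [("arts", True)] * 1 + [("arts", False)] * 9)

rates = admit_rates(history)
print(predict_admit("stem", rates))   # True
print(predict_admit("arts", rates))   # False: history, not merit, decided
```

Note the model never looks at grades, essays, or effort. It only asks "did students like you get in before?", which is how historical data becomes encoded bias.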

3:00 – 4:00 · Quick-Fire Round · +10 pts each

Biased or Not?

Facial recognition works worse on darker skin tones.
Biased - Skewed training data
A calculator gives everyone the same answer.
Not biased - Pure logic, no data
A weather app gives the same forecast every time.
Not biased - Deterministic model
A hiring AI keeps rejecting women's resumes.
Biased - Historical data encoded bias