Black Box

Module: ethics

What it is

A black box system is one whose inputs and outputs you can observe but whose internal process you cannot. Most AI models are black boxes: you can't fully explain why they produced a specific output, because the interaction of billions of parameters makes a complete, step-by-step account of any single decision infeasible.
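The input-visible, internals-opaque relationship can be sketched in code. The `OpaqueModel` class below is a hypothetical stand-in, not a real model: we can call it and record input/output pairs, but no single parameter among its thousands of randomly initialized weights corresponds to a human-readable reason.

```python
import random

class OpaqueModel:
    """Hypothetical classifier: inputs and outputs are observable,
    but the internal decision process is not interpretable."""

    def __init__(self, n_params=10_000, seed=0):
        rng = random.Random(seed)
        # Thousands of parameters; no individual weight explains a decision.
        self._weights = [rng.uniform(-1, 1) for _ in range(n_params)]

    def predict(self, features):
        # Observable behavior: features in, label out.
        score = sum(w * x for w, x in zip(self._weights, features))
        return "approve" if score > 0 else "deny"

model = OpaqueModel()
applicant = [random.Random(1).uniform(-1, 1) for _ in range(10_000)]
decision = model.predict(applicant)
# We can log the decision, but tracing it back through thousands of
# weights to a justification a human could audit is infeasible.
print(decision)
```

Real models add nonlinear layers on top of this, which makes the attribution problem harder still, not easier.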

Why it matters

Black box AI creates accountability challenges. When an AI system makes a consequential decision, explaining why it decided that way may not be possible. This matters most in regulated domains such as healthcare, finance, and criminal justice, where decisions must often be explained and justified.