statistics

Bayes' Theorem

Bayes' theorem reverses conditional probabilities: P(A|B) = P(B|A)P(A)/P(B). It is the foundation of Bayesian inference, medical testing, and ML.

Bayes' theorem relates conditional probabilities, letting you reverse the direction of conditioning:

$$P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}$$

Given the prior $P(A)$ (your belief before evidence) and the likelihood $P(B \mid A)$, compute the posterior $P(A \mid B)$: your updated belief after seeing $B$.
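For a binary hypothesis, $P(B)$ expands by the law of total probability, $P(B) = P(B \mid A)P(A) + P(B \mid \lnot A)P(\lnot A)$, which makes the update a one-liner. A minimal sketch (function and argument names are illustrative):

```python
def posterior(prior, likelihood, likelihood_if_not):
    """Bayes' rule for a binary hypothesis A given evidence B.

    prior            = P(A)
    likelihood       = P(B | A)
    likelihood_if_not = P(B | not A)
    Expands P(B) via total probability: P(B|A)P(A) + P(B|~A)P(~A).
    """
    p_b = likelihood * prior + likelihood_if_not * (1 - prior)
    return likelihood * prior / p_b

# Example: prior 0.5, evidence four times likelier under A than under not-A
print(posterior(0.5, 0.8, 0.2))  # close to 0.8
```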

Classic medical-test example: disease prevalence 1%, test sensitivity 99%, false-positive rate 1%. The probability of disease given a positive test:

$$\frac{0.99 \cdot 0.01}{0.99 \cdot 0.01 + 0.01 \cdot 0.99} = \frac{1}{2}$$

Despite a 99% accurate test, a positive result means only 50% chance of disease — because the disease is rare. The "base rate fallacy" (forgetting the prior) is the most common Bayes mistake.

Bayes powers Bayesian inference, naïve Bayes classifiers, spam filters, and forensic reasoning.
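A naïve Bayes spam filter is just this update applied per word, under the (naïve) assumption that words are conditionally independent given the class. A toy sketch with made-up illustrative word probabilities, computed in log space to avoid underflow:

```python
import math

# Made-up illustrative numbers: prior spam rate and per-word likelihoods.
P_SPAM = 0.3
P_WORD_GIVEN_SPAM = {"free": 0.30, "meeting": 0.02}
P_WORD_GIVEN_HAM = {"free": 0.02, "meeting": 0.20}

def spam_posterior(words):
    """P(spam | words), assuming conditional independence of words."""
    log_spam = math.log(P_SPAM)
    log_ham = math.log(1 - P_SPAM)
    for w in words:
        log_spam += math.log(P_WORD_GIVEN_SPAM[w])
        log_ham += math.log(P_WORD_GIVEN_HAM[w])
    # Normalize the two unnormalized log-posteriors (Bayes' theorem).
    return 1 / (1 + math.exp(log_ham - log_spam))

print(spam_posterior(["free"]))     # well above 0.5
print(spam_posterior(["meeting"]))  # well below 0.5
```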