Variance measures how far a dataset's values spread from the mean. For a population of $n$ values $x_1, \dots, x_n$ with mean $\mu$:

$$\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \mu)^2$$
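A minimal sketch of the population formula in Python, cross-checked against the standard library's `statistics.pvariance` (the data values are made up for illustration):

```python
import statistics

def population_variance(xs):
    """sigma^2 = (1/n) * sum((x_i - mu)^2) over all n values."""
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative values, mean 5.0
print(population_variance(data))   # 4.0
print(statistics.pvariance(data))  # 4.0 (stdlib agrees)
```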
For a sample of $n$ values with sample mean $\bar{x}$, divide by $n-1$ instead of $n$ (Bessel's correction), which makes the estimator unbiased:

$$s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2$$
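One way to see why the $n-1$ divisor is called unbiased: averaged over many random samples, the $n$-divisor estimate systematically undershoots the true variance, while the $n-1$ version centers on it. A small simulation sketch (the sample size, trial count, and Normal population are arbitrary choices for illustration):

```python
import random
import statistics

random.seed(0)
true_var = 4.0  # variance of a Normal(mu=0, sd=2) population

biased, unbiased = [], []
for _ in range(100_000):
    sample = [random.gauss(0, 2) for _ in range(5)]  # small sample, n = 5
    m = sum(sample) / len(sample)
    ss = sum((x - m) ** 2 for x in sample)
    biased.append(ss / len(sample))          # divide by n
    unbiased.append(ss / (len(sample) - 1))  # divide by n - 1 (Bessel)

# Expect roughly (n-1)/n * 4.0 = 3.2 for the biased version, ~4.0 for the corrected one.
print(statistics.mean(biased))    # ~3.2, below the true 4.0
print(statistics.mean(unbiased))  # ~4.0, centered on the truth
```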
A small variance means values cluster near the mean; a large variance means they are scattered. Variance is in the squared units of the original data (kg² if the data are in kg), which is why we usually report the standard deviation $\sigma = \sqrt{\sigma^2}$, expressed in the same units as the data.
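A short illustration of the units point, assuming hypothetical weights measured in kilograms: the variance comes out in kg², while the standard deviation is back in kg.

```python
import statistics

weights_kg = [61.2, 70.5, 68.0, 74.3, 66.1]  # hypothetical weights in kg
var = statistics.variance(weights_kg)  # sample variance, units: kg^2
sd = statistics.stdev(weights_kg)      # square root of variance, units: kg
print(f"variance = {var:.2f} kg^2, std dev = {sd:.2f} kg")
```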
Variance underlies much of inferential statistics: confidence intervals, hypothesis tests, and regression all depend on estimating it. The bias-variance tradeoff in machine learning is named for it.