Shannon entropy measures the average information content or uncertainty in a probability distribution.
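For a discrete random variable $X$ taking values $x$ with probabilities $p(x)$, the standard definition (in bits) is:

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$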
Maximum entropy occurs when all outcomes are equally likely (the uniform distribution); for $n$ outcomes this maximum is $\log_2 n$ bits. Minimum entropy (0) occurs when one outcome is certain.
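A minimal Python sketch illustrates both extremes; the helper name `shannon_entropy` is chosen here for illustration and is not from any particular library:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Terms with p == 0 contribute nothing (the limit of p*log p as p -> 0 is 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximum entropy, log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# One certain outcome: minimum entropy, 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
```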