
Mutual Information

[Interactive demo readout: I(X;Y) = 1.00, H(X,Y) = 3.00, H(X|Y) = 1.00, H(Y|X) = 1.00]

About Mutual Information

Mutual information I(X;Y) measures how much knowing one variable reduces uncertainty about the other.

I(X;Y) = H(X) + H(Y) - H(X,Y)
I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

When X and Y are independent, I(X;Y) = 0. When one variable fully determines the other, mutual information reaches its maximum, I(X;Y) = min(H(X), H(Y)).
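The identities above can be checked numerically. Below is a minimal Python sketch that computes I(X;Y) from a joint distribution. The specific distribution is an assumption chosen for illustration (X uniform on four values, Y equal to X with its low bit flipped half the time), picked so the resulting quantities match the values shown in the demo.

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y): X uniform on {0,1,2,3},
# Y = X with its low bit flipped half the time -> 8 equally likely pairs.
joint = {(x, x ^ b): 1 / 8 for x in range(4) for b in (0, 1)}

# Marginals p(x) and p(y), obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

H_X = entropy(px.values())
H_Y = entropy(py.values())
H_XY = entropy(joint.values())

I = H_X + H_Y - H_XY        # I(X;Y) = H(X) + H(Y) - H(X,Y)
H_X_given_Y = H_XY - H_Y    # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X    # chain rule: H(Y|X) = H(X,Y) - H(X)

print(I, H_XY, H_X_given_Y, H_Y_given_X)  # 1.0 3.0 1.0 1.0
```

Knowing Y here pins down X to two equally likely values, so one of X's two bits of uncertainty is removed: I(X;Y) = 1 bit.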