Information Theory Simulations
Explore the mathematical foundations of data, communication, and compression through interactive visualizations of Shannon entropy, coding theory, and information measures.
01 Shannon Entropy
Calculate information entropy for different probability distributions. Adjust probabilities and see how uncertainty changes.
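The quantity this simulation computes can be sketched in a few lines; the function name below is illustrative, not the simulator's actual code:

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p * log2(p)) in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A fair coin gives 1 bit of uncertainty, a certain outcome gives 0, and a uniform choice among four outcomes gives 2 bits.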
02 Huffman Coding
Build optimal prefix-free code trees. Watch the algorithm construct variable-length codes based on symbol frequencies.
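The tree construction the visualization animates can be sketched with a min-heap; this is a minimal illustration (the insertion counter only breaks ties so heap comparisons stay well-defined), not the simulator's implementation:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build an optimal prefix-free code from symbol frequencies."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: a single distinct symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tiebreak, tree); a tree is a symbol or a (left, right) pair.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # repeatedly merge the two rarest subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, i, (t1, t2)))
        i += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

Frequent symbols end up near the root and get short codewords; no codeword is a prefix of another, so the stream decodes unambiguously.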
03 LZ77 Compression
Visualize sliding window compression with dictionary matching. See how repeated patterns reduce data size.
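The sliding-window matching can be sketched as follows; real LZ77 variants differ in window size, match encoding, and search strategy, so treat this as a toy version for intuition only:

```python
def lz77_compress(data, window=16):
    """Emit (offset, length, next_char) triples using a sliding search window."""
    i, out = 0, []
    while i < len(data):
        best_off, best_len = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            # Stop one short of the end so a literal next_char always remains.
            while i + l < len(data) - 1 and data[j + l] == data[i + l]:
                l += 1
            if l > best_len:
                best_off, best_len = i - j, l
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

def lz77_decompress(triples):
    out = []
    for off, length, ch in triples:
        for _ in range(length):
            out.append(out[-off])  # copy from the window; handles overlapping matches
        out.append(ch)
    return "".join(out)
```

Repeated substrings collapse into short (offset, length) back-references, which is where the size reduction comes from.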
04 Hamming Codes
Detect and correct bit errors using parity checks. Flip bits and watch the decoder recover the original message.
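The bit-flip recovery can be demonstrated with the classic Hamming(7,4) code; this sketch uses the standard parity positions 1, 2, and 4, though the simulation's exact layout may differ:

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits; parity bits sit at positions 1, 2, and 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Recompute parities; a nonzero syndrome is the 1-based position of the flipped bit."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # correct the single-bit error
    return [c[2], c[4], c[5], c[6]]
```

Flipping any single bit of a codeword leaves the syndrome pointing straight at it, so the decoder recovers the original four data bits.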
05 Channel Capacity
Explore Shannon's theorem for noisy channels. Adjust noise levels and see the maximum reliable transmission rate.
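For the binary symmetric channel, the capacity curve the simulation plots has a closed form, C = 1 - H(p), where p is the crossover (bit-flip) probability; a minimal sketch:

```python
import math

def bsc_capacity(p):
    """Capacity (bits per use) of a binary symmetric channel: C = 1 - H(p)."""
    if p in (0.0, 1.0):  # noiseless, or deterministically inverted
        return 1.0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h
```

At p = 0.5 the output is independent of the input and capacity drops to zero; as noise decreases, the maximum reliable rate climbs back toward 1 bit per channel use.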
06 Mutual Information
Explore Venn-diagram views of entropy relationships. Visualize how information is shared between random variables.
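The overlap in the Venn diagram corresponds to I(X;Y) = H(X) + H(Y) - H(X,Y), which can be computed directly from a joint distribution; a minimal sketch:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy
```

Independent variables share no information (the circles barely overlap); perfectly correlated bits overlap completely, sharing a full bit.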
07 Kolmogorov Complexity
Compare compressibility of patterns vs random strings. Explore the shortest description length concept.
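Kolmogorov complexity itself is uncomputable, but compressed size is a practical stand-in for "shortest description length"; this sketch uses `zlib` as that proxy (one assumption: the simulation may use a different compressor):

```python
import random
import zlib

def compressed_size(s):
    """Proxy for description length: the zlib-compressed byte count of the string."""
    return len(zlib.compress(s.encode()))

random.seed(0)  # fixed seed so the comparison is reproducible
patterned = "abcdefgh" * 250                                   # highly regular, 2000 chars
scrambled = "".join(random.choice("abcdefgh") for _ in range(2000))  # same length, no pattern
```

The repeating string has a tiny description ("repeat this 8-char block 250 times") and compresses far below the pseudorandom string of the same length and alphabet.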
08 Source Coding
Compare fixed-length vs variable-length coding efficiency. See when compression saves space.
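The comparison can be made concrete by pitting a fixed-length code against the entropy lower bound from Shannon's source coding theorem; a minimal sketch (helper names are illustrative):

```python
import math
from collections import Counter

def fixed_length_bits(text):
    """Total bits with a fixed-length code: ceil(log2(alphabet size)) per symbol."""
    k = len(set(text))
    return len(text) * max(1, math.ceil(math.log2(k)))

def entropy_lower_bound(text):
    """Source coding theorem: no lossless code averages fewer than H(X) bits/symbol."""
    n = len(text)
    return n * -sum((c / n) * math.log2(c / n) for c in Counter(text).values())
```

For a skewed source like `"aaaaaabc"`, the fixed-length code spends 2 bits on every symbol (16 bits total) while the entropy bound is about 8.5 bits; the gap is the space a variable-length code can reclaim. For a uniform source the two coincide and compression buys nothing.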
09 Cryptographic Entropy
Analyze random number quality. Compare pseudo-random and true random sources for security applications.
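One simple quality measure is the empirical entropy of the byte histogram, which maxes out at 8 bits per byte for a uniform source; a sketch of that estimator (real randomness tests, such as the NIST suites, go far beyond a histogram):

```python
import math
import os
from collections import Counter

def empirical_entropy(data):
    """Estimated bits per byte from the byte-frequency histogram (maximum 8.0)."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())
```

A constant stream scores 0 bits per byte, while `os.urandom` output sits close to the 8-bit ceiling. Note that a high score is necessary but not sufficient for cryptographic quality: a simple counter also has a flat histogram yet is trivially predictable.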
10 Information Bottleneck
Compress data while preserving relevant information. Balance compression ratio with prediction accuracy.
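The trade-off can be illustrated with a hard (deterministic) bottleneck: cluster the values of X into a smaller variable T and compare how much I(T;Y) survives the compression. This is a toy sketch of the idea, not the full iterative information bottleneck algorithm:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mi(joint):
    """I(X;Y) from a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return entropy(px) + entropy(py) - entropy([p for row in joint for p in row])

def cluster_joint(joint, assign, k):
    """Hard bottleneck: map each x to cluster assign[x]; p(t, y) = sum of p(x, y) over x in t."""
    out = [[0.0] * len(joint[0]) for _ in range(k)]
    for x, row in enumerate(joint):
        for y, p in enumerate(row):
            out[assign[x]][y] += p
    return out
```

In the example below, x1 and x2 have identical conditionals p(y|x), so merging them shrinks the representation (H(T) < H(X), hence I(T;X) < I(X;Y)-side cost drops) while I(T;Y), the predictive information about Y, is preserved exactly; merging dissimilar values would sacrifice it.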