Shannon Entropy — Measuring Randomness
How random are the lottery results? Information entropy can tell us.
Shannon entropy is a fundamental measure of randomness from information theory. Maximum entropy means the results are completely random; low entropy indicates potential patterns in the "Супер 8" lottery results.
Analysis based on 20 draws.
What is Shannon Entropy?
Information theory and lotteries
Shannon entropy is a measure of uncertainty (information content) of a random variable. Named after Claude Shannon, the founder of information theory (1948).
Formula
H = -Σ pᵢ · log₂(pᵢ)
- H — entropy in bits
- pᵢ — probability (frequency) of the i-th number
- H_max = log₂(N) — maximum entropy with N equally probable numbers
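The formula above can be sketched directly in Python. This is a minimal illustration, not the site's actual analysis code; the sample draw data is hypothetical, and the assumed pool of 8 possible numbers (for H_max) is a guess based on the lottery's name.

```python
import math
from collections import Counter

def shannon_entropy(numbers):
    """Shannon entropy H = -Σ pᵢ·log₂(pᵢ), in bits, of a sequence of drawn numbers."""
    total = len(numbers)
    counts = Counter(numbers)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical draw history (not real "Супер 8" results)
draws = [3, 7, 7, 2, 5, 3, 7, 1, 2, 8, 4, 6]

H = shannon_entropy(draws)
H_max = math.log2(8)  # assumed: 8 equally probable numbers
print(f"H = {H:.3f} bits (maximum possible: {H_max:.3f} bits)")
```

The closer H is to H_max, the closer the observed frequencies are to a uniform distribution.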
High entropy
All numbers appear approximately equally often. Results are hard to predict.
Low entropy
Some numbers appear significantly more often than others. There are potential patterns.
Sliding entropy
Computing entropy over a sliding window of N draws shows how the lottery's randomness changes over time. Dips may indicate periods with anomalous results.
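A sliding-window entropy series can be sketched as follows. The window size of 20 and the simulated draw history are illustrative assumptions, not the site's actual parameters:

```python
import math
import random
from collections import Counter

def entropy_bits(seq):
    """Shannon entropy in bits of a sequence of symbols."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

def sliding_entropy(numbers, window=20):
    """Entropy of each consecutive window of draws (hypothetical window size)."""
    return [entropy_bits(numbers[i:i + window])
            for i in range(len(numbers) - window + 1)]

# Simulated draw history, assuming numbers 1..8 (hypothetical range)
random.seed(42)
history = [random.randint(1, 8) for _ in range(100)]

series = sliding_entropy(history, window=20)
print(f"min H = {min(series):.3f}, max H = {max(series):.3f} bits")
```

Plotting `series` against the draw index makes dips visible: a window whose entropy sits noticeably below log₂(8) = 3 bits would be worth a closer look.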