The Science of Chaos: How We Verified Quantum Randomness

When we set out to build a true quantum entropy engine, we didn't want a standard random number generator (RNG) that relies on a math formula looping in a microchip. We wanted true entropy.

We built a circuit that detects, amplifies, and measures electrons quantum tunneling through a diode depletion region, a process that is impossible under classical physics.

We ran our device through a gauntlet of statistical tests (100,000 rolls). Here is the math, the history, and the results behind the quantum entropy engine. 

1. Pearson’s Chi-Squared Test (χ²)

The "Fairness" Test

  • The History: Developed by Karl Pearson in 1900, this was one of the founding moments of modern statistics. Pearson needed a way to determine if a set of observed data (Oᵢ) deviated significantly from what was theoretically expected (Eᵢ).

  • What it checks: If we roll a D20 50,000 times, we expect each number to appear roughly 2,500 times. If the number "20" appears 3,000 times, the math will flag it as a biased die.

  • Our Result: PASS

    • Translation: The distribution of our rolls is consistent with a perfect theoretical die. There is zero evidence of bias toward any specific number.
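As a sketch of how this check works, here is a minimal chi-squared computation in plain Python. The rolls here are simulated with Python's `random` module rather than read from the hardware, so this illustrates the math, not our device:

```python
import random
from collections import Counter

def chi_squared_statistic(rolls, sides=20):
    """Pearson's chi-squared statistic: sum over faces of (O_i - E_i)^2 / E_i."""
    counts = Counter(rolls)
    expected = len(rolls) / sides  # E_i: each face equally likely
    return sum((counts.get(face, 0) - expected) ** 2 / expected
               for face in range(1, sides + 1))

rolls = [random.randint(1, 20) for _ in range(50_000)]
stat = chi_squared_statistic(rolls)
# With 19 degrees of freedom, values near 19 are typical for a fair die;
# the p = 0.05 critical value is about 30.1.
print(f"chi-squared = {stat:.2f}")
```

A fair die produces a statistic that hovers around the degrees of freedom (19 for a d20); a die biased toward any face drives the statistic far above the critical value.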

2. The Wald-Wolfowitz Runs Test

The "Clumping" Test

  • The History: Abraham Wald and Jacob Wolfowitz published this test in 1940. They wanted to check if two samples came from the same population, but it became the gold standard for detecting patterns in a sequence.

  • What it checks: Do high numbers clump together? Does the die "get stuck"? This test looks at the sequence of the rolls. If you flip a coin and get H H H H H H T T T T T T, you have 50/50 odds, but it’s not random—it’s sorted. The Runs Test detects that.

  • Our Result: PASS (Z = 0.40)

    • Translation: A Z-score of 0.40 is near-perfect. It means our High/Low sequences are indistinguishable from a random coin flip.
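A minimal sketch of the runs test, again on simulated rolls: classify each roll as High (11-20) or Low (1-10), count the runs, and compare to the expected number under randomness.

```python
import math
import random

def runs_test_z(bits):
    """Wald-Wolfowitz runs test Z-score for a binary High/Low sequence."""
    n1 = sum(bits)                     # number of Highs
    n2 = len(bits) - n1                # number of Lows
    n = n1 + n2
    # A "run" starts at the first bit and at every transition thereafter.
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    mean = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mean) / math.sqrt(var)

rolls = [random.randint(1, 20) for _ in range(50_000)]
bits = [1 if r > 10 else 0 for r in rolls]  # High = 11-20, Low = 1-10
z = runs_test_z(bits)
print(f"Z = {z:.2f}")  # |Z| < 1.96 passes at the 5% level
```

A strongly negative Z means too few runs (clumping, like H H H H T T T T); a strongly positive Z means too many (an alternating, over-corrected pattern).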

3. Shannon Entropy

The "Information" Test

  • The History: In 1948, Claude Shannon published "A Mathematical Theory of Communication," founding the Digital Age. He defined "entropy" as the amount of uncertainty in a signal.

  • What it checks: This measures the information density of the stream. A completely predictable die (always rolls 20) has 0 bits of entropy. A perfect D20 should have log₂(20) ≈ 4.3219 bits.

  • Our Result: 4.3218 bits (99.997% Efficiency)

    • Translation: Every bit generated by the quantum entropy engine is practically pure, unpredictable noise.
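Shannon's formula is short enough to compute directly. A stdlib-only sketch, with the sample again simulated rather than taken from the device:

```python
import math
import random
from collections import Counter

def shannon_entropy(rolls):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)) over observed faces."""
    counts = Counter(rolls)
    n = len(rolls)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rolls = [random.randint(1, 20) for _ in range(50_000)]
h = shannon_entropy(rolls)
ideal = math.log2(20)  # ≈ 4.3219 bits for a perfect d20
print(f"H = {h:.4f} bits ({100 * h / ideal:.3f}% of ideal)")
```

A constant stream scores 0 bits; a perfectly uniform d20 scores log₂(20). The efficiency figure in the result above is simply the measured entropy divided by that ideal.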

4. The Serial Test (Bigrams)

The "Prediction" Test

  • The Math: A goodness-of-fit test on consecutive pairs (Xₙ, Xₙ₊₁).

  • The History: While rooted in Pearson's work, the application of this test to computer random number generators was popularized by Donald Knuth in his seminal work The Art of Computer Programming (Vol 2).

  • What it checks: Does a specific number predict the next number? (e.g., "Does a 5 usually follow a 20?"). We analyze all 400 possible pairs of numbers (1-1, 1-2... 20-20) to ensure no patterns exist.

  • Our Result: PASS

    • Translation: There is no hidden pattern linking one roll to the next. The past does not predict the future.
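A sketch of the serial test in plain Python. Following Knuth's formulation, it groups the stream into non-overlapping pairs (so the counts stay independent) and runs a chi-squared test over all 400 possible pairs; the data is again simulated:

```python
import random
from collections import Counter

def serial_bigram_chi2(rolls, sides=20):
    """Chi-squared statistic over all sides^2 non-overlapping pairs."""
    pairs = Counter(zip(rolls[0::2], rolls[1::2]))  # (X1,X2), (X3,X4), ...
    expected = (len(rolls) // 2) / sides ** 2       # each pair equally likely
    return sum((pairs.get((a, b), 0) - expected) ** 2 / expected
               for a in range(1, sides + 1)
               for b in range(1, sides + 1))

rolls = [random.randint(1, 20) for _ in range(100_000)]
stat = serial_bigram_chi2(rolls)
# 399 degrees of freedom; the p = 0.05 critical value is about 447.
print(f"bigram chi-squared = {stat:.1f}")
```

If some number tended to follow another (say, a 5 after every 20), that pair's count would be inflated and the statistic would blow past the critical value.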

By measuring the quantum fluctuations of electron flow, we have captured the fundamental chaos of the universe in a box.