HN Today

What's the Entropy of a Random Integer?

The author dives into the seemingly simple question of a random integer's entropy, quickly connecting prime factorization to the cycle structure of random permutations. This mathematical exploration highlights deep links between number theory and probability, offering a raw look at a mathematician's thought process. It exemplifies the kind of intricate problem-solving that appeals to Hacker News's technically inclined audience.

Score: 9
Comments: 0
Highest Rank: #15
Time on Front Page: 4h
First Seen: Feb 9, 7:00 PM
Last Seen: Feb 9, 10:00 PM
Rank Over Time: 15 → 16 → 19 → 20

The Lowdown

The author embarks on an exploration of a seemingly straightforward yet profoundly complex mathematical query: what is the entropy of a random integer? The investigation swiftly moves from basic number theory into advanced territory, drawing connections between prime factorizations, the cycle structure of random permutations, and the Poisson-Dirichlet process.

  • The central object is a probability distribution built from the prime factorization of a random integer n drawn from [N, 2N]: writing n = ∏ p_i^(a_i), each prime p_i is assigned the weight a_i log p_i / log n. Since log n = Σ a_i log p_i, these weights sum to 1, so their Shannon entropy is well defined.
  • To simplify, the author suggests restricting the problem to squarefree integers, which allows for an analogy to the cycle lengths of a random permutation on N letters.
  • This analogy relies on the Poisson-Dirichlet process, which is known to govern the distribution of large prime factors in random integers and cycle lengths in random permutations.
  • The mean entropy is then estimated by summing over cycle lengths: if X_i is the number of i-cycles in the permutation, each i-cycle contributes (i/N)(log N − log i), so the expected entropy is the sum of E[X_i] · (i/N)(log N − log i).
  • Using the classical fact that E[X_i] = 1/i for i ≤ N, together with Stirling's approximation for log N!, the estimated mean entropy surprisingly converges to exactly 1 as N → ∞.
  • The author concludes by posing further open questions, such as whether this result holds for general integers, if the entropy itself converges to a distribution, and what the properties of 'perplexity' (the exponential of entropy) might be.
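The calculation sketched in the bullets above can be written out in full; the notation follows the summary (X_i is the number of i-cycles in a uniform random permutation on N letters, and each i-cycle carries weight i/N):

```latex
\mathbb{E}[H_N]
  = \sum_{i=1}^{N} \mathbb{E}[X_i]\,\frac{i}{N}\bigl(\log N - \log i\bigr)
  = \sum_{i=1}^{N} \frac{1}{i}\cdot\frac{i}{N}\bigl(\log N - \log i\bigr)
  = \log N - \frac{\log N!}{N}.
```

By Stirling, log N! = N log N − N + O(log N), so E[H_N] = 1 + O(log N / N) → 1. The log N terms cancel at leading order, which is why the answer is a clean constant rather than something growing with N.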
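The permutation side of the analogy is easy to check numerically. The sketch below (not from the original post; function names are mine) samples uniform random permutations, computes the entropy of their cycle-length weights, and compares the empirical mean against the exact finite-N value log N − (1/N) log N!:

```python
import math
import random

def cycle_entropy(N, rng):
    """Entropy of the cycle-length weights of a uniform random
    permutation of N letters: a cycle of length l has weight l/N."""
    perm = list(range(N))
    rng.shuffle(perm)           # uniform random permutation j -> perm[j]
    seen = [False] * N
    H = 0.0
    for start in range(N):
        if seen[start]:
            continue
        # trace the cycle containing `start` and measure its length
        length, j = 0, start
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        w = length / N
        H -= w * math.log(w)    # equals (l/N)(log N - log l)
    return H

rng = random.Random(0)
N, trials = 1000, 300
mean_H = sum(cycle_entropy(N, rng) for _ in range(trials)) / trials
exact = math.log(N) - math.lgamma(N + 1) / N   # log N - (1/N) log N!
print(mean_H, exact)
```

For N = 1000 the exact finite-N mean is already about 0.996, illustrating how quickly the estimate approaches the limit of 1.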
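The same entropy can also be computed directly on integers, which probes the author's open question about general (non-squarefree) n. This is an illustrative sketch under my own assumptions, not the author's code; trial-division factorization keeps it self-contained but limits how large N can be:

```python
import math
import random

def factorize(n):
    """Trial-division factorization; returns {prime: exponent}."""
    f = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            f[d] = f.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        f[n] = f.get(n, 0) + 1
    return f

def integer_entropy(n):
    """Shannon entropy of the weights a_i * log(p_i) / log(n)
    for n = prod p_i^(a_i); the weights sum to 1."""
    logn = math.log(n)
    H = 0.0
    for p, a in factorize(n).items():
        w = a * math.log(p) / logn
        H -= w * math.log(w)
    return H

rng = random.Random(1)
N, trials = 10**6, 200
samples = [integer_entropy(rng.randrange(N, 2 * N)) for _ in range(trials)]
mean_H_int = sum(samples) / trials
print(mean_H_int)
```

Note that the relevant scale in the analogy is log n, so even n around 10^6 corresponds to a small permutation and sits well short of the asymptotic regime; the empirical mean should be expected to fall somewhat below 1.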

Ultimately, the article serves as an authentic record of a mathematician's casual investigative process, showing how a simple initial question can expose intricate connections and lead to harder, unsolved problems spanning several mathematical domains.