Methods and Theory: Entropy Bounds


Shannon's differential entropy characterizes the uncertainty in a continuous random variable or its probability distribution. Although a number of methods exist to estimate the entropy of a distribution from a sample, no previous results establish confidence bounds on such estimates. In this work, we give the first probabilistic upper bound on the entropy of an unknown distribution, of any form, computed from a sample of that distribution.
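To make the estimation problem concrete, here is a minimal sketch of a standard spacings-based (Vasicek-style) estimate of differential entropy from a sample. This illustrates point estimation only; it is not the probabilistic upper bound developed in the work described above, and the choice of spacing parameter `m` here is a common rule of thumb, not one taken from the paper.

```python
import math
import random

def spacing_entropy_estimate(sample, m=None):
    """Vasicek m-spacing estimate of differential entropy, in nats.

    Illustrative only: a point estimate, not the paper's
    probabilistic upper bound. The default m ~ sqrt(n) is a
    common heuristic choice.
    """
    x = sorted(sample)
    n = len(x)
    if m is None:
        m = max(1, int(round(math.sqrt(n))))
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]          # clamp spacings at the boundaries
        hi = x[min(i + m, n - 1)]
        total += math.log((n / (2 * m)) * (hi - lo))
    return total / n

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
est = spacing_entropy_estimate(sample)
true_h = 0.5 * math.log(2 * math.pi * math.e)  # entropy of N(0, 1)
print(f"estimate: {est:.3f}, true: {true_h:.3f}")
```

For a large continuous sample the estimate should land close to the true entropy, but nothing in this point estimate quantifies how far off it may be; that gap is precisely what a probabilistic upper bound addresses.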



  • Joseph DeStefano


  • Erik Learned-Miller and Joseph DeStefano.
    A probabilistic upper bound on differential entropy.
    IEEE Transactions on Information Theory, vol. 54, no. 11, pp. 5223–5230, 2008.
  • Joseph DeStefano, Qifeng Lu, and Erik Learned-Miller.
    A probabilistic upper bound on differential entropy.
    UMass Amherst Technical Report 05-12, 2005.