Methods and Theory: Entropy Bounds
Overview
Differential entropy, the continuous analogue of Shannon entropy, characterizes the uncertainty in a continuous random variable or, equivalently, in its probability distribution: H(p) = -∫ p(x) log p(x) dx. While a number of methods exist to estimate the entropy of a distribution from a sample, no previous results establish confidence bounds on those estimates. In this work, we give the first probabilistic upper bound on the entropy of an unknown distribution, of any form, computed from a sample of that distribution.
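As context for the estimation problem (this is not the probabilistic bound developed in the paper), the sketch below shows a standard spacings-based entropy estimator, Vasicek's m-spacing estimator, applied to a sample. The function name and the heuristic window choice m ≈ √n are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def vasicek_entropy(sample, m=None):
    """Vasicek's m-spacing estimate of differential entropy, in nats.

    A standard point estimator shown only to illustrate entropy
    estimation from a sample; it is NOT the probabilistic upper
    bound of Learned-Miller and DeStefano.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))  # common heuristic window size (assumption)
    # Clamp order-statistic indices at the boundaries:
    # X_(j) = X_(1) for j < 1 and X_(j) = X_(n) for j > n.
    lo = np.clip(np.arange(n) - m, 0, n - 1)
    hi = np.clip(np.arange(n) + m, 0, n - 1)
    spacings = x[hi] - x[lo]
    # Average of log( n * (X_(i+m) - X_(i-m)) / (2m) ) over the sample.
    return np.mean(np.log(n * spacings / (2 * m)))

# Example: a standard normal has entropy 0.5 * ln(2 * pi * e) ≈ 1.4189 nats.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.standard_normal(10_000)))
```

Such point estimates are known to be biased for finite samples, which is precisely why a bound that holds with a specified probability, rather than a point estimate alone, is useful.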
Faculty
- Erik Learned-Miller
Collaborators
- Joseph DeStefano
Publications
- Erik Learned-Miller and Joseph DeStefano. A probabilistic upper bound on differential entropy. IEEE Transactions on Information Theory, Volume 54, Number 11, pp. 5223-5230, 2008. [pdf]
- Joseph DeStefano, Qifeng Lu, and Erik Learned-Miller. A probabilistic upper bound on differential entropy. UMass Amherst Technical Report 05-12, 2005. [pdf]