Motivated by recent work of Joe (1989, Ann. Inst. Statist. Math., 41, 683-697), we introduce estimators of entropy and describe their properties. We study the effects of tail behaviour, distribution smoothness and dimensionality on convergence properties. In particular, we argue that root-n consistency of entropy estimation requires appropriate assumptions about each of these three features. Our estimators differ from Joe's, and may be computed without numerical integration, but it can be shown that the same interaction of tail behaviour, smoothness and dimensionality also determines the convergence rate of Joe's estimator. We study both histogram and kernel estimators of entropy, and in each case suggest empirical methods for choosing the smoothing parameter.
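To make the setting concrete, the following is a minimal sketch, not the authors' exact estimator, of the general kernel plug-in approach the abstract refers to: estimate the density f by a leave-one-out Gaussian kernel density estimate f-hat and average -log f-hat over the sample, which requires no numerical integration. The bandwidth rule used here (Silverman's normal-reference rule) is an illustrative assumption, not the empirical choice proposed in the paper.

```python
import numpy as np

def kernel_entropy(x, bandwidth):
    """Leave-one-out kernel plug-in entropy estimate for 1-D data:
    H-hat = -(1/n) * sum_i log f-hat_{-i}(X_i)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Pairwise scaled differences and Gaussian kernel weights.
    diffs = (x[:, None] - x[None, :]) / bandwidth
    k = np.exp(-0.5 * diffs**2) / (bandwidth * np.sqrt(2 * np.pi))
    # Leave-one-out: exclude each point from its own density estimate.
    np.fill_diagonal(k, 0.0)
    fhat = k.sum(axis=1) / (n - 1)
    return -np.mean(np.log(fhat))

rng = np.random.default_rng(0)
sample = rng.normal(size=2000)
# Silverman's normal-reference bandwidth (an illustrative choice).
h = 1.06 * sample.std() * len(sample) ** (-0.2)
print(kernel_entropy(sample, h))
```

For a standard normal sample the estimate should lie near the true entropy 0.5*log(2*pi*e), about 1.42; the light tails and smoothness of the normal are exactly the kind of conditions under which the abstract says root-n behaviour can be expected.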