Existence of an Unbiased Entropy Estimator for the Special Bernoulli Measure
https://doi.org/10.18255/1818-1015-2017-5-521-536
Abstract
Let \(\Omega = A^{N}\) be the space of right-sided infinite sequences drawn from the finite alphabet \(A = \{0,1\}\), \(N = \{1,2,\dots \}\), let
\[
\rho(\boldsymbol{x},\boldsymbol{y}) =
\sum_{k=1}^{\infty}|x_{k} - y_{k}|\,2^{-k}
\]
be a metric on \(\Omega\), and let \(\mu\) be a probability measure on \(\Omega\). Let \(\boldsymbol{\xi_0}, \boldsymbol{\xi_1}, \dots, \boldsymbol{\xi_n}\) be independent identically distributed points on \(\Omega\). We study the estimator \(\eta_n^{(k)}(\gamma)\) of the reciprocal \(1/h\) of the entropy, defined as
\[
\eta_n^{(k)}(\gamma) = k \left(r_{n}^{(k)}(\gamma) - r_{n}^{(k+1)}(\gamma)\right),
\]
where
\[
r_n^{(k)}(\gamma) =
\frac{1}{n+1}\sum_{j=0}^{n} \gamma\left(\min^{(k)}_{i:\, i \neq j}
\rho(\boldsymbol{\xi_{i}}, \boldsymbol{\xi_{j}})\right),
\]
and \(\min^{(k)}\{X_1,\dots,X_N\} = X_k\) if \(X_1\leq X_2\leq \dots\leq X_N\), i.e. the \(k\)-th smallest of the values. The number \(k\) and the function \(\gamma(t)\) are auxiliary parameters.
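A minimal illustrative sketch of these quantities on sample sequences truncated to a finite length is given below. It is not the author's construction: the choice \(\gamma(t) = -\log_2 t\), the truncation length, and the sample size are generic placeholders, whereas the point of the paper is the existence of a special \(\gamma\) that makes the estimator unbiased.

```python
import math
import random

# Sketch only: computes rho, r_n^{(k)}(gamma) and eta_n^{(k)}(gamma) on
# finite truncations of the sequences.  gamma(t) = -log2(t) is a generic
# placeholder, not the special gamma of the theorem.

def rho(x, y):
    """rho(x, y) = sum_{k>=1} |x_k - y_k| * 2^(-k), truncated to len(x) terms."""
    return sum(abs(a - b) * 2.0 ** -(k + 1) for k, (a, b) in enumerate(zip(x, y)))

def kth_min(values, k):
    """min^{(k)}{X_1, ..., X_N}: the k-th smallest of the values."""
    return sorted(values)[k - 1]

def r_n(points, k, gamma):
    """r_n^{(k)}(gamma): average of gamma(k-th nearest-neighbour distance)."""
    total = 0.0
    for j, xj in enumerate(points):
        # distances from xj to all other sample points (assumed distinct,
        # so that gamma is evaluated at a positive argument)
        dists = [rho(xi, xj) for i, xi in enumerate(points) if i != j]
        total += gamma(kth_min(dists, k))
    return total / len(points)           # len(points) = n + 1

def eta_n(points, k, gamma):
    """eta_n^{(k)}(gamma) = k * (r_n^{(k)}(gamma) - r_n^{(k+1)}(gamma))."""
    return k * (r_n(points, k, gamma) - r_n(points, k + 1, gamma))

if __name__ == "__main__":
    # Bernoulli measure of the theorem: p0 = p1^2, p0 + p1 = 1.
    p1 = (math.sqrt(5) - 1) / 2
    sample = [[1 if random.random() < p1 else 0 for _ in range(60)]
              for _ in range(200)]
    gamma = lambda t: -math.log2(t)       # placeholder, not the special gamma
    print(eta_n(sample, k=1, gamma=gamma))   # rough (biased) estimate of 1/h
```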
The main result of this paper is
Theorem. Let \(\mu\) be the Bernoulli measure with probabilities \(p_0,p_1>0\), \(p_0+p_1=1\), \(p_0=p_1^2\). There exists a function \(\gamma(t)\) such that
\[E\eta_n^{(k)}(\gamma) = \frac1h.\]
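For orientation, the constraint \(p_0 = p_1^2\) together with \(p_0 + p_1 = 1\) determines the measure uniquely; the numerical entropy below is stated for logarithms to base 2 and is only illustrative:
\[
p_1^2 + p_1 - 1 = 0
\;\Longrightarrow\;
p_1 = \frac{\sqrt{5}-1}{2} \approx 0.618,
\qquad
p_0 = p_1^2 = \frac{3-\sqrt{5}}{2} \approx 0.382,
\]
and, since \(\log p_0 = 2\log p_1\),
\[
h = -p_0\log_2 p_0 - p_1\log_2 p_1 = -(1+p_0)\log_2 p_1 \approx 0.96.
\]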
About the Author
Evgeniy A. Timofeev
Russian Federation
ScD, professor
References
1. Falconer K. J., Fractal Geometry: Mathematical Foundations and Applications, John Wiley & Sons, NY, USA, 1990.
2. Gradshtein I. S., Ryzhik I. M., Table of Integrals, Series, and Products, Fifth Edition, Academic Press, 1994.
3. Grassberger P., “Estimating the information content of symbol sequences and efficient codes”, IEEE Trans. Inform. Theory, 35 (1989), 669–675.
4. Hutchinson J. E., “Fractals and self-similarity”, Indiana Univ. Math. J., 30 (1981), 713–747.
5. Timofeev E. A., “Selection of a Metric for the Nearest Neighbor Entropy Estimators”, Journal of Mathematical Sciences, 203:6 (2014), 892–906.
6. Kaltchenko A., Timofeeva N., “Entropy Estimators with Almost Sure Convergence and an \(O(n^{-1})\) Variance”, Advances in Mathematics of Communications, 2:1 (2008), 1–13.
7. Kaltchenko A., Timofeeva N., “Rate of convergence of the nearest neighbor entropy estimator”, AEU – International Journal of Electronics and Communications, 64:1 (2010), 75–79.
8. Timofeeva N. E., “Construction of Entropy Estimator with Special Metric and Arbitrary Function”, Modeling and Analysis of Information Systems, 20:6 (2013), 174–178, (in Russian).
9. Timofeev E. A., “Bias of a nonparametric entropy estimator for Markov measures”, Journal of Mathematical Sciences, 176:2 (2011), 255–269.
10. Timofeev E. A., “Statistical Estimation of Measure Invariants”, St. Petersburg Math. J., 17:3 (2006), 527–551.
For citations:
Timofeev E.A. Existence of an Unbiased Entropy Estimator for the Special Bernoulli Measure. Modeling and Analysis of Information Systems. 2017;24(5):521-536. (In Russ.) https://doi.org/10.18255/1818-1015-2017-5-521-536