
Modeling and Analysis of Information Systems


Existence of an Unbiased Consistent Entropy Estimator for the Special Bernoulli Measure

https://doi.org/10.18255/1818-1015-2019-2-267-278

Abstract

Let \(\Omega = A^N\) be the space of right-sided infinite sequences drawn from the finite alphabet \(A = \{0,1\}\), where \(N = \{1,2,\dots\}\), let \[\rho(\boldsymbol{x},\boldsymbol{y}) = \sum_{k=1}^{\infty}|x_{k} - y_{k}|2^{-k}\] be a metric on \(\Omega\), and let \(\mu\) be a probability measure on \(\Omega\). Let \(\boldsymbol{\xi}_0, \boldsymbol{\xi}_1, \dots, \boldsymbol{\xi}_n\) be independent identically distributed points on \(\Omega\). We study estimators \(\eta_n^{(k)}(\gamma)\) of the reciprocal of the entropy, \(1/h\), defined by \[\eta_n^{(k)}(\gamma) = k \left(r_{n}^{(k)}(\gamma) - r_{n}^{(k+1)}(\gamma)\right),\] where \[r_n^{(k)}(\gamma) =\frac{1}{n+1}\sum_{j=0}^{n} \gamma\left(\min_{i:\,i \neq j}{}^{(k)} \rho(\boldsymbol{\xi}_{i}, \boldsymbol{\xi}_{j})\right),\] and \(\min^{(k)}\{X_1,\dots,X_N\}= X_k\) if \(X_1\leq X_2\leq \dots\leq X_N\), i.e. \(\min^{(k)}\) denotes the \(k\)-th smallest value. The number \(k\) and the function \(\gamma(t)\) are auxiliary parameters. The main result of this paper is the following theorem.

Theorem. Let \(\mu\) be the Bernoulli measure with probabilities \(p_0,p_1>0\), \(p_0+p_1=1\), \(p_0=p_1^2\). Then for every \(\varepsilon>0\) there exists a continuous function \(\gamma(t)\) such that \[
\left|E\eta_n^{(k)}(\gamma) - \frac1h\right| < \varepsilon,\quad D\eta_n^{(k)}(\gamma)\to 0,\quad n\to \infty. \]
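The short Python sketch below (not from the paper) illustrates how the statistic \(\eta_n^{(k)}(\gamma)\) is computed from a sample of the Bernoulli measure with \(p_0 = p_1^2\). The sample size \(n\), the truncation length of the sequences, and the choice \(\gamma(t) = -\log_2 t\) are assumptions made only for this illustration; they are not the special function \(\gamma\) whose existence the theorem asserts.

import numpy as np

# Illustrative sketch: computes eta_n^{(k)}(gamma) for a sample drawn from the
# Bernoulli measure with p0 = p1^2.  The gamma used here is a placeholder, not
# the special function constructed in the paper.

def rho(x, y):
    # Metric rho(x, y) = sum_k |x_k - y_k| * 2^{-k} on truncated 0/1 sequences.
    k = np.arange(1, len(x) + 1)
    return np.sum(np.abs(x - y) * 2.0 ** (-k))

def r_n_k(points, k, gamma):
    # r_n^{(k)}(gamma): average of gamma(k-th smallest distance to the other points).
    m = len(points)
    vals = []
    for j in range(m):
        d = sorted(rho(points[i], points[j]) for i in range(m) if i != j)
        vals.append(gamma(d[k - 1]))              # k-th order statistic (1-based)
    return float(np.mean(vals))

def eta_n_k(points, k, gamma):
    # eta_n^{(k)}(gamma) = k * (r_n^{(k)}(gamma) - r_n^{(k+1)}(gamma)).
    return k * (r_n_k(points, k, gamma) - r_n_k(points, k + 1, gamma))

rng = np.random.default_rng(0)
p1 = (np.sqrt(5.0) - 1.0) / 2.0                   # from p0 = p1^2 and p0 + p1 = 1
p0 = p1 ** 2
h = -(p0 * np.log2(p0) + p1 * np.log2(p1))        # entropy of the measure (bits)

n, length, k = 200, 60, 2                         # assumed sample size and truncation
points = (rng.random((n + 1, length)) < p1).astype(float)   # xi_0, ..., xi_n
gamma = lambda t: -np.log2(t)                     # placeholder gamma, for illustration

print("1/h       =", 1.0 / h)
print("eta_n^(k) =", eta_n_k(points, k, gamma))

This brute-force computation uses on the order of \(n^2\) distance evaluations; the paper studies the expectation and variance of the statistic as \(n \to \infty\) for a suitably constructed \(\gamma\).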

About the Author

Evgeniy A. Timofeev
P.G. Demidov Yaroslavl State University
Russian Federation

ScD, professor.

Sovetskaya str., 14, Yaroslavl, 150003






For citations:


Timofeev E.A. Existence of an Unbiased Consistent Entropy Estimator for the Special Bernoulli Measure. Modeling and Analysis of Information Systems. 2019;26(2):267-278. (In Russ.) https://doi.org/10.18255/1818-1015-2019-2-267-278



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1818-1015 (Print)
ISSN 2313-5417 (Online)