<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.3 20210610//EN" "JATS-journalpublishing1-3.dtd">
<article article-type="research-article" dtd-version="1.3" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xml:lang="ru"><front><journal-meta><journal-id journal-id-type="publisher-id">mais</journal-id><journal-title-group><journal-title xml:lang="ru">Моделирование и анализ информационных систем</journal-title><trans-title-group xml:lang="en"><trans-title>Modeling and Analysis of Information Systems</trans-title></trans-title-group></journal-title-group><issn pub-type="ppub">1818-1015</issn><issn pub-type="epub">2313-5417</issn><publisher><publisher-name>Yaroslavl State University</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="doi">10.18255/1818-1015-2017-5-521-536</article-id><article-id custom-type="elpub" pub-id-type="custom">mais-578</article-id><article-categories><subj-group subj-group-type="heading"><subject>Research Article</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="ru"><subject>Оригинальные статьи</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="en"><subject>Articles</subject></subj-group></article-categories><title-group><article-title>Существование несмещенной оценки энтропии для специальной меры Бернулли</article-title><trans-title-group xml:lang="en"><trans-title>Existence of an Unbiased Entropy Estimator for the Special Bernoulli Measure</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author" corresp="yes"><contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-3094-4390</contrib-id><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Тимофеев</surname><given-names>Евгений Александрович</given-names></name><name name-style="western" xml:lang="en"><surname>Timofeev</surname><given-names>Evgeniy A.</given-names></name></name-alternatives><bio xml:lang="ru"><p>доктор физ.-мат. 
наук, профессор</p></bio><bio xml:lang="en"><p>ScD, professor</p></bio><email xlink:type="simple">timofeevEA@gmail.com</email><xref ref-type="aff" rid="aff-1"/></contrib></contrib-group><aff-alternatives id="aff-1"><aff xml:lang="ru">Ярославский государственный университет им. П.Г. Демидова<country>Россия</country></aff><aff xml:lang="en">P.G. Demidov Yaroslavl State University<country>Russian Federation</country></aff></aff-alternatives><pub-date pub-type="collection"><year>2017</year></pub-date><pub-date pub-type="epub"><day>24</day><month>10</month><year>2017</year></pub-date><volume>24</volume><issue>5</issue><fpage>521</fpage><lpage>536</lpage><permissions><copyright-statement>Copyright &#x00A9; Тимофеев Е.А., 2017</copyright-statement><copyright-year>2017</copyright-year><copyright-holder xml:lang="ru">Тимофеев Е.А.</copyright-holder><copyright-holder xml:lang="en">Timofeev E.A.</copyright-holder><license license-type="creative-commons-attribution" xlink:href="https://creativecommons.org/licenses/by/4.0/" xlink:type="simple"><license-p>This work is licensed under a Creative Commons Attribution 4.0 License.</license-p></license></permissions><self-uri xlink:href="https://www.mais-journal.ru/jour/article/view/578">https://www.mais-journal.ru/jour/article/view/578</self-uri><abstract><p>Пусть \(\Omega = A^{N}\) - пространство правосторонних бесконечных последовательностей символов из алфавита \(A = \{0,1\}\), \(N = \{1,2,\dots \}\), \[\label{rho} \rho(\boldsymbol{x},\boldsymbol{y}) = \sum_{k=1}^{\infty}|x_{k} - y_{k}|2^{-k} \] - метрика на \(\Omega\) и \(\mu\) - вероятностная мера на \(\Omega\). Пусть \(\boldsymbol{\xi_0}, \boldsymbol{\xi_1}, \dots, \boldsymbol{\xi_n}\) - независимые случайные точки на \(\Omega\), распределенные по мере \(\mu\). 
Будем изучать оценку \(\eta_n^{(k)}(\gamma)\) величины \(1/h\), обратной к энтропии, которая определяется следующим образом:</p><p>\[ \label{etan} \eta_n^{(k)}(\gamma) = k \left(r_{n}^{(k)}(\gamma) - r_{n}^{(k+1)}(\gamma)\right),\] где \[\label{def_r} r_n^{(k)}(\gamma) = \frac{1}{n+1}\sum_{j=0}^{n} \gamma\left(\min_{i:i \neq j} {}^{(k)} \rho(\boldsymbol{\xi_{i}}, \boldsymbol{\xi_{j}})\right), \] \(\min^{(k)}\{X_1,\dots,X_N\} = X_k\), если \(X_1\leq X_2\leq \dots\leq X_N\). Число \(k\) и функция \(\gamma(t)\) - вспомогательные параметры. Основной результат работы: Теорема. Пусть \(\mu\) - мера Бернулли с вероятностями \(p_0,p_1&gt;0\), \(p_0+p_1=1\), \(p_0=p_1^2\), тогда существует функция \(\gamma(t)\) такая, что \[E\eta_n^{(k)}(\gamma) = \frac1h.\]</p></abstract><trans-abstract xml:lang="en"><p>Let \(\Omega = A^{N}\) be the space of right-sided infinite sequences drawn from the finite alphabet \(A = \{0,1\}\), \(N = \{1,2,\dots \}\), let \[\label{rho} \rho(\boldsymbol{x},\boldsymbol{y}) = \sum_{k=1}^{\infty}|x_{k} - y_{k}|2^{-k} \] be a metric on \(\Omega\), and let \(\mu\) be a probability measure on \(\Omega\). Let \(\boldsymbol{\xi_0}, \boldsymbol{\xi_1}, \dots, \boldsymbol{\xi_n}\) be independent random points on \(\Omega\) distributed according to the measure \(\mu\). We study the estimator \(\eta_n^{(k)}(\gamma)\) of the reciprocal of the entropy, \(1/h\), which is defined as</p><p>\[ \label{etan} \eta_n^{(k)}(\gamma) = k \left(r_{n}^{(k)}(\gamma) - r_{n}^{(k+1)}(\gamma)\right),\] where \[\label{def_r} r_n^{(k)}(\gamma) = \frac{1}{n+1}\sum_{j=0}^{n} \gamma\left(\min_{i:i \neq j} {}^{(k)} \rho(\boldsymbol{\xi_{i}}, \boldsymbol{\xi_{j}})\right), \] \(\min^{(k)}\{X_1,\dots,X_N\} = X_k\) if \(X_1\leq X_2\leq \dots\leq X_N\). The number \(k\) and the function \(\gamma(t)\) are auxiliary parameters.</p><p>The main result of this paper is the following.</p><sec><title>Theorem</title><p>Let \(\mu\) be the Bernoulli measure with probabilities \(p_0,p_1&gt;0\), \(p_0+p_1=1\), \(p_0=p_1^2\). 
There exists a function \(\gamma(t)\) such that \[E\eta_n^{(k)}(\gamma) = \frac1h.\]</p></sec></trans-abstract><kwd-group xml:lang="ru"><kwd>мера</kwd><kwd>метрика</kwd><kwd>энтропия</kwd><kwd>оценка</kwd><kwd>несмещенность</kwd><kwd>самоподобие</kwd><kwd>мера Бернулли</kwd></kwd-group><kwd-group xml:lang="en"><kwd>measure</kwd><kwd>metric</kwd><kwd>entropy</kwd><kwd>estimator</kwd><kwd>unbiasedness</kwd><kwd>self-similarity</kwd><kwd>Bernoulli measure</kwd></kwd-group></article-meta></front><back><ref-list><title>References</title><ref id="cit1"><label>1</label><citation-alternatives><mixed-citation xml:lang="ru">Falconer K. J., Fractal Geometry: Mathematical Foundations and Applications, John Wiley &amp; Sons, NY, USA, 1990.</mixed-citation><mixed-citation xml:lang="en">Falconer K. J., Fractal Geometry: Mathematical Foundations and Applications, John Wiley &amp; Sons, NY, USA, 1990.</mixed-citation></citation-alternatives></ref><ref id="cit2"><label>2</label><citation-alternatives><mixed-citation xml:lang="ru">Gradshtein I. S., Ryzhik I. M., Table of Integrals, Series, and Products, Fifth Edition, Academic Press, 1994.</mixed-citation><mixed-citation xml:lang="en">Gradshtein I. S., Ryzhik I. M., Table of Integrals, Series, and Products, Fifth Edition, Academic Press, 1994.</mixed-citation></citation-alternatives></ref><ref id="cit3"><label>3</label><citation-alternatives><mixed-citation xml:lang="ru">Grassberger P., “Estimating the information content of symbol sequences and efficient codes”, IEEE Trans. Inform. Theory, 35 (1989), 669–675.</mixed-citation><mixed-citation xml:lang="en">Grassberger P., “Estimating the information content of symbol sequences and efficient codes”, IEEE Trans. Inform. Theory, 35 (1989), 669–675.</mixed-citation></citation-alternatives></ref><ref id="cit4"><label>4</label><citation-alternatives><mixed-citation xml:lang="ru">Hutchinson J. E., “Fractals and self-similarity”, Indiana Univ. Math. 
J., 30 (1981), 713–747.</mixed-citation><mixed-citation xml:lang="en">Hutchinson J. E., “Fractals and self-similarity”, Indiana Univ. Math. J., 30 (1981), 713–747.</mixed-citation></citation-alternatives></ref><ref id="cit5"><label>5</label><citation-alternatives><mixed-citation xml:lang="ru">Timofeev E. A., “Selection of a Metric for the Nearest Neighbor Entropy Estimators”, Journal of Mathematical Sciences, 203:6 (2014), 892–906.</mixed-citation><mixed-citation xml:lang="en">Timofeev E. A., “Selection of a Metric for the Nearest Neighbor Entropy Estimators”, Journal of Mathematical Sciences, 203:6 (2014), 892–906.</mixed-citation></citation-alternatives></ref><ref id="cit6"><label>6</label><citation-alternatives><mixed-citation xml:lang="ru">Kaltchenko A., Timofeeva N., “Entropy Estimators with Almost Sure Convergence and an O(n^{-1}) Variance”, Advances in Mathematics of Communications, 2:1 (2008), 1–13.</mixed-citation><mixed-citation xml:lang="en">Kaltchenko A., Timofeeva N., “Entropy Estimators with Almost Sure Convergence and an O(n^{-1}) Variance”, Advances in Mathematics of Communications, 2:1 (2008), 1–13.</mixed-citation></citation-alternatives></ref><ref id="cit7"><label>7</label><citation-alternatives><mixed-citation xml:lang="ru">Kaltchenko A., Timofeeva N., “Rate of convergence of the nearest neighbor entropy estimator”, AEU – International Journal of Electronics and Communications, 64:1 (2010), 75–79.</mixed-citation><mixed-citation xml:lang="en">Kaltchenko A., Timofeeva N., “Rate of convergence of the nearest neighbor entropy estimator”, AEU – International Journal of Electronics and Communications, 64:1 (2010), 75–79.</mixed-citation></citation-alternatives></ref><ref id="cit8"><label>8</label><citation-alternatives><mixed-citation xml:lang="ru">Тимофеева Н. Е., “Построение оценки энтропии для специальной метрики и произвольной функции”, Модел. и анализ информ. систем, 20:6 (2013), 174–178; [Timofeeva N. 
E., “Construction of Entropy Estimator with Special Metric and Arbitrary Function”, Modeling and Analysis of Information Systems, 20:6 (2013), 174–178, (in Russian)].</mixed-citation><mixed-citation xml:lang="en">Timofeeva N. E., “Construction of Entropy Estimator with Special Metric and Arbitrary Function”, Modeling and Analysis of Information Systems, 20:6 (2013), 174–178, (in Russian).</mixed-citation></citation-alternatives></ref><ref id="cit9"><label>9</label><citation-alternatives><mixed-citation xml:lang="ru">Timofeev E. A., “Bias of a nonparametric entropy estimator for Markov measures”, Journal of Mathematical Sciences, 176:2 (2011), 255–269.</mixed-citation><mixed-citation xml:lang="en">Timofeev E. A., “Bias of a nonparametric entropy estimator for Markov measures”, Journal of Mathematical Sciences, 176:2 (2011), 255–269.</mixed-citation></citation-alternatives></ref><ref id="cit10"><label>10</label><citation-alternatives><mixed-citation xml:lang="ru">Timofeev E. A., “Statistical estimation of measure invariants”, St. Petersburg Math. J., 17:3 (2006), 527–551.</mixed-citation><mixed-citation xml:lang="en">Timofeev E. A., “Statistical estimation of measure invariants”, St. Petersburg Math. J., 17:3 (2006), 527–551.</mixed-citation></citation-alternatives></ref></ref-list><fn-group><fn fn-type="conflict"><p>The author declares that there is no conflict of interest.</p></fn></fn-group></back></article>
