[1] Marsiglietti A, Melbourne J. On the entropy power inequality for the Rényi entropy of order $[0, 1]$. IEEE Trans Inf Theory, 2019, 65(3): 1387-1396
[2] Courtade T A, Han G, Wu Y. Counterexample to the vector generalization of Costa's entropy power inequality, and partial resolution. IEEE Trans Inf Theory, 2018, 64(7): 5453-5454
[3] Li T, Wu X. Quantum query complexity of entropy estimation. IEEE Trans Inf Theory, 2019, 65(5): 2899-2921
[4] Jaye B, Livshyts G V, Paouris G, Pivovarov P. Remarks on the Rényi entropy of a sum of IID random variables. IEEE Trans Inf Theory, 2020, 66(5): 2898-2903
[5] Song Z, Zhang J. A note for estimation about average differential entropy of continuous bounded space-time random field. Chinese Journal of Electronics, 2022, 31(5): 793-803
[6] Zhang J. Jeffreys divergence and generalized Fisher information measures on Fokker-Planck time random field. Entropy, 2023, 25(10): 1445
[7] Györfi L, van der Meulen E C. Density-free convergence properties of various estimators of entropy. Comput Stat Data Anal, 1987, 5(4): 425-436
[8] Györfi L, van der Meulen E C. On the nonparametric estimation of the entropy functional//Roussas G, ed. Nonparametric Functional Estimation and Related Topics. Dordrecht: Springer, 1991: 81-95
[9] Forte B, Hughes W. The maximum entropy principle: a tool to define new entropies. Rep Math Phys, 1988, 26(2): 227-235
[10] Lee S, Vonta I, Karagrigoriou A. A maximum entropy type test of fit. Comput Stat Data Anal, 2011, 55(9): 2635-2643
[11] Kolmogorov A N. The local structure of turbulence in incompressible viscous fluid for very large Reynolds numbers. Dokl Akad Nauk SSSR, 1941, 30: 299-303
[12] Kolmogorov A N. On the degeneration of isotropic turbulence in an incompressible viscous fluid. Dokl Akad Nauk SSSR, 1941, 31: 538-542
[13] Kolmogorov A N. Dissipation of energy in isotropic turbulence. Dokl Akad Nauk SSSR, 1941, 32: 19-21
[14] Balakrishnan A V. A note on the sampling principle for continuous signals. IEEE Trans Inf Theory, 1957, 3(2): 143-146
[15] Ye Z, Berger T. Information Measures for Discrete Random Fields. Beijing/New York: Science Press, 1998
[16] Ye Z. On Entropy and $\varepsilon$-entropy of Random Fields [D]. New York: Cornell University, 1989
[17] Zhang Z, Yeung R W. On characterization of entropy functions via information inequalities. IEEE Trans Inf Theory, 1998, 44(4): 1440-1452
[18] Yeung R W, Chen C, Chen Q, Moulin P. On information-theoretic characterizations of Markov random fields and subfields. IEEE Trans Inf Theory, 2019, 65(3): 1493-1511
[19] Zhang F X, Qian M. Entropy production rate of the minimal diffusion process. Acta Math Sci, 2007, 27B(1): 145-152
[20] Xiong S F, Li G Y. Dispersion comparisons of two probability vectors under multinomial sampling. Acta Math Sci, 2010, 30B(3): 907-918
[21] Sun J Q. Rate of convergence and expansion of Rényi entropic central limit theorem. Acta Math Sci, 2015, 35B(1): 79-88
[22] Sun P. On the entropy of flows with reparameterized gluing orbit property. Acta Math Sci, 2020, 40B(3): 855-862
[23] Guang X, Fu F W. The average failure probabilities of random linear network coding. IEICE T Fund Electr, 2011, E94-A(10): 1991-2001
[24] Guang X, Fu F W, Zhang Z. Construction of network error correction codes in packet networks. IEEE Trans Inf Theory, 2012, 59(2): 1030-1047
[25] She R, Fan P Y, Liu X Y, Wang X. Interpretable generative adversarial networks with exponential function. IEEE Trans Signal Process, 2021, 69: 3854-3867
[26] She R, Liu S, Fan P Y. Attention to the variation of probabilistic events: information processing with message importance measure. Entropy, 2019, 21(5): 439
[27] Van der Vaart A W. Asymptotic Statistics. Cambridge: Cambridge University Press, 2000
[28] Adler R J, Taylor J E. Random Fields and Geometry. New York: Springer, 2007