[1] Artstein S, Ball K M, Barthe F, et al. Solution of Shannon’s problem on the monotonicity of entropy. J
Amer Math Soc, 2004, 17: 975–982
[2] Artstein S, Ball K M, Barthe F, et al. On the rate of convergence in the entropic central limit theorem.
Probab Theory Relat Fields, 2004, 129: 381–390
[3] Aubrun G, Szarek S, Werner E. Nonadditivity of Rényi entropy and Dvoretzky’s theorem. J Math Phys,
2010, 51: 022102
[4] Barron A R. Entropy and the central limit theorem. Ann Probab, 1986, 14: 336–342
[5] Bhattacharya R N, Ranga Rao R. Normal Approximation and Asymptotic Expansions. New York: John
Wiley & Sons, Inc, 1976
[6] Bobkov S G, Chistyakov G P, Götze F. Rate of convergence and Edgeworth-type expansion in the entropic
central limit theorem. Ann Probab, 2013, 41(4): 2479–2512
[7] van Erven T, Harremoës P. Rényi divergence and Kullback-Leibler divergence. 2012, arXiv: 1206.2459
[8] Johnson O. Information Theory and the Central Limit Theorem. London: Imperial College Press, 2004
[9] Johnson O, Barron A. Fisher information inequalities and the central limit theorem. Probab Theory Relat
Fields, 2004, 129: 391–409
[10] Johnson O, Vignat C. Some results concerning maximum Rényi entropy distributions. Ann Inst H Poincaré
Probab Statist, 2007, 43: 339–351
[11] Linnik J V. An information-theoretic proof of the central limit theorem with Lindeberg conditions. Theory
Probab Appl, 1959, 4: 288–299
[12] Lutwak E, Yang D, Zhang G. Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized
Fisher information. IEEE Trans Inform Theory, 2005, 51: 473–478
[13] Madiman M, Barron A R. Generalized entropy power inequalities and monotonicity properties of information.
IEEE Trans Inform Theory, 2007, 53: 2317–2329
[14] Petrov V V. Sums of Independent Random Variables. Berlin: Springer-Verlag, 1975: 206
[15] Rényi A. On measures of entropy and information//Proceedings of the 4th Berkeley Symposium on Mathematical
Statistics and Probability. 1960: 547–561
[16] Shannon C E, Weaver W. The Mathematical Theory of Communication. Urbana, IL: University of Illinois
Press, 1949
[17] Tulino A M, Verdu S. Monotonic decrease of the non-Gaussianness of the sum of independent random
variables: a simple proof. IEEE Trans Inform Theory, 2006, 52: 4295–4297