Statistics Seminar - 11/13/25

Date & Time

Nov. 13, 3:40 p.m.
Speaker

Dr. Jialin Zhang, Assistant Professor of Statistics, Department of Mathematics and Statistics, Mississippi State University

Title

Unfolding Generalized Shannon's Entropy

Series

Statistics Seminar Series

Physical Location

Allen 411

Abstract

Shannon’s entropy is a cornerstone of information theory, quantifying uncertainty within a probability distribution. However, the classical definition may fail for distributions with heavy tails or infinite alphabets, leaving gaps in its theoretical foundation. This talk introduces a framework called Generalized Shannon’s Entropy (GSE), which extends the original concept to ensure well-definedness and robustness under broader conditions.
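For reference, the classical definition, together with a standard textbook example of the failure mode mentioned above (the specific distribution below is illustrative and not taken from the talk):

```latex
% Classical Shannon entropy of a discrete distribution p = (p_1, p_2, \dots)
\[
H(p) \;=\; -\sum_{k} p_k \log p_k .
\]
% On a countably infinite alphabet, a sufficiently heavy tail makes H infinite.
% For instance, with normalizing constant c,
\[
p_k \;=\; \frac{c}{k(\log k)^2}, \qquad k \ge 2,
\]
% each term satisfies $-p_k \log p_k \asymp 1/(k \log k)$, and
% $\sum_k 1/(k\log k)$ diverges, so $H(p) = \infty$.
```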

The talk begins by revisiting Shannon’s entropy and its limitations, followed by the construction of the GSE through escort distributions that adjust tail behavior. The asymptotic properties of plug-in estimators for GSE are discussed, including a central limit theorem that requires minimal assumptions. The talk then connects this generalization to mutual information, leading to asymptotically normal tests of independence on contingency tables.
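The escort construction and a plug-in estimate can be sketched as follows. This is a minimal illustration, not the speaker's exact formulation: the function names and the choice q = 2 are assumptions, and the precise GSE definition follows the speaker's papers.

```python
from collections import Counter
import math

def escort(p, q):
    """Escort distribution: p_i^q / sum_j p_j^q, for q > 0."""
    powered = [pi ** q for pi in p]
    total = sum(powered)
    return [x / total for x in powered]

def shannon_entropy(p):
    """Classical Shannon entropy (natural log), skipping zero cells."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def gse_plugin(sample, q=2.0):
    """Plug-in estimate: empirical frequencies -> escort -> entropy.
    For q > 1 the escort distribution downweights the tail, which is
    what keeps the generalized entropy well defined even when the
    underlying alphabet is infinite."""
    n = len(sample)
    counts = Counter(sample)
    p_hat = [c / n for c in counts.values()]
    return shannon_entropy(escort(p_hat, q))

# Example: a uniform sample over 4 symbols; the escort of a uniform
# distribution is uniform, so the estimate is log 4.
sample = ["a", "b", "c", "d"] * 25
print(round(gse_plugin(sample, q=2.0), 4))  # → 1.3863
```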

The second half explores the role of GSE in characterizing discrete probability distributions. Several recent results are reviewed, showing how finite or countable sets of entropic quantities can uniquely determine a distribution up to permutation. The talk concludes with open directions toward developing goodness-of-fit tests for discrete and heterogeneous sample spaces using finite-order GSE characterization.

Note

Contact Prof. Jialin Zhang at jzhang@math.msstate.edu for additional information.