Explicit Renyi Entropy for Hidden Markov Models
Maciej Skorski, Joachim Breitner
Determining entropy rates of stochastic processes is a fundamental but difficult problem, with closed-form solutions known only for specific cases. This paper pushes the state of the art by solving the problem for Hidden Markov Models (HMMs) and Rényi entropies. While the computation of Rényi entropy for Markov chains reduces to studying the growth of a simple matrix product, computations for HMMs involve \emph{products of random matrices}. As a result, this case is much harder, and no explicit formulas were known prior to this work. In the finite-sample regime we circumvent this issue for Rényi entropy of integer orders, reducing the problem again to \emph{single matrix products}, where the matrix is built from transition and emission probabilities by means of a tensor product. To obtain results in the asymptotic setting, we use a novel technique for determining the growth of non-negative matrix powers. The classical approach, Frobenius-Perron theory, requires positivity assumptions; we instead work directly with the spectral formula. As a consequence, our results do not suffer from limitations such as irreducibility and aperiodicity. This improves our understanding of the entropy rate even for standard (unhidden) chains. As an application, our result was used to prove the effectiveness of a recently published side-channel attack against RSA.
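The two reductions mentioned above can be illustrated numerically. The sketch below (a minimal illustration, not the paper's implementation) uses two standard facts consistent with the abstract: for a Markov chain with transition matrix P, the order-α Rényi entropy rate is governed by the spectral radius of the entrywise power of P; and for an HMM at integer order α = 2, the sum Σ_y p(y_1…y_n)² is governed by powers of the single matrix A = Σ_y M_y ⊗ M_y, where M_y combines transition and emission probabilities. The example matrices P and B are made up for demonstration.

```python
import numpy as np

def renyi_rate_markov(P, alpha):
    """Order-alpha Rényi entropy rate of a Markov chain with transition
    matrix P, from the spectral radius of the entrywise power of P."""
    P_alpha = P ** alpha                          # entrywise power
    rho = max(abs(np.linalg.eigvals(P_alpha)))    # spectral radius
    return np.log2(rho) / (1.0 - alpha)

def renyi2_rate_hmm(P, B):
    """Order-2 Rényi entropy rate of an HMM with transition matrix P
    (n x n) and emission matrix B (n x m, B[i, y] = Pr[emit y | state i]).
    Tensor-product reduction: by the mixed-product property of the
    Kronecker product, sum_y p(y_1..y_n)^2 grows like rho(A)^n with
    A = sum_y (M_y kron M_y), where M_y = P @ diag(B[:, y])."""
    n, m = B.shape
    A = sum(np.kron(P @ np.diag(B[:, y]), P @ np.diag(B[:, y]))
            for y in range(m))
    rho = max(abs(np.linalg.eigvals(A)))
    return -np.log2(rho)                          # 1/(1-2) * log2(rho)

# Sanity check: an HMM whose emissions reveal the state (B = identity)
# must reproduce the plain Markov-chain formula at alpha = 2.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(renyi_rate_markov(P, 2.0))
print(renyi2_rate_hmm(P, np.eye(2)))
```

Note that the spectral radius is computed directly from the eigenvalues, without appealing to Perron-Frobenius positivity assumptions, which mirrors the abstract's point about avoiding irreducibility and aperiodicity requirements.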
Renyi Entropy
Rényi Divergence rates of Ergodic Markov Chains: existence, explicit expressions and properties
On the Second- and Third-Order Asymptotics of Smooth Rényi Entropy and Their Applications
Optimum Source Resolvability Rate with Respect to f-Divergences Using the Smooth Renyi Entropy
On the Rényi Entropy of Log-Concave Sequences
Rényi Bounds on Information Combining