Consider a continuous-time Markov process with state space $[k]$, which jumps to a new state chosen uniformly at random and regardless of the previous state. The collection of transition kernels (indexed by time $t \ge 0$) is the Potts semigroup. Diaconis and Saloff-Coste computed the maximum of the ratio of the relative entropy and the Dirichlet form, obtaining the constant $\alpha_2$ in the $2$-log-Sobolev inequality ($2$-LSI). In this paper, we obtain the best possible non-linear inequality relating entropy and the Dirichlet form (i.e., the $p$-NLSI, $p \ge 1$). As an example, we derive the asymptotics of the $1$-LSI constant $\alpha_1$ for large $k$. The more precise NLSIs have been shown by Polyanskiy and Samorodnitsky to imply various geometric and Fourier-analytic results.

Beyond the Potts semigroup, we also analyze Potts channels: $[k] \times [k]$ Markov transition matrices that are constant on the diagonal and constant off the diagonal. (The Potts semigroup corresponds to the (ferromagnetic) subset of such matrices with a positive second eigenvalue.) By integrating the $1$-NLSI we obtain a new strong data processing inequality (SDPI), which in turn allows us to improve results on reconstruction thresholds for Potts models on trees. A special case is the problem of reconstructing the color of the root of a $k$-colored tree given the colors of all the leaves. We show that for the reconstruction probability to be non-trivial, the branching number of the tree must be at least $\frac{\log k}{\log k - \log(k-1)} = (1-o(1))\,k\log k$. This extends previous results (of Sly and of Bhatnagar et al.) to general trees and avoids the need for specialized arguments. Similarly, we improve the state of the art on the reconstruction threshold for the stochastic block model with $k$ balanced groups, for all $k \ge 3$. These improvements advocate information-theoretic methods as a useful complement to conventional techniques originating from statistical physics.
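To make these objects concrete, here is a minimal Python sketch (an illustration, not code from the paper; the function names are ours, and a unit jump rate is assumed for the semigroup). It builds a Potts channel, the semigroup kernel at time $t$, checks the semigroup property, and numerically compares the branching-number bound $\frac{\log k}{\log k - \log(k-1)}$ with $k\log k$.

```python
import numpy as np

def potts_channel(k, lam):
    # k x k Markov transition matrix, constant on and off the diagonal;
    # eigenvalues are 1 (once) and lam (with multiplicity k - 1).
    return lam * np.eye(k) + (1.0 - lam) * np.ones((k, k)) / k

def potts_semigroup(k, t):
    # Kernel at time t of the process that jumps (at unit rate, an
    # assumption here) to a uniformly random state: a Potts channel with
    # second eigenvalue e^{-t} > 0, i.e. in the "ferromagnetic" subset.
    return potts_channel(k, np.exp(-t))

def branching_bound(k):
    # Lower bound on the branching number needed for non-trivial
    # reconstruction: log k / (log k - log(k - 1)).
    return np.log(k) / (np.log(k) - np.log(k - 1))

# Sanity check of the semigroup property T_s T_t = T_{s+t}.
k, s, t = 5, 0.3, 0.7
assert np.allclose(potts_semigroup(k, s) @ potts_semigroup(k, t),
                   potts_semigroup(k, s + t))

# The bound behaves like (1 - o(1)) * k * log k.
for k in (3, 10, 100, 10_000):
    print(k, branching_bound(k), k * np.log(k))
```

For instance, at $k=100$ the bound evaluates to about $458$ while $k\log k \approx 461$, consistent with the stated asymptotics.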