Information theory with finite vector spaces

July 13, 2018 · Declared Dead · 🏛 IEEE Transactions on Information Theory

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Juan Pablo Vigneaux
arXiv ID: 1807.05152
Category: math-ph
Cross-listed: cs.IT, math.PR
Citations: 6
Venue: IEEE Transactions on Information Theory
Last Checked: 1 month ago
Abstract
Whereas Shannon entropy is related to the growth rate of multinomial coefficients, we show that the quadratic entropy (Tsallis 2-entropy) is connected to their $q$-deformation; when $q$ is a prime power, these $q$-multinomial coefficients count flags of finite vector spaces with prescribed length and dimensions. In particular, the $q$-binomial coefficients count vector subspaces of a given dimension. In this way we obtain a combinatorial explanation for the nonadditivity of the quadratic entropy, which arises from a recursive counting of flags. We show that statistical systems whose configurations are described by flags provide a frequentist justification for the maximum entropy principle with Tsallis statistics. We then introduce a discrete-time stochastic process associated to the $q$-binomial probability distribution, which generates at time $n$ a vector subspace of $\mathbb{F}_q^n$ (here $\mathbb{F}_q$ is the finite field of order $q$). The concentration of measure on certain "typical subspaces" allows us to extend the asymptotic equipartition property to this setting. The size of the typical set is quantified by the quadratic entropy. We discuss the applications to Shannon theory, particularly to source coding, when messages correspond to vector spaces.
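No code accompanies the paper, but two of the abstract's central claims are easy to check numerically. The sketch below (our illustration, not the author's implementation; all function names are made up) verifies that the Gaussian binomial coefficient equals a brute-force count of $k$-dimensional subspaces of $\mathbb{F}_2^n$, and that the quadratic (Tsallis 2-) entropy satisfies the nonadditive composition rule $S_2(A \times B) = S_2(A) + S_2(B) - S_2(A)\,S_2(B)$ on an independent product.

```python
from itertools import product

def q_binomial(n, k, q):
    """Gaussian binomial [n choose k]_q: counts k-dim subspaces of F_q^n
    when q is a prime power."""
    num = den = 1
    for i in range(k):
        num *= q ** n - q ** i
        den *= q ** k - q ** i
    return num // den

def count_subspaces_f2(n, k):
    """Brute-force count of k-dimensional subspaces of F_2^n:
    enumerate spans of all k-tuples of vectors, keep the full-rank ones."""
    vecs = list(product((0, 1), repeat=n))
    spans = set()
    for basis in product(vecs, repeat=k):
        # All linear combinations of the k chosen vectors over F_2.
        span = frozenset(
            tuple(sum(c * v[i] for c, v in zip(coeffs, basis)) % 2
                  for i in range(n))
            for coeffs in product((0, 1), repeat=k)
        )
        if len(span) == 2 ** k:  # basis was linearly independent
            spans.add(span)
    return len(spans)

def tsallis_2(p):
    """Quadratic entropy (Tsallis 2-entropy): S_2(p) = 1 - sum p_i^2."""
    return 1 - sum(x * x for x in p)

# Claim 1: the q-binomial counts subspaces (q = 2, n = 4, k = 2 gives 35).
assert q_binomial(4, 2, 2) == count_subspaces_f2(4, 2) == 35

# Claim 2: nonadditivity of S_2 on an independent product distribution.
pA = [0.5, 0.5]
pB = [0.25, 0.75]
joint = [a * b for a in pA for b in pB]
lhs = tsallis_2(joint)
rhs = tsallis_2(pA) + tsallis_2(pB) - tsallis_2(pA) * tsallis_2(pB)
assert abs(lhs - rhs) < 1e-12
```

The brute-force count is exponential in `n` and `k` and only meant for tiny cases; it exists solely to cross-check the closed-form product formula.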
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — math-ph

Died the same way — 👻 Ghosted