Distributed Hypothesis Testing over a Noisy Channel: Error-exponents Trade-off

August 21, 2019 · Declared Dead · 🐛 Entropy

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Sreejith Sreekumar, Deniz Gündüz
arXiv ID: 1908.07521
Category: stat.OT
Cross-listed: cs.IT
Citations: 3
Venue: Entropy
Last Checked: 1 month ago
Abstract
A two-terminal distributed binary hypothesis testing problem over a noisy channel is studied. The two terminals, called the observer and the decision maker, each has access to $n$ independent and identically distributed samples, denoted by $\mathbf{U}$ and $\mathbf{V}$, respectively. The observer communicates to the decision maker over a discrete memoryless channel, and the decision maker performs a binary hypothesis test on the joint probability distribution of $(\mathbf{U},\mathbf{V})$ based on $\mathbf{V}$ and the noisy information received from the observer. The trade-off between the exponents of the type I and type II error probabilities is investigated. Two inner bounds are obtained, one using a separation-based scheme that involves type-based compression and unequal error-protection channel coding, and the other using a joint scheme that incorporates type-based hybrid coding. The separation-based scheme is shown to recover the inner bound obtained by Han and Kobayashi for the special case of a rate-limited noiseless channel, and also the one obtained by the authors previously for a corner point of the trade-off. Finally, we show via an example that the joint scheme achieves a strictly tighter bound than the separation-based scheme for some points of the error-exponents trade-off.
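The abstract's central object is the trade-off between the type I and type II error exponents. The paper studies this in the distributed, noisy-channel setting, but the shape of such a trade-off is easiest to see in the classical single-terminal case (Hoeffding's test), where the best type II exponent under a type I exponent constraint λ is min over distributions Q with D(Q‖P0) ≤ λ of D(Q‖P1). The sketch below, a simple grid search for Bernoulli observations (not the paper's coding schemes; `kl_bern` and `exponent_tradeoff` are hypothetical helper names), illustrates that trade-off:

```python
import math

def kl_bern(p, q):
    """KL divergence D(Ber(p) || Ber(q)) in nats, with 0*log(0) = 0."""
    def term(a, b):
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

def exponent_tradeoff(p0, p1, lam, grid=10_000):
    """Best type-II error exponent subject to a type-I exponent >= lam,
    for i.i.d. Bernoulli(p0)-vs-Bernoulli(p1) testing, via a grid search
    over candidate empirical frequencies q (the "types" accepted under H0)."""
    best = float("inf")
    for i in range(grid + 1):
        q = i / grid
        if kl_bern(q, p0) <= lam:          # q is accepted under H0
            best = min(best, kl_bern(q, p1))
    return best

# Corner point (Stein's lemma): lam = 0 forces q = p0, giving D(P0 || P1).
corner = exponent_tradeoff(0.3, 0.6, 0.0)     # == kl_bern(0.3, 0.6) ≈ 0.184
# Demanding a positive type-I exponent strictly shrinks the type-II exponent:
relaxed = exponent_tradeoff(0.3, 0.6, 0.05)   # < corner
```

The λ = 0 corner is the analogue of the "corner point of the trade-off" mentioned in the abstract, where only the type II exponent is optimized; increasing λ trades type II exponent away for a faster-vanishing type I error.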
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — stat.OT

Died the same way — 👻 Ghosted