Recurrent Neural Networks as Weighted Language Recognizers

November 15, 2017 · The Ethereal · 🏛️ North American Chapter of the Association for Computational Linguistics

🔮 THE ETHEREAL
Pure theory – exists on a plane beyond code

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Yining Chen, Sorcha Gilroy, Andreas Maletti, Jonathan May, Kevin Knight
arXiv ID: 1711.05408
Category: cs.FL (Formal Languages)
Cross-listed: cs.CC, cs.CL
Citations: 78
Venue: North American Chapter of the Association for Computational Linguistics
Last Checked: 1 month ago
Abstract
We investigate the computational complexity of various problems for simple recurrent neural networks (RNNs) as formal models for recognizing weighted languages. We focus on single-layer, ReLU-activation, rational-weight RNNs with a softmax output, which are commonly used in natural language processing applications. We show that most problems for such RNNs are undecidable, including consistency, equivalence, minimization, and the determination of the highest-weighted string. However, for consistent RNNs the last problem becomes decidable, although the length of the solution can surpass all computable bounds. If additionally the string is limited to polynomial length, the problem becomes NP-complete and APX-hard. In summary, this shows that approximations and heuristic algorithms are necessary in practical applications of such RNNs.
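To make the model class concrete, here is a minimal sketch (my own illustration, not the paper's construction) of a single-layer ReLU RNN used as a weighted language recognizer: the weight of a string is the product of the per-symbol softmax probabilities, including an end-of-string symbol, as in standard RNN language models. All names and dimensions below are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class ReluRNN:
    """Illustrative single-layer, ReLU-activation RNN with a softmax output.

    The weight of a string is the product of the softmax probabilities of
    its symbols followed by the end-of-string marker "</s>".
    """

    def __init__(self, vocab, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.vocab = {s: i for i, s in enumerate(list(vocab) + ["</s>"])}
        v, h = len(self.vocab), hidden_dim
        self.E = rng.normal(size=(v, h)) * 0.1   # input embeddings
        self.W = rng.normal(size=(h, h)) * 0.1   # recurrent weights
        self.b = np.zeros(h)                     # hidden bias
        self.U = rng.normal(size=(h, v)) * 0.1   # output projection
        self.h0 = np.zeros(h)                    # initial hidden state

    def weight(self, string):
        """Weight assigned to `string`: product of per-symbol softmax probs."""
        h, p = self.h0, 1.0
        for sym in list(string) + ["</s>"]:
            probs = softmax(h @ self.U)
            p *= probs[self.vocab[sym]]
            # ReLU state update on the embedded symbol and previous state
            h = np.maximum(0.0, self.E[self.vocab[sym]] + self.W @ h + self.b)
        return p

rnn = ReluRNN(["a", "b"], hidden_dim=4)
w = rnn.weight("abba")
```

A consistent RNN in the paper's sense is one whose weights over all strings sum to 1; with random weights as above there is no such guarantee, which is exactly why consistency is a nontrivial (and, per the abstract, undecidable) property to check.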
