Gaussianity and typicality in matrix distributional semantics
December 19, 2019 · Declared Dead · Annales de l'Institut Henri Poincaré D
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh, Lewis Sword
arXiv ID
1912.10839
Category
hep-th
Cross-listed
cs.CL, math-ph
Citations
11
Venue
Annales de l'Institut Henri Poincaré D
Last Checked
1 month ago
Abstract
Constructions in type-driven compositional distributional semantics associate large collections of matrices of size $D$ to linguistic corpora. We develop the proposal of analysing the statistical characteristics of this data in the framework of permutation invariant matrix models. The observables in this framework are permutation invariant polynomial functions of the matrix entries, which correspond to directed graphs. Using the general 13-parameter permutation invariant Gaussian matrix models recently solved, and a dataset of matrices constructed via standard techniques in distributional semantics, we find that the expectation values of a large class of cubic and quartic observables show high Gaussianity at levels between 90 and 99 percent. Beyond expectation values, which are averages over words, the dataset allows the computation of standard deviations for each observable, which can be viewed as a measure of its typicality. There is a wide range of magnitudes in the measures of typicality. The permutation invariant matrix models, considered as functions of random couplings, give a very good prediction of the magnitude of the typicality for different observables. We find evidence that observables with similar matrix model characteristics of Gaussianity and typicality also have high degrees of correlation between the ranked lists of words associated to these observables.
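As a rough illustration of the kind of observables the abstract describes, here is a minimal sketch in Python/NumPy. It evaluates a handful of permutation invariant polynomial functions of matrix entries (each corresponding to a directed graph) on a synthetic stand-in dataset, then reports the word-averaged expectation value and the standard deviation (the typicality scale) for each observable. All names and the random data are hypothetical; this is not the authors' code, which is precisely what this page reports as missing.

    import numpy as np

    # Hypothetical stand-in dataset: one D x D matrix per word. In the paper
    # the matrices come from standard distributional-semantics constructions;
    # random Gaussian entries are used here only so the sketch runs end to end.
    rng = np.random.default_rng(0)
    D, num_words = 50, 1000
    matrices = rng.normal(size=(num_words, D, D))

    def observables(M):
        # Permutation invariant polynomial functions of the entries of M.
        # Each corresponds to a directed graph: sum_i M_ii is one vertex with
        # a self-loop, sum_ijk M_ij M_jk M_ki is a directed 3-cycle (cubic),
        # and Tr(M^4) is a directed 4-cycle (quartic).
        return {
            "sum_i M_ii": np.trace(M),
            "sum_ij M_ij": M.sum(),
            "sum_ij M_ij M_ji": np.einsum("ij,ji->", M, M),
            "sum_ijk M_ij M_jk M_ki": np.einsum("ij,jk,ki->", M, M, M),
            "sum_ijkl M_ij M_jk M_kl M_li": np.einsum("ij,jk,kl,li->", M, M, M, M),
        }

    # Expectation value = average over words; the standard deviation across
    # words is the per-observable measure of typicality from the abstract.
    samples = [observables(M) for M in matrices]
    for name in samples[0]:
        v = np.array([s[name] for s in samples])
        print(f"{name:32s} mean = {v.mean():14.3f}   std = {v.std():14.3f}")

Comparing such empirical averages against the analytic expectation values of a fitted 13-parameter permutation invariant Gaussian model is what yields the Gaussianity percentages quoted in the abstract; that fit is beyond this sketch.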
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · hep-th
Searching the Landscape of Flux Vacua with Genetic Algorithms (R.I.P. 👻 Ghosted)
Quantum stabilizer codes, lattices, and CFTs (R.I.P. 👻 Ghosted)
Comments on the holographic description of Narain theories (R.I.P. 👻 Ghosted)
Machine Learned Calabi-Yau Metrics and Curvature (R.I.P. 👻 Ghosted)
Chaos and Complexity from Quantum Neural Network: A study with Diffusion Metric in Machine Learning (R.I.P. 👻 Ghosted)
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System