The uncertainty principle: variations on a theme
June 19, 2020 · Declared Dead · Bulletin of the American Mathematical Society
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors: Avi Wigderson, Yuval Wigderson
arXiv ID: 2006.11206
Category: math.FA
Cross-listed: cs.IT, math-ph, math.CO, math.GR
Citations: 47
Venue: Bulletin of the American Mathematical Society
Last Checked: 1 month ago
Abstract
We show how a number of well-known uncertainty principles for the Fourier transform, such as the Heisenberg uncertainty principle, the Donoho--Stark uncertainty principle, and Meshulam's non-abelian uncertainty principle, have little to do with the structure of the Fourier transform itself. Rather, all of these results follow from very weak properties of the Fourier transform (shared by numerous linear operators), namely that it is bounded as an operator $L^1 \to L^\infty$, and that it is unitary. Using a single, simple proof template, and only these (or weaker) properties, we obtain some new proofs and many generalizations of these basic uncertainty principles, to new operators and to new settings, in a completely unified way. Together with our general overview, this paper can also serve as a survey of the many facets of the phenomena known as uncertainty principles.
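The two "weak properties" the abstract isolates are easy to check numerically. The sketch below (ours, not from the paper, using NumPy) verifies that the normalized n-point DFT is unitary and bounded as an operator $L^1 \to L^\infty$ with norm $1/\sqrt{n}$, and spot-checks the Donoho--Stark consequence $|\mathrm{supp}\, f| \cdot |\mathrm{supp}\, \hat{f}| \ge n$ on random sparse vectors.

```python
import numpy as np

n = 64
# Unitary DFT matrix: entries exp(-2*pi*i*j*k/n) / sqrt(n), each of modulus 1/sqrt(n).
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n) / np.sqrt(n)

# Property 1: unitarity (F* F = I), i.e. Parseval's identity holds.
assert np.allclose(F.conj().T @ F, np.eye(n))

# Property 2: boundedness L^1 -> L^infty with norm 1/sqrt(n):
#   ||F f||_inf <= (1/sqrt(n)) * ||f||_1   for every f,
# since every matrix entry has modulus 1/sqrt(n).
rng = np.random.default_rng(0)
for _ in range(100):
    f = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    assert np.max(np.abs(F @ f)) <= np.sum(np.abs(f)) / np.sqrt(n) + 1e-12

# Donoho--Stark uncertainty principle (a consequence of the two properties above):
#   |supp f| * |supp Ff| >= n   for every nonzero f.
for _ in range(100):
    f = np.zeros(n, dtype=complex)
    support = rng.choice(n, size=rng.integers(1, n), replace=False)
    f[support] = rng.standard_normal(len(support))
    fhat = F @ f
    supp_f = np.count_nonzero(np.abs(f) > 1e-10)
    supp_fhat = np.count_nonzero(np.abs(fhat) > 1e-10)
    assert supp_f * supp_fhat >= n
```

The point of the paper is that nothing here depends on the Fourier structure of `F`: any unitary matrix whose entries all have modulus at most $1/\sqrt{n}$ passes the same Donoho--Stark check by the same argument.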
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt · math.FA
Tables of the existence of equiangular tight frames · 👻 Ghosted
Approximation spaces of deep neural networks · 👻 Ghosted
Sampling Theorems for Shift-invariant Spaces, Gabor Frames, and Totally Positive Functions · 👻 Ghosted
Eldan's Stochastic Localization and the KLS Conjecture: Isoperimetry, Concentration and Mixing · 👻 Ghosted
Equivalence of approximation by convolutional neural networks and fully-connected networks · 👻 Ghosted
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System