Smoothing the gap between NP and ∃R
December 04, 2019 · Declared Dead · IEEE Annual Symposium on Foundations of Computer Science
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Jeff Erickson, Ivor van der Hoog, Tillmann Miltzow
arXiv ID
1912.02278
Category
cs.CG: Computational Geometry
Cross-listed
cs.CC, cs.DM, cs.DS, math.NA
Citations
55
Venue
IEEE Annual Symposium on Foundations of Computer Science
Last Checked
1 month ago
Abstract
We study algorithmic problems that belong to the complexity class of the existential theory of the reals (∃R). A problem is ∃R-complete if it is as hard as the problem ETR and if it can be written as an ETR formula. Traditionally, these problems are studied in the real RAM, a model of computation that assumes that the storage and comparison of real-valued numbers can be done in constant space and time, with infinite precision. The complexity class ∃R is often called a real RAM analogue of NP, since the problem ETR can be viewed as the real-valued variant of SAT. In this paper we prove a real RAM analogue to the Cook-Levin theorem, which shows that ∃R membership is equivalent to having a verification algorithm that runs in polynomial time on a real RAM. This gives an easy proof of ∃R-membership, as verification algorithms on a real RAM are much more versatile than ETR formulas. We use this result to construct a framework to study ∃R-complete problems under smoothed analysis. We show that for a wide class of ∃R-complete problems, the witness can be represented with logarithmic input-precision by using smoothed analysis on the corresponding real RAM verification algorithm. This shows in a formal way that the boundary between NP and ∃R (formed by inputs whose solution witness needs high input-precision) consists of contrived inputs. We apply our framework to well-studied ∃R-complete recognition problems that exhibit the exponential-bit phenomenon, such as the recognition of realizable order types or the Steinitz problem in fixed dimension.
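For readers unfamiliar with ETR: a generic instance asks whether a purely existentially quantified sentence over the reals is true. The sketch below is an illustrative example, not taken from the paper.

```latex
% General form of an ETR instance, where \varphi is a
% quantifier-free Boolean combination of polynomial
% equalities and inequalities over real variables:
\exists x_1, \dots, x_n \in \mathbb{R} :\; \varphi(x_1, \dots, x_n)

% A concrete satisfiable instance (the maximum of xy on the
% unit circle is 1/2, attained at x = y = 1/\sqrt{2}):
\exists x \, \exists y :\; x^2 + y^2 = 1 \;\wedge\; x \cdot y > \tfrac{1}{3}
```

The second instance hints at why the real RAM view is natural: the witness (x, y) is a point with irrational coordinates, which a Boolean verifier could only handle via precision arguments of the kind the paper's smoothed-analysis framework addresses.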
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers

In the same crypt — Computational Geometry
Dynamic Planar Convex Hull — 👻 Ghosted
TEMPO: Feature-Endowed Teichmüller Extremal Mappings of Point Clouds — 👻 Ghosted
Explainable Artificial Intelligence for Manufacturing Cost Estimation and Machining Feature Visualization — 👻 Ghosted
Coresets for Clustering in Euclidean Spaces: Importance Sampling is Nearly Optimal — 👻 Ghosted
Momen(e)t: Flavor the Moments in Learning to Classify Shapes — 👻 Ghosted

Died the same way — 👻 Ghosted
Language Models are Few-Shot Learners — 👻 Ghosted
PyTorch: An Imperative Style, High-Performance Deep Learning Library — 👻 Ghosted
XGBoost: A Scalable Tree Boosting System — 👻 Ghosted