Indistinguishability Obfuscation from Well-Founded Assumptions
August 21, 2020 · Declared Dead · 🏛 IACR Cryptology ePrint Archive
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Aayush Jain, Huijia Lin, Amit Sahai
arXiv ID
2008.09317
Category
cs.CR: Cryptography & Security
Cross-listed
cs.CC
Citations
235
Venue
IACR Cryptology ePrint Archive
Last Checked
1 month ago
Abstract
In this work, we show how to construct indistinguishability obfuscation from subexponential hardness of four well-founded assumptions. We prove:

Let $\tau \in (0,\infty)$, $\delta \in (0,1)$, $\varepsilon \in (0,1)$ be arbitrary constants. Assume sub-exponential security of the following assumptions, where $\lambda$ is a security parameter and the parameters $\ell, k, n$ below are large enough polynomials in $\lambda$:

- the SXDH assumption on asymmetric bilinear groups of a prime order $p = O(2^\lambda)$,
- the LWE assumption over $\mathbb{Z}_p$ with subexponential modulus-to-noise ratio $2^{k^\varepsilon}$, where $k$ is the dimension of the LWE secret,
- the LPN assumption over $\mathbb{Z}_p$ with polynomially many LPN samples and error rate $1/\ell^\delta$, where $\ell$ is the dimension of the LPN secret,
- the existence of a Boolean PRG in $\mathsf{NC}^0$ with stretch $n^{1+\tau}$.

Then, (subexponentially secure) indistinguishability obfuscation for all polynomial-size circuits exists.
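The fourth assumption above asks for a Boolean PRG in $\mathsf{NC}^0$, i.e. a pseudorandom generator in which each output bit depends on only a constant number of seed bits, with polynomial stretch $n^{1+\tau}$. As an illustration only (not the paper's construction), the sketch below shows the shape of a Goldreich-style local PRG: a fixed public hypergraph picks, for each output bit, a constant-size tuple of seed positions, and a fixed local predicate maps that tuple to the output bit. The locality 5, the XOR-AND predicate, and the stretch exponent are all assumptions made for this example.

```python
import random

def local_prg(seed_bits, stretch_exp=1.25, locality=5, graph_seed=0):
    """Sketch of a Goldreich-style local (NC^0) PRG.

    Each output bit depends on exactly `locality` seed bits, combined by
    the (hypothetical, illustrative) XOR-AND predicate
        P(x1, x2, x3, x4, x5) = x1 ^ x2 ^ x3 ^ (x4 & x5).
    The hypergraph of tap positions is public and fixed by `graph_seed`,
    so the only secret is the seed itself.
    """
    n = len(seed_bits)
    m = int(n ** stretch_exp)          # polynomial stretch: m = n^{1+tau}
    graph = random.Random(graph_seed)  # fixed public choice of taps
    out = []
    for _ in range(m):
        taps = graph.sample(range(n), locality)
        x = [seed_bits[i] for i in taps]
        out.append(x[0] ^ x[1] ^ x[2] ^ (x[3] & x[4]))
    return out
```

The key structural point is locality: the predicate looks at 5 positions regardless of $n$, which is what places the generator in $\mathsf{NC}^0$; whether a given predicate/hypergraph pair is actually pseudorandom is a separate hardness assumption.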
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
📜 Similar Papers
In the same crypt — Cryptography & Security
- Membership Inference Attacks against Machine Learning Models (👻 Ghosted)
- The Limitations of Deep Learning in Adversarial Settings (👻 Ghosted)
- Practical Black-Box Attacks against Machine Learning (👻 Ghosted)
- Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks (👻 Ghosted)
- Extracting Training Data from Large Language Models (👻 Ghosted)

Died the same way — 👻 Ghosted
- Language Models are Few-Shot Learners (👻 Ghosted)
- PyTorch: An Imperative Style, High-Performance Deep Learning Library (👻 Ghosted)
- XGBoost: A Scalable Tree Boosting System (👻 Ghosted)