CryptoDB
Smoothing Out Binary Linear Codes and Worst-case Sub-exponential Hardness for LPN
Authors: Yu Yu, Jiang Zhang
Download: DOI 10.1007/978-3-030-84252-9_16
Presentation: Slides
Conference: CRYPTO 2021
Abstract: Learning parity with noise (LPN) is a notorious (average-case) hard problem that has been well studied in learning theory, coding theory and cryptography since the early 1990s. It further inspired the Learning with Errors (LWE) problem [Regev, STOC 2005], which has become one of the central building blocks for post-quantum cryptography and advanced cryptographic constructions. Unlike LWE, whose hardness can be reduced from worst-case lattice problems, no corresponding worst-case hardness results were known for LPN until very recently. At Eurocrypt 2019, Brakerski et al. [BLVW19] established the first feasibility result: that the worst-case hardness of the nearest codeword problem (NCP) (on balanced linear codes) at an extremely low noise rate […]
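To make the LPN problem referenced in the abstract concrete, here is a minimal sketch (not from the paper) of how LPN samples are generated over GF(2): given a secret vector s, each sample is a random row a together with the label ⟨a, s⟩ ⊕ e, where the noise bit e is Bernoulli-distributed with rate tau. All function and variable names below are my own illustrative choices.

```python
# Illustrative sketch: generating LPN samples (A, b = A·s XOR e) over GF(2).
# The noise rate tau controls hardness; the paper's results concern reductions
# between NCP at very low noise and LPN at very high noise.
import random

def lpn_samples(n, m, tau, seed=0):
    """Return a secret s in {0,1}^n, an m x n binary matrix A,
    and labels b = A·s + e (mod 2) with e ~ Bernoulli(tau)^m."""
    rng = random.Random(seed)
    s = [rng.randrange(2) for _ in range(n)]                 # secret vector
    A = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]
    e = [1 if rng.random() < tau else 0 for _ in range(m)]   # noise bits
    b = [(sum(a_j * s_j for a_j, s_j in zip(row, s)) + e_i) % 2
         for row, e_i in zip(A, e)]
    return s, A, b
```

At tau = 0 the labels are noiseless parities and s is recoverable by Gaussian elimination; the hardness results discussed above concern noticeable noise rates, where distinguishing (A, b) from uniform is conjectured to be hard.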
Video from CRYPTO 2021
BibTeX
@inproceedings{crypto-2021-31151,
  title={Smoothing Out Binary Linear Codes and Worst-case Sub-exponential Hardness for LPN},
  author={Yu Yu and Jiang Zhang},
  publisher={Springer-Verlag},
  doi={10.1007/978-3-030-84252-9_16},
  year={2021}
}