International Association for Cryptologic Research

CryptoDB

Nico Döttling

Publications

Year
Venue
Title
2024
EUROCRYPT
Two-Round Maliciously-Secure Oblivious Transfer with Optimal Rate
We give a construction of a two-round batch oblivious transfer (OT) protocol in the CRS model that is UC-secure against malicious adversaries and has (near) optimal communication cost. Specifically, to perform a batch of $k$ oblivious transfers where the sender's inputs are bits, the sender and the receiver need to communicate a total of $3k + o(k) \cdot \mathsf{poly}(\lambda)$ bits. We argue that $3k$ bits are required by any protocol with a black-box and straight-line simulator. The security of our construction is proven assuming the hardness of the Quadratic Residuosity (QR) and Learning Parity with Noise (LPN) problems.
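As a quick sanity check on the rate claim (for fixed security parameter $\lambda$), the stated bound amounts to an amortized cost of

\[ \frac{3k + o(k) \cdot \mathsf{poly}(\lambda)}{k} \;=\; 3 + o(1) \]

bits per transfer as $k \to \infty$, matching the $3k$-bit lower bound for protocols with a black-box, straight-line simulator.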
2023
PKC
Laconic Function Evaluation for Turing Machines
Laconic function evaluation (LFE) allows Alice to compress a large circuit $C$ into a small digest $d$. Given Alice's digest, Bob can encrypt some input $x$ under $d$ in a way that enables Alice to recover $C(x)$, without learning anything beyond that. The scheme is said to be laconic if the size of $d$, the runtime of the encryption algorithm, and the size of the ciphertext are all sublinear in the size of $C$. Until now, all known LFE constructions have ciphertexts whose size depends on the depth of the circuit $C$, akin to the limitation of levelled homomorphic encryption. In this work we close this gap and present the first LFE scheme (for Turing machines) with asymptotically optimal parameters. Our scheme assumes the existence of indistinguishability obfuscation and somewhere statistically binding hash functions. As further contributions, we show how our scheme enables a wide range of new applications, including two previously unknown constructions:
– Non-interactive zero-knowledge (NIZK) proofs with optimal prover complexity.
– Witness encryption and attribute-based encryption (ABE) for Turing machines from falsifiable assumptions.
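To fix the shape of these algorithms, here is a hypothetical, functionality-only Python sketch (all names are illustrative, nothing from the paper): the digest is laconic because it is a short hash of $C$, but Bob's "ciphertext" is $x$ in the clear, so the toy has no security whatsoever.

```python
import hashlib
from typing import Callable

def compress(circuit_description: bytes) -> bytes:
    # Alice's digest d: 32 bytes regardless of how large C is
    return hashlib.sha256(circuit_description).digest()

def encrypt(d: bytes, x: int):
    # a real LFE ciphertext would hide x; binding it to d is the hard part
    return (d, x)

def evaluate(C: Callable[[int], int], C_desc: bytes, ct) -> int:
    d, x = ct
    assert d == compress(C_desc)   # Alice checks the ciphertext matches her digest
    return C(x)                    # Alice learns C(x) -- and, in this toy, x itself

C = lambda v: v * v + 1            # toy "circuit"
C_desc = b"v -> v*v + 1"
d = compress(C_desc)
assert evaluate(C, C_desc, encrypt(d, 5)) == 26
```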
2023
EUROCRYPT
Efficient Laconic Cryptography from Learning With Errors
Laconic cryptography is an emerging paradigm that enables cryptographic primitives with sublinear communication complexity in just two messages. In particular, a two-message protocol between Alice and Bob is called \emph{laconic} if its communication and computation complexity are essentially independent of the size of Alice's input. This can be thought of as a dual notion of fully-homomorphic encryption, as it enables ``Bob-optimized'' protocols. This paradigm has led to tremendous progress in recent years. However, all existing constructions of laconic primitives are considered only of \emph{theoretical interest}: They all rely on non-black-box cryptographic techniques, which are highly impractical. This work shows that non-black-box techniques are not necessary for basic laconic cryptography primitives. We propose a \emph{completely algebraic} construction of laconic encryption, a notion that we introduce in this work, which serves as the cornerstone of our framework. We prove that the scheme is secure under the standard Learning With Errors assumption (with polynomial modulus-to-noise ratio). We provide the first proof-of-concept implementations of laconic primitives, demonstrating that the construction is indeed practical: For a database size of $2^{50}$, encryption and decryption are on the order of single-digit \emph{milliseconds}. Laconic encryption can be used as a black box to construct other laconic primitives. Specifically, we show how to construct:
- Laconic oblivious transfer
- Registration-based encryption scheme
- Laconic private-set intersection protocol
All of the above have essentially optimal parameters and similar practical efficiency. Furthermore, our laconic encryption can be preprocessed such that the online encryption step is entirely combinatorial and therefore much more efficient. Using similar techniques, we also obtain identity-based encryption with an unbounded identity space and a tight security proof (in the standard model).
2023
CRYPTO
A Framework for Statistically Sender Private OT with Optimal Rate
Statistical sender privacy (SSP) is the strongest achievable security notion for two-message oblivious transfer (OT) in the standard model, providing statistical security against malicious receivers and computational security against semi-honest senders. In this work we provide a novel construction of SSP OT from the Decisional Diffie-Hellman (DDH) and the Learning Parity with Noise (LPN) assumptions achieving (asymptotically) optimal amortized communication complexity, i.e. it achieves rate 1. Concretely, the total communication complexity for $k$ OT instances is $2k(1+o(1))$, which (asymptotically) approaches the information-theoretic lower bound. Previously, it was only known how to realize this primitive using heavy rate-1 FHE techniques [Brakerski et al., Gentry and Halevi TCC'19]. At the heart of our construction is a primitive called statistical co-PIR, essentially a public-key encryption scheme which statistically erases bits of the message in a few hidden locations. Our scheme achieves nearly optimal ciphertext size and provides statistical security against malicious receivers. Computational security against semi-honest senders holds under the DDH assumption.
2023
JOFC
Candidate iO from Homomorphic Encryption Schemes
We propose a new approach to construct general-purpose indistinguishability obfuscation (iO). Our construction is obtained via a new intermediate primitive that we call split fully homomorphic encryption (split FHE), which we show to be sufficient for constructing iO. Specifically, split FHE is FHE where decryption takes the following two-step syntactic form: (i) a secret decryption step that uses the secret key and produces a hint which is (asymptotically) shorter than the length of the encrypted message, and (ii) a public decryption step that only requires the ciphertext and the previously generated hint (and not the entire secret key) and recovers the encrypted message. In terms of security, the hints for a set of ciphertexts should not allow one to violate semantic security for any other ciphertexts. Next, we show a generic candidate construction of split FHE based on three building blocks: (i) A standard FHE scheme with linear decrypt-and-multiply (which can be instantiated with essentially all LWE-based constructions), (ii) a linearly homomorphic encryption scheme with short decryption hints (such as the Damgård-Jurik encryption scheme, based on the DCR problem), and (iii) a cryptographic hash function (which can be based on a variety of standard assumptions). Our approach is heuristic in the sense that our construction is not provably secure and makes implicit assumptions about the interplay between these underlying primitives. We show evidence that this construction is secure by providing an argument in an appropriately defined oracle model. We view our construction as a big departure from the state-of-the-art constructions, and it is in fact quite simple.
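To make the two-step decryption syntax concrete, here is a minimal Python sketch, assuming a toy hash-based one-time pad in place of an actual FHE scheme (all names illustrative). The toy hint is ciphertext-specific, as the security notion demands, but unlike a real split FHE's hint it is not shorter than the message.

```python
import hashlib, secrets

def H(sk: bytes, r: bytes) -> bytes:
    return hashlib.sha256(sk + r).digest()

def keygen() -> bytes:
    return secrets.token_bytes(32)

def encrypt(sk: bytes, m: bytes):
    r = secrets.token_bytes(32)
    pad = H(sk, r)[: len(m)]
    return r, bytes(a ^ b for a, b in zip(m, pad))

def secret_decrypt(sk: bytes, ct) -> bytes:
    # step (i): uses the secret key, outputs a ciphertext-specific hint
    r, c = ct
    return H(sk, r)[: len(c)]

def public_decrypt(ct, hint: bytes) -> bytes:
    # step (ii): needs only the ciphertext and the hint, never sk itself
    _, c = ct
    return bytes(a ^ b for a, b in zip(c, hint))

sk = keygen()
ct = encrypt(sk, b"hello")
assert public_decrypt(ct, secret_decrypt(sk, ct)) == b"hello"
```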
2022
PKC
Two-Round Oblivious Linear Evaluation from Learning with Errors
Pedro Branco Nico Döttling Paulo Mateus
Oblivious Linear Evaluation (OLE) is the arithmetic analogue of the well-known oblivious transfer primitive. It allows a sender, holding an affine function $f(x)=a+bx$ over a finite field or ring, to let a receiver learn $f(w)$ for a $w$ of the receiver's choice. In terms of security, the sender remains oblivious of the receiver's input $w$, whereas the receiver learns nothing beyond $f(w)$ about $f$. In recent years, OLE has emerged as an essential building block to construct efficient, reusable and maliciously-secure two-party computation. In this work, we present efficient two-round protocols for OLE over large fields based on the Learning with Errors (LWE) assumption, providing a full arithmetic generalization of the oblivious transfer protocol of Peikert, Vaikuntanathan and Waters (CRYPTO 2008). At the technical core of our work is a novel extraction technique which allows one to determine if a non-trivial multiple of some vector is close to a $q$-ary lattice.
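For concreteness, here is the ideal functionality such a protocol realizes, evaluated in the clear with toy parameters of our choosing (a real protocol additionally hides $w$ from the sender and everything but $f(w)$ from the receiver):

```python
# Ideal OLE functionality over GF(p), computed in the clear (toy values).
p = 2**61 - 1        # an illustrative prime field
a, b = 1234, 5678    # sender's affine function f(x) = a + b*x
w = 42               # receiver's input
f_w = (a + b * w) % p
print(f_w)           # the receiver's entire view of f: just f(w) = 239710
```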
2022
EUROCRYPT
Batch-OT with Optimal Rate
We show that it is possible to perform $n$ independent copies of $1$-out-of-$2$ oblivious transfer in two messages, where the communication complexity of the receiver and sender (each) is $n(1+o(1))$ for sufficiently large $n$. Note that this matches the information-theoretic lower bound. Prior to this work, this was only achievable by using the heavy machinery of rate-$1$ fully homomorphic encryption (Rate-$1$ FHE, Brakerski et al., TCC 2019). To achieve rate-$1$ both on the receiver's and sender's end, we use the LPN assumption, with slightly sub-constant noise rate $1/m^{\epsilon}$ for any $\epsilon>0$, together with either the DDH, QR or LWE assumptions. In terms of efficiency, our protocols only rely on linear homomorphism, as opposed to the FHE-based solution which inherently requires an expensive ``bootstrapping'' operation. We believe that in terms of efficiency we compare favorably to existing batch-OT protocols, while achieving superior communication complexity. We show similar results for Oblivious Linear Evaluation (OLE). For our DDH-based solution we develop a new technique that may be of independent interest. We show that it is possible to ``emulate'' the binary group $\mathbb{Z}_2$ (or any other small-order group) inside a prime-order group $\mathbb{Z}_p$ \emph{in a function-private manner}. That is, $\mathbb{Z}_2$ operations are mapped to $\mathbb{Z}_p$ operations such that the outcomes of the latter do not reveal additional information beyond the $\mathbb{Z}_2$ outcome. Our encoding technique uses the discrete Gaussian distribution, which to our knowledge was not done before in the context of DDH.
2022
ASIACRYPT
Universal Ring Signatures in the Standard Model
Pedro Branco Nico Döttling Stella Wohnig
Ring signatures allow a user to sign messages on behalf of an \emph{ad hoc} set of users - a ring - while hiding her identity. The original motivation for ring signatures was whistleblowing [Rivest et al. ASIACRYPT'01]: a high government employee can anonymously leak sensitive information while certifying that it comes from a reliable source, namely by signing the leak. However, essentially all known ring signature schemes require the members of the ring to publish a structured verification key that is compatible with the scheme. This creates somewhat of a paradox since, if a user does not want to be framed for whistleblowing, they will stay clear of signature schemes that support ring signatures. In this work, we formalize the concept of universal ring signatures (URS). A URS enables a user to issue a ring signature with respect to a ring of users, independently of the signature schemes they are using. In particular, none of the verification keys in the ring need to come from the same scheme. Thus, in principle, URS presents an effective solution for whistleblowing. The main goal of this work is to study the feasibility of URS, especially in the standard model (i.e. no random oracles or common reference strings). We present several constructions of URS, offering different trade-offs between the assumptions required, the level of security achieved, and the size of signatures:
- Our first construction is based on superpolynomial hardness assumptions on standard primitives. It achieves compact signatures, meaning that the size of a signature depends only logarithmically on the size of the ring and on the number of signature schemes involved.
- We then proceed to study the feasibility of constructing URS from standard polynomially-hard assumptions only. We construct a non-compact URS from witness encryption and additional standard assumptions.
- Finally, we show how to modify the non-compact construction into a compact one by relying on indistinguishability obfuscation.
2022
TCC
IBE with Incompressible Master Secret and Small Identity Secrets
Side-stepping the protection provided by cryptography, exfiltration attacks are becoming a considerable real-world threat. With the goal of mitigating the exfiltration of cryptographic keys, big-key cryptosystems have been developed over the past few years. These systems come with very large secret keys which are thus hard to exfiltrate. Typically, in such systems, the setup time must be large as it generates the large secret key. However, subsequently, the encryption and decryption operations, that must be performed repeatedly, are required to be efficient. Specifically, the encryption uses only a small public key and the decryption only accesses small ciphertext-dependent parts of the full secret key. Nonetheless, these schemes require decryption to have access to the entire secret key. Thus, using such big-key cryptosystems necessitates that users carry around large secret keys on their devices, which can be a hassle and in some cases might also render exfiltration easy. With the goal of removing this problem, in this work, we initiate the study of big-key identity-based encryption (bk-IBE). In such a system, the master secret key is allowed to be large but we require that the identity-based secret keys are short. This allows users to use the identity-based short keys as ephemeral secret keys that can be more easily carried around and allow for decrypting ciphertexts matching a particular identity, e.g. messages that were encrypted on a particular date. In particular:
- We build a new definitional framework for bk-IBE capturing a range of applications. In the case when the exfiltration is small our definition promises stronger security --- namely, an adversary can break semantic security for only a few identities, proportional to the amount of leakage it gets. In contrast, in the catastrophic case where a large fraction of the master secret key has been exfiltrated, we can still resort to a guarantee that the ciphertexts generated for a randomly chosen identity (or, an identity with enough entropy) remain protected. We demonstrate how this framework captures the best possible security guarantees.
- We show the first construction of such a bk-IBE offering strong security properties. Our construction is based on standard assumptions on groups with bilinear pairings and brings together techniques from seemingly different contexts such as leakage resilient cryptography, reusable two-round MPC, and laconic oblivious transfer. We expect our techniques to be of independent interest.
2022
TCC
Rate-1 Incompressible Encryption from Standard Assumptions
Pedro Branco Nico Döttling Jesko Dujmović
Incompressible encryption, recently proposed by Guan, Wichs and Zhandry (EUROCRYPT'22), is a novel encryption paradigm geared towards providing strong long-term security guarantees against adversaries with \emph{bounded long-term memory}. Given that the adversary forgets just a small fraction of a ciphertext, this notion provides strong security for the message encrypted therein, even if at some point in the future the entire secret key is exposed. This comes at the price of having potentially very large ciphertexts. Thus, an important efficiency measure for incompressible encryption is the message-to-ciphertext ratio (also called the rate). Guan et al. provided a low-rate instantiation of this notion from standard assumptions, and a rate-1 instantiation from indistinguishability obfuscation (iO). In this work, we propose a simple framework to build rate-1 incompressible encryption from standard assumptions. Our construction can be realized from e.g. the DDH and additionally the DCR or the LWE assumptions.
2021
PKC
Universal Proxy Re-Encryption
Nico Döttling Ryo Nishimaki
We put forward the notion of universal proxy re-encryption (UPRE). A UPRE scheme enables a proxy to convert a ciphertext under a (delegator) public key of any existing public-key encryption (PKE) scheme into another ciphertext under a (delegatee) public key of any existing PKE scheme (possibly different from the delegator one). The proxy has a re-encryption key generated from the delegator's secret key and the delegatee public key. Thus UPRE generalizes proxy re-encryption by supporting arbitrary PKE schemes and allowing ciphertexts to be converted into ones of possibly different PKE schemes. In this work, we
- provide syntax and definitions for both UPRE and a variant we call relaxed UPRE. The relaxed variant means that decryption algorithms for re-encrypted ciphertexts are slightly modified but still only use the original delegatee secret keys for decryption.
- construct a UPRE based on probabilistic indistinguishability obfuscation (PIO). It allows us to re-encrypt ciphertexts polynomially many times.
- construct relaxed UPRE from garbled circuits (GCs). We provide two variants of this construction, one which allows us to re-encrypt ciphertexts polynomially many times, and a second one which satisfies a stronger security requirement but only allows us to re-encrypt ciphertexts a constant number of times.
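A functionality-only Python sketch of this syntax, under loud assumptions: the two toy "PKE" schemes below are symmetric stand-ins (pk equals sk), and the re-encryption key contains the delegator's secret key in the clear, whereas the whole point of the constructions above (PIO or garbled circuits) is to perform this decrypt-then-encrypt step without exposing it.

```python
import secrets

# Toy "scheme A" (XOR pad) and "scheme B" (modular shift): two *different*
# schemes, which is exactly the situation UPRE is meant to handle.
def keygen_A():     k = secrets.randbits(32);     return k, k
def enc_A(pk, m):   return m ^ pk
def dec_A(sk, c):   return c ^ sk

def keygen_B():     k = secrets.randbelow(2**32); return k, k
def enc_B(pk, m):   return (m + pk) % 2**32
def dec_B(sk, c):   return (c - sk) % 2**32

def rekeygen(sk_A, pk_B):
    return (sk_A, pk_B)   # insecure placeholder: a real UPRE hides sk_A

def reenc(rk, ct_A):
    sk_A, pk_B = rk
    return enc_B(pk_B, dec_A(sk_A, ct_A))   # decrypt-then-encrypt

pk_A, sk_A = keygen_A()
pk_B, sk_B = keygen_B()
ct_B = reenc(rekeygen(sk_A, pk_B), enc_A(pk_A, 1234))
assert dec_B(sk_B, ct_B) == 1234
```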
2021
PKC
Multiparty Cardinality Testing for Threshold Private Set Intersection
Pedro Branco Nico Döttling Sihang Pu
Threshold Private Set Intersection (PSI) allows multiple parties to compute the intersection of their input sets if and only if the intersection is larger than $n-t$, where $n$ is the size of the sets and $t$ is some threshold. The main appeal of this primitive is that, in contrast to standard PSI, known upper bounds on the communication complexity only depend on the threshold $t$ and not on the sizes of the input sets. Current Threshold PSI protocols split themselves into two components: a Cardinality Testing phase, where parties decide if the intersection is larger than some threshold; and a PSI phase, where the intersection is computed. The main source of inefficiency of Threshold PSI is the former part. In this work, we present a new Cardinality Testing protocol that allows $N$ parties to check if the intersection of their input sets is larger than $n-t$. The protocol incurs $\tilde{\mathcal{O}}(Nt^2)$ communication complexity. We thus obtain a Threshold PSI scheme for $N$ parties with communication complexity $\tilde{\mathcal{O}}(Nt^2)$.
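The functionality being tested, shown in the clear with toy sets of our choosing (the paper's protocol computes this securely, with $\tilde{\mathcal{O}}(Nt^2)$ communication):

```python
# Cardinality testing in the clear: N parties learn only whether their sets
# intersect in more than n - t elements (toy sets, N = 3, n = 5, t = 2).
from functools import reduce

sets = [{1, 2, 3, 4, 5}, {1, 2, 3, 4, 6}, {1, 2, 3, 4, 7}]
t = 2
n = len(sets[0])
intersection = reduce(set.intersection, sets)
print(len(intersection) > n - t)   # True: |intersection| = 4 > 3
```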
2021
TCC
On the Impossibility of Purely Algebraic Signatures
The existence of one-way functions implies secure digital signatures, but not public-key encryption (at least in a black-box setting). Somewhat surprisingly, though, efficient public-key encryption schemes appear to be much easier to construct from concrete algebraic assumptions (such as factoring or Diffie-Hellman-like assumptions) than efficient digital signature schemes. In this work, we provide one reason for this apparent difficulty to construct efficient signature schemes. Specifically, we prove that a wide range of algebraic signature schemes (in which verification essentially checks a number of linear equations over a group) fall to conceptually simple linear algebra attacks. In fact, we prove that in an algebraic signature scheme, sufficiently many signatures can be linearly combined into a signature on a fresh message. We present attacks both in known-order and hidden-order groups (although in hidden-order settings, we have to restrict our definition of algebraic signatures a little). More explicitly, we show:
– the insecurity of all algebraic signature schemes in Maurer's generic group model, as long as the signature schemes do not rely on other cryptographic assumptions, such as hash functions.
– the insecurity of a natural class of signatures in hidden-order groups, where verification consists of linear equations over group elements.
We believe that this highlights the crucial role of public verifiability in digital signature schemes. Namely, while public-key encryption schemes do not require any publicly verifiable structure on ciphertexts, it is exactly this structure on signatures that invites attacks like ours and makes it hard to construct efficient signatures.
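A toy instance of this linear-combination attack (the "scheme" below is an illustrative linear MAC over $\mathbb{Z}_p$, not one from the paper): because verification checks a single linear equation, any affine combination of signed messages comes with a valid signature.

```python
# Toy attack: s(m) = x*m + y (mod p) verifies a linear equation, so signatures
# on m1, m2 combine into a signature on any m3 = a*m1 + b*m2 with a + b = 1.
p = 101
x, y = 17, 23                        # secret signing key

def sign(m):      return (x * m + y) % p
def verify(m, s): return s == (x * m + y) % p

m1, m2 = 5, 9
s1, s2 = sign(m1), sign(m2)

a, b = 2, p - 1                      # a + b = 1 (mod p)
m3 = (a * m1 + b * m2) % p           # = 1, a message never signed
s3 = (a * s1 + b * s2) % p           # forged signature, by linearity
assert verify(m3, s3) and m3 not in (m1, m2)
```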
2021
TCC
Laconic Private Set Intersection and Applications
Consider a server with a \emph{large} set $S$ of strings $\{x_1, x_2, \ldots, x_N\}$ that would like to publish a \emph{small} hash $h$ of its set $S$ such that any client with a string $y$ can send the server a \emph{short} message allowing it to learn $y$ if $y \in S$ and nothing otherwise. In this work, we study this problem of two-round private set intersection (PSI) with low (asymptotically optimal) communication cost, or what we call \emph{laconic} private set intersection ($\ell$PSI) and its extensions. This problem is inspired by the recent general frameworks for laconic cryptography [Cho et al. CRYPTO 2017, Quach et al. FOCS'18]. We start by showing the first feasibility result for realizing $\ell$PSI based on the CDH assumption, or LWE with polynomial noise-to-modulus ratio. However, these feasibility results use expensive non-black-box cryptographic techniques leading to significant inefficiency. Next, with the goal of avoiding these inefficient techniques, we give a construction of $\ell$PSI schemes making only black-box use of cryptographic functions. Our construction is secure against semi-honest receivers and malicious senders, and is reusable in the sense that the receiver's message can be reused across any number of executions of the protocol. The scheme is secure under the $\phi$-hiding, decisional composite residuosity and subgroup decision assumptions. Finally, we show natural applications of $\ell$PSI to realizing a semantically-secure encryption scheme that supports detection of encrypted messages belonging to a set of ``illegal'' messages (e.g., an illegal video) circulating online. Over the past few years, significant effort has gone into realizing laconic cryptographic protocols. Nonetheless, our work provides the first black-box constructions of such protocols for a natural application setting.
2021
TCC
Rate-1 Quantum Fully Homomorphic Encryption
Secure function evaluation (SFE) allows Alice to publish an encrypted version of her input $m$ such that Bob (holding a circuit $C$) can send a single message that reveals $C(m)$ to Alice, and nothing more. Security is required to hold against malicious parties, that may behave arbitrarily. In this work we study the notion of SFE in the quantum setting, where Alice outputs an encrypted quantum state $|\psi\rangle$ and learns $C(|\psi\rangle)$ after receiving Bob's message. We show that, assuming the quantum hardness of the learning with errors problem (LWE), there exists an SFE protocol for quantum computation with communication complexity $(\lvert\,|\psi\rangle\,\rvert + \lvert\,C(|\psi\rangle)\,\rvert)\cdot(1+o(1))$, which is nearly optimal. This result is obtained by two main technical steps, which might be of independent interest. Specifically, we show (i) a construction of a rate-1 quantum fully-homomorphic encryption and (ii) a generic transformation to achieve malicious circuit privacy in the quantum setting.
2020
EUROCRYPT
Two-Round Oblivious Transfer from CDH or LPN
We show a new general approach for constructing maliciously-secure two-round oblivious transfer (OT). Specifically, we provide a generic sequence of transformations to upgrade a very basic notion of two-round OT, which we call elementary OT, to UC-secure OT. We then give simple constructions of elementary OT under the Computational Diffie-Hellman (CDH) assumption or the Learning Parity with Noise (LPN) assumption, yielding the first constructions of malicious (UC-secure) two-round OT under these assumptions. Since two-round OT is complete for two-round 2-party and multi-party computation in the malicious setting, we also achieve the first constructions of the latter under these assumptions.
2020
EUROCRYPT
Hardness of LWE on General Entropic Distributions
Zvika Brakerski Nico Döttling
The hardness of the Learning with Errors (LWE) problem is by now a cornerstone of the cryptographic landscape, allowing to construct cryptographic schemes with properties unknown under other assumptions, and being conjectured to be resilient to quantum attacks. LWE is essentially the task of solving a noisy system of random linear equations over uniformly random secret variables (“the LWE secret”), evaluated modulo some integer. In applications the secret variables usually correspond to the secret key of the cryptographic scheme. It is therefore of great importance to understand what happens when the secret variables are not sampled uniformly (but still have some entropy). This is relevant for settings where an adversary manages to obtain partial information on the secret (a.k.a. key leakage), for various theoretical applications, and also for practical use where for efficiency or convenience it is easier to sample the secret from some non-uniform distribution. This so-called “Entropic LWE” problem has been studied in a number of works, starting with Goldwasser et al. (ICS 2010). However, so far it was only known how to prove the hardness of Entropic LWE for secret distributions supported inside a ball of small radius. In this work we resolve the hardness of Entropic LWE with arbitrary long secrets, in the following sense. We show an entropy bound that guarantees the security of arbitrary Entropic LWE. This bound is higher than what is required in the ball-bounded setting, but we show that this is essentially tight. Tightness is shown unconditionally for highly-composite moduli, and using black-box impossibility for arbitrary moduli. Technically, we show that the entropic hardness of LWE relies on a simple to describe lossiness property of the distribution of secrets itself. This is simply the probability of recovering a random sample $s$ from this distribution, given $s + e$, where $e$ is Gaussian noise (i.e. the quality of the distribution of secrets as an error correcting code for Gaussian noise). We hope that this characterization will make it easier to derive entropic LWE results in the future. We also use our techniques to show new results for the ball-bounded setting, essentially showing that under a strong enough assumption even polylogarithmic entropy suffices.
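A toy numeric illustration of this lossiness quantity, in an assumed one-dimensional model of our own (not from the paper): the probability of recovering $s$ from $s+e$ under maximum-likelihood decoding, for a well-spread versus a clustered secret distribution.

```python
# Monte Carlo estimate of Pr[recover s from s + e], e ~ Gaussian(0, sigma).
import random

def recovery_rate(support, sigma=1.0, trials=10_000):
    hits = 0
    for _ in range(trials):
        s = random.choice(support)
        noisy = s + random.gauss(0, sigma)
        # maximum-likelihood decoding: nearest point of the support
        hits += min(support, key=lambda t: abs(t - noisy)) == s
    return hits / trials

print(recovery_rate([0, 10]))   # far-apart secrets: ~1.0 (not lossy)
print(recovery_rate([0, 1]))    # close secrets: ~0.69 (noticeably lossy)
```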
2020
EUROCRYPT
Candidate iO From Homomorphic Encryption Schemes
We propose a new approach to construct general-purpose indistinguishability obfuscation (iO). Our construction is obtained via a new intermediate primitive that we call split fully-homomorphic encryption (split FHE), which we show to be sufficient for constructing iO. Specifically, split FHE is FHE where decryption takes the following two-step syntactic form: (i) A secret decryption step uses the secret key and produces a hint which is (asymptotically) shorter than the length of the encrypted message, and (ii) a public decryption step that only requires the ciphertext and the previously generated hint (and not the entire secret key), and recovers the encrypted message. In terms of security, the hints for a set of ciphertexts should not allow one to violate semantic security for any other ciphertexts. Next, we show a generic candidate construction of split FHE based on three building blocks: (i) A standard FHE scheme with linear decrypt-and-multiply (which can be instantiated with essentially all LWE-based constructions), (ii) a linearly homomorphic encryption scheme with short decryption hints (such as the Damgård-Jurik encryption scheme, based on the DCR problem), and (iii) a cryptographic hash function (which can be based on a variety of standard assumptions). Our approach is heuristic in the sense that our construction is not provably secure and makes implicit assumptions about the interplay between these underlying primitives. We show evidence that this construction is secure by providing an argument in an appropriately defined oracle model. We view our construction as a big departure from the state-of-the-art constructions, and it is in fact quite simple.
2020
TCC
Constant Ciphertext-Rate Non-Committing Encryption from Standard Assumptions
Non-committing encryption (NCE) is a type of public key encryption which comes with the ability to equivocate ciphertexts to encryptions of arbitrary messages, i.e., it allows one to find coins for key generation and encryption which ``explain'' a given ciphertext as an encryption of any message. NCE is the cornerstone to construct adaptively secure multiparty computation [Canetti et al. STOC'96] and can be seen as the quintessential notion of security for public key encryption to realize ideal communication channels. A large body of literature investigates what is the best message-to-ciphertext ratio (i.e., the rate) that one can hope to achieve for NCE. In this work we propose a near complete resolution to this question and we show how to construct NCE with constant rate in the plain model from a variety of assumptions, such as the hardness of the learning with errors (LWE), the decisional Diffie-Hellman (DDH), or the quadratic residuosity (QR) problem. Prior to our work, constructing NCE with constant rate required a trusted setup and indistinguishability obfuscation [Canetti et al. ASIACRYPT'17].
2020
TCC
Lossiness and Entropic Hardness for Ring-LWE
Zvika Brakerski Nico Döttling
The hardness of the Ring Learning with Errors problem (RLWE) is a central building block for efficiency-oriented lattice-based cryptography. Many applications use an ``entropic'' variant of the problem where the so-called ``secret'' is not distributed uniformly as prescribed but instead comes from some distribution with sufficient min-entropy. However, the hardness of the entropic variant has not been substantiated thus far. For standard LWE (not over rings) entropic results are known, using a ``lossiness approach'' but it was not known how to adapt this approach to the ring setting. In this work we present the first such results, where entropic security is established either under RLWE or under the Decisional Small Polynomial Ratio (DSPR) assumption which is a mild variant of the NTRU assumption. In the context of general entropic distributions, our results in the ring setting essentially match the known lower bounds (Bolboceanu et al., Asiacrypt 2019; Brakerski and Döttling, Eurocrypt 2020).
2020
ASIACRYPT
A Combinatorial Approach to Quantum Random Functions
Nico Döttling Giulio Malavolta Sihang Pu
Quantum pseudorandom functions (QPRFs) extend the classical security of a PRF by allowing the adversary to issue queries on input superpositions. Zhandry [Zhandry, FOCS 2012] showed a separation between the two notions and proved that common construction paradigms are also quantum secure, albeit with a new ad-hoc analysis. In this work, we revisit the question of constructing QPRFs and propose a new method starting from small-domain (classical) PRFs: At the heart of our approach is a new domain-extension technique based on bipartite expanders. Interestingly, our analysis is almost entirely classical. As a corollary of our main theorem, we obtain the first (approximate) key-homomorphic quantum PRF based on the quantum intractability of the learning with errors problem.
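A schematic Python sketch of the domain-extension shape this suggests, under loudly stated stand-ins (a hash-seeded random graph in place of a bipartite expander, HMAC-SHA256 as the small-domain PRF); it fixes only the data flow and is not the paper's construction or analysis.

```python
import hashlib, hmac

SMALL_DOMAIN = 1024   # input space of the small-domain PRF (illustrative)
DEGREE = 8            # left-degree of the bipartite graph (illustrative)

def small_prf(key: bytes, i: int) -> bytes:
    return hmac.new(key, i.to_bytes(8, "big"), hashlib.sha256).digest()

def neighbors(x: int) -> list[int]:
    # public, key-independent neighborhood; a real construction uses an expander
    h = hashlib.sha256(b"graph" + x.to_bytes(16, "big")).digest()
    return [int.from_bytes(h[4 * j : 4 * j + 4], "big") % SMALL_DOMAIN
            for j in range(DEGREE)]

def big_prf(key: bytes, x: int) -> bytes:
    # combine the small-domain PRF values at x's neighbors into one output
    acc = hashlib.sha256()
    for i in neighbors(x):
        acc.update(small_prf(key, i))
    return acc.digest()

print(big_prf(b"k" * 16, 123456789).hex())
```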
2019
EUROCRYPT
Continuous Non-Malleable Codes in the 8-Split-State Model
Non-malleable codes (NMCs), introduced by Dziembowski, Pietrzak and Wichs [ICS 2010], provide a useful message integrity guarantee in situations where traditional error-correction (and even error-detection) is impossible; for example, when the attacker can completely overwrite the encoded message. NMCs have emerged as a fundamental object at the intersection of coding theory and cryptography. In particular, progress in the study of non-malleable codes and the related notion of non-malleable extractors has led to new insights and progress on even more fundamental problems like the construction of multi-source randomness extractors. A large body of the recent work has focused on various constructions of non-malleable codes in the split-state model. Many variants of NMCs have been introduced in the literature, e.g., strong NMCs, super strong NMCs and continuous NMCs. The most general, and hence also the most useful notion among these is that of continuous non-malleable codes, that allows for continuous tampering by the adversary. We present the first efficient information-theoretically secure continuously non-malleable code in the constant split-state model. We believe that our main technical result could be of independent interest and some of the ideas could in future be used to make progress on other related questions.
2019
EUROCRYPT
Incremental Proofs of Sequential Work
A proof of sequential work allows a prover to convince a verifier that a certain amount of sequential steps have been computed. In this work we introduce the notion of incremental proofs of sequential work where a prover can carry on the computation done by the previous prover incrementally, without affecting the resources of the individual provers or the size of the proofs. To date, the most efficient instance of proofs of sequential work [Cohen and Pietrzak, Eurocrypt 2018] for $N$ steps requires the prover to have $\sqrt{N}$ memory and to run for $N + \sqrt{N}$ steps. Using incremental proofs of sequential work we can bring down the prover's storage complexity to $\log N$ and its running time to $N$. We propose two different constructions of incremental proofs of sequential work: Our first scheme requires a single processor and introduces a poly-logarithmic factor in the proof size when compared with the proposals of Cohen and Pietrzak. Our second scheme assumes $\log N$ parallel processors but brings down the overhead of the proof size to a factor of 9. Both schemes are simple to implement and only rely on hash functions (modelled as random oracles).
2019
EUROCRYPT
Ring Signatures: Logarithmic-Size, No Setup—from Standard Assumptions
Ring signatures allow for creating signatures on behalf of an ad hoc group of signers, hiding the true identity of the signer among the group. A natural goal is to construct a ring signature scheme for which the signature size is short in the number of ring members. Moreover, such a construction should not rely on a trusted setup and be proven secure under falsifiable standard assumptions. Despite many years of research this question is still open. In this paper, we present the first construction of size-optimal ring signatures which do not rely on a trusted setup or the random oracle heuristic. Specifically, our scheme can be instantiated from standard assumptions and the size of signatures grows only logarithmically in the number of ring members. We also extend our techniques to the setting of linkable ring signatures, where signatures created using the same signing key can be linked.
2019
CRYPTO
Trapdoor Hash Functions and Their Applications
We introduce a new primitive, called trapdoor hash functions (TDH), which are hash functions $\mathsf{H}: \{0,1\}^n \rightarrow \{0,1\}^\lambda$ with additional trapdoor function-like properties. Specifically, given an index $i \in [n]$, TDHs allow for sampling an encoding key $\mathsf{ek}$ (that hides $i$) along with a corresponding trapdoor. Furthermore, given $\mathsf{H}(x)$, a hint value $\mathsf{E}(\mathsf{ek}, x)$, and the trapdoor corresponding to $\mathsf{ek}$, the $i$-th bit of $x$ can be efficiently recovered. In this setting, one of our main questions is: How small can the hint value $\mathsf{E}(\mathsf{ek}, x)$ be? We obtain constructions where the hint is only one bit long based on DDH, QR, DCR, or LWE. This primitive opens a floodgate of applications for low-communication secure computation. We mainly focus on two-message protocols between a receiver and a sender, with private inputs $x$ and $y$, resp., where the receiver should learn $f(x, y)$. We wish to optimize the (download) rate of such protocols, namely the asymptotic ratio between the size of the output and the sender's message. Using TDHs, we obtain:
1. The first protocols for (two-message) rate-1 string OT based on DDH, QR, or LWE. This has several useful consequences, such as:
(a) The first constructions of PIR with communication cost poly-logarithmic in the database size based on DDH or QR. These protocols are in fact rate-1 when considering block PIR.
(b) The first constructions of a semi-compact homomorphic encryption scheme for branching programs, where the encrypted output grows only with the program length, based on DDH or QR.
(c) The first constructions of lossy trapdoor functions with input-to-output ratio approaching 1 based on DDH, QR or LWE.
(d) The first constant-rate LWE-based construction of a 2-message “statistically sender-private” OT protocol in the plain model.
2. The first rate-1 protocols (under any assumption) for $n$ parallel OTs and matrix-vector products from DDH, QR or LWE.
We further consider the setting where $f$ evaluates a RAM program $y$ with running time $T \ll |x|$ on $x$. We obtain the first protocols with communication sublinear in the size of $x$, namely $T \cdot \sqrt{|x|}$ or $T \cdot \sqrt[3]{|x|}$, based on DDH or, resp., pairings (and correlated-input secure hash functions).
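An interface-level Python toy matching this syntax (illustrative names; here $\mathsf{ek}$ does not actually hide the index $i$ and there is no security, the point is only the shape of the algorithms and the one-bit hint):

```python
import hashlib, secrets

def sample(i: int):
    mask = secrets.randbits(1)
    ek = (i, mask)        # a real TDH encoding key computationally hides i
    td = mask             # the corresponding trapdoor
    return ek, td

def Hash(x: str) -> bytes:            # H: {0,1}^n -> {0,1}^lambda
    return hashlib.sha256(x.encode()).digest()

def E(ek, x: str) -> int:             # the hint: a single bit
    i, mask = ek
    return int(x[i]) ^ mask

def recover(td: int, hint: int) -> int:
    # a real scheme also uses H(x) here; the toy needs only the trapdoor
    return hint ^ td

x = "0110100111"
h = Hash(x)                           # the short published hash of x
ek, td = sample(7)
assert recover(td, E(ek, x)) == int(x[7])
```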
2019
TCC
Leveraging Linear Decryption: Rate-1 Fully-Homomorphic Encryption and Time-Lock Puzzles
We show how to combine a fully-homomorphic encryption scheme with linear decryption and a linearly-homomorphic encryption scheme to obtain constructions with new properties. Specifically, we present the following new results.
(1) Rate-1 Fully-Homomorphic Encryption: We construct the first scheme with message-to-ciphertext length ratio (i.e., rate) $1-\sigma$ for $\sigma = o(1)$. Our scheme is based on the hardness of the Learning with Errors (LWE) problem and $\sigma$ is proportional to the noise-to-modulus ratio of the assumption. Our building block is a construction of a new high-rate linearly-homomorphic encryption. One application of this result is the first general-purpose secure function evaluation protocol in the preprocessing model where the communication complexity is within additive factor of the optimal insecure protocol.
(2) Fully-Homomorphic Time-Lock Puzzles: We construct the first time-lock puzzle where one can evaluate any function over a set of puzzles without solving them, from standard assumptions. Prior work required the existence of sub-exponentially hard indistinguishability obfuscation.
2019
ASIACRYPT
Efficient UC Commitment Extension with Homomorphism for Free (and Applications)
Homomorphic universally composable (UC) commitments allow for the sender to reveal the result of additions and multiplications of values contained in commitments without revealing the values themselves while assuring the receiver of the correctness of such computation on committed values. In this work, we construct essentially optimal additively homomorphic UC commitments from any (not necessarily UC or homomorphic) extractable commitment, while the previous best constructions require oblivious transfer. We obtain amortized linear computational complexity in the length of the input messages and rate 1. Next, we show how to extend our scheme to also obtain multiplicative homomorphism at the cost of asymptotic optimality but retaining low concrete complexity for practical parameters. Moreover, our techniques yield public coin protocols, which are compatible with the Fiat-Shamir heuristic. These results come at the cost of realizing a restricted version of the homomorphic commitment functionality where the sender is allowed to perform any number of commitments and operations on committed messages but is only allowed to perform a single batch opening of a number of commitments. Although this functionality seems restrictive, we show that it can be used as a building block for more efficient instantiations of recent protocols for secure multiparty computation and zero knowledge non-interactive arguments of knowledge.
2019
ASIACRYPT
Rate-1 Trapdoor Functions from the Diffie-Hellman Problem
Trapdoor functions (TDFs) are one of the fundamental building blocks in cryptography. Studying the underlying assumptions and the efficiency of the resulting instantiations is therefore of both theoretical and practical interest. In this work we improve the input-to-image rate of TDFs based on the Diffie-Hellman problem. Specifically, we present:
(a) A rate-1 TDF from the computational Diffie-Hellman (CDH) assumption, improving the result of Garg, Gay, and Hajiabadi [EUROCRYPT 2019], which achieved linear-size outputs but with large constants. Our techniques combine non-binary alphabets and high-rate error-correcting codes over large fields.
(b) A rate-1 deterministic public-key encryption satisfying block-source security from the decisional Diffie-Hellman (DDH) assumption. While this question was recently settled by Döttling et al. [CRYPTO 2019], our scheme is conceptually simpler and concretely more efficient. We demonstrate this fact by implementing our construction.
2018
PKC
New Constructions of Identity-Based and Key-Dependent Message Secure Encryption Schemes
Recently, Döttling and Garg (CRYPTO 2017) showed how to build identity-based encryption (IBE) from a novel primitive termed Chameleon Encryption, which can in turn be realized from simple number theoretic hardness assumptions such as the computational Diffie-Hellman assumption (in groups without pairings) or the factoring assumption. In a follow-up work (TCC 2017), the same authors showed that IBE can also be constructed from a slightly weaker primitive called One-Time Signatures with Encryption (OTSE). In this work, we show that OTSE can be instantiated from hard learning problems such as the Learning With Errors (LWE) and the Learning Parity with Noise (LPN) problems. This immediately yields the first IBE construction from the LPN problem and a construction based on a weaker LWE assumption compared to previous works. Finally, we show that the notion of one-time signatures with encryption is also useful for the construction of key-dependent-message (KDM) secure public-key encryption. In particular, our results imply that a KDM-secure public key encryption can be constructed from any KDM-secure secret-key encryption scheme and any public-key encryption scheme.
2018
TCC
Two-Message Statistically Sender-Private OT from LWE
Zvika Brakerski Nico Döttling
We construct a two-message oblivious transfer (OT) protocol without setup that guarantees statistical privacy for the sender even against malicious receivers. Receiver privacy is game based and relies on the hardness of learning with errors (LWE). This flavor of OT has been a central building block for minimizing the round complexity of witness indistinguishable and zero knowledge proof systems, non-malleable commitment schemes and multi-party computation protocols, as well as for achieving circuit privacy for homomorphic encryption in the malicious setting. Prior to this work, all candidates in the literature from standard assumptions relied on number theoretic assumptions and were thus insecure in the post-quantum setting. This work provides the first (presumed) post-quantum secure candidate and thus makes it possible to instantiate the aforementioned applications in a post-quantum secure manner. Technically, we rely on the transference principle: Either a lattice or its dual must have short vectors. Short vectors, in turn, can be translated to information loss in encryption. Thus encrypting one message with respect to the lattice and one with respect to its dual guarantees that at least one of them will be statistically hidden.
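For background, a standard quantitative form of the transference principle invoked here (Banaszczyk's theorem, stated from general knowledge rather than from the abstract): for any full-rank $n$-dimensional lattice $\Lambda$ with dual $\Lambda^*$,

\[ \lambda_1(\Lambda) \cdot \lambda_n(\Lambda^*) \;\le\; n, \]

and since $\lambda_1(\Lambda^*) \le \lambda_n(\Lambda^*)$, this gives $\min\bigl(\lambda_1(\Lambda), \lambda_1(\Lambda^*)\bigr) \le \sqrt{n}$: at least one of the two lattices contains a nonzero vector of length at most $\sqrt{n}$, which is the sense in which "either a lattice or its dual must have short vectors."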
2017
EUROCRYPT
2017
CRYPTO
2017
CRYPTO
2017
TCC
2016
CRYPTO
2016
CRYPTO
2015
TCC
2015
PKC
2015
EUROCRYPT
2015
CRYPTO
2013
TCC
2013
EUROCRYPT
2012
ASIACRYPT
2011
TCC

Program Committees

Asiacrypt 2023
TCC 2023
Crypto 2022
Asiacrypt 2022
PKC 2022
TCC 2020
TCC 2019
PKC 2019
Crypto 2019
Asiacrypt 2018
Eurocrypt 2018
PKC 2018
Asiacrypt 2017
Crypto 2017
PKC 2017
Asiacrypt 2016
Eurocrypt 2016
Asiacrypt 2015
TCC 2015