International Association for Cryptologic Research

IACR News Central

Get an update on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.

You can also access the full news archive.

Further sources for finding out about changes are CryptoDB, ePrint RSS, ePrint Web, and the Event calendar (iCal).

2015-04-29
00:17 [Pub][ePrint] Privately Evaluating Decision Trees and Random Forests, by David J. Wu and Tony Feng and Michael Naehrig and Kristin Lauter

  Decision trees and random forests are common classifiers with widespread use. In this paper, we develop two protocols for privately evaluating decision trees and random forests. We operate in the standard two-party setting where the server holds a model (either a tree or a forest), and the client holds an input (a feature vector). At the conclusion of the protocol, the client learns only the model's output on its input and a few generic parameters concerning the model; the server learns nothing. The first protocol we develop provides security against semi-honest adversaries. Next, we show an extension of the semi-honest protocol that obtains one-sided security against malicious adversaries. We implement both protocols and show that both variants are able to process trees with several hundred decision nodes in just a few seconds and a modest amount of bandwidth. Compared to previous semi-honest protocols for private decision tree evaluation, we demonstrate tenfold improvements in computation and bandwidth.
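
As a plaintext reference for the functionality evaluated under the protocol, here is a minimal Python sketch of (non-private) decision tree evaluation in the two-party framing above; the class names and toy model are illustrative assumptions, and the actual protocol performs these comparisons obliviously so that neither the evaluation path nor the model is revealed.

    # Plaintext reference for the functionality that the protocol evaluates
    # privately: the server's model is a binary decision tree, the client's
    # input is a feature vector, and only the leaf label should be revealed.
    # (Illustrative sketch only; names and the toy model are hypothetical.)
    from dataclasses import dataclass
    from typing import Union, List

    @dataclass
    class Leaf:
        label: int                      # classification result

    @dataclass
    class Node:
        feature: int                    # index into the client's feature vector
        threshold: float                # server-chosen split threshold
        left: Union["Node", Leaf]       # taken when feature value < threshold
        right: Union["Node", Leaf]      # taken otherwise

    def evaluate(tree: Union[Node, Leaf], features: List[float]) -> int:
        """Walk the tree; a private protocol hides the path and the model."""
        while isinstance(tree, Node):
            tree = tree.left if features[tree.feature] < tree.threshold else tree.right
        return tree.label

    # Toy example: a depth-2 tree over two features.
    model = Node(0, 0.5, Leaf(0), Node(1, 2.0, Leaf(1), Leaf(2)))
    print(evaluate(model, [0.7, 1.3]))  # -> 1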





2015-04-28
14:31 [Job][New] PhD student, University College London

  Applications are invited for a PhD position in the field of cryptography in the Information Security Group at the UCL Department of Computer Science, to be supervised by Dr. Sarah Meiklejohn. The position is funded partially by Microsoft Research (MSR) Cambridge and will be co-supervised by Dr. Markulf Kohlweiss, an MSR researcher.

The successful applicant will study the topic of controlled malleability in non-interactive zero-knowledge proofs, which has already been demonstrated to provide useful constructions such as compact verifiable shuffles, delegatable anonymous credentials, and more. The aim of this project is to improve on existing methods for constructing controlled-malleable primitives, both in terms of efficiency and in terms of the variety of cryptographic operations that can be performed, as well as to improve their usability and interoperability with existing cryptographic primitives.

We expect candidates to have a strong degree in Computer Science, Mathematics, or a related subject at the MSc level. A good mathematics background and a willingness to become fluent in modern cryptographic constructions are necessary. The studentship is open to all, but fully covers only UK/EU fees.

05:20 [Event][New] SSR 2015: Security Standardisation Research 2015

  Submission: 26 June 2015
Notification: 10 September 2015
From 15 December to 16 December 2015
Location: Tokyo, Japan
More Information: http://ssr2015.com


00:17 [Pub][ePrint] Cluster Computing in Zero Knowledge, by Alessandro Chiesa and Eran Tromer and Madars Virza

  Large computations, when amenable to distributed parallel execution, are often executed on computer clusters, for scalability and cost reasons. Such computations are used in many applications, including, to name but a few, machine learning, webgraph mining, and statistical machine translation. Oftentimes, though, the input data is private and only the result of the computation can be published. Zero-knowledge proofs would allow one, in such settings, to verify the correctness of the output without leaking (additional) information about the input.

In this work, we investigate theoretical and practical aspects of *zero-knowledge proofs for cluster computations*. We design, build, and evaluate zero-knowledge proof systems for which:

(i) a proof attests to the correct execution of a cluster computation; and

(ii) generating the proof is itself a cluster computation that is similar in structure and complexity to the original one.

Concretely, we focus on MapReduce, an elegant and popular form of cluster computing.
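
For readers unfamiliar with the model, the following minimal single-machine Python sketch shows the shape of a MapReduce computation (word count); the helper names are generic illustrations, not from the paper, whose point is that zero-knowledge proof generation can mirror this mapper/shuffle/reducer structure.

    # Minimal MapReduce-style word count (illustrative only): mappers emit
    # key/value pairs, a shuffle groups values by key, and reducers aggregate.
    # A distributed SNARK for MapReduce would prove each mapper/reducer step
    # and the consistency of the shuffle between them.
    from collections import defaultdict
    from typing import Dict, Iterable, List, Tuple

    def mapper(document: str) -> Iterable[Tuple[str, int]]:
        for word in document.split():
            yield (word, 1)

    def shuffle(pairs: Iterable[Tuple[str, int]]) -> Dict[str, List[int]]:
        groups: Dict[str, List[int]] = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reducer(key: str, values: List[int]) -> Tuple[str, int]:
        return (key, sum(values))

    documents = ["a rose is a rose", "a proof is a proof"]
    mapped = [pair for doc in documents for pair in mapper(doc)]
    result = dict(reducer(k, v) for k, v in shuffle(mapped).items())
    print(result)  # {'a': 4, 'rose': 2, 'is': 2, 'proof': 2}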

Previous zero-knowledge proof systems can in principle prove a MapReduce computation's correctness, via a monolithic NP statement that reasons about all mappers, all reducers, and shuffling. However, it is not clear how to generate the proof for such monolithic statements via parallel execution by a distributed system. Our work demonstrates, by theory and implementation, that proof generation can be similar in structure and complexity to the original cluster computation.

Our main technique is a bootstrapping theorem for succinct non-interactive arguments of knowledge (SNARKs) that shows how, via recursive proof composition and Proof-Carrying Data, it is possible to transform any SNARK into a *distributed SNARK for MapReduce* which proves, piecewise and in a distributed way, the correctness of every step in the original MapReduce computation as well as their global consistency.





2015-04-24
21:17 [Pub][ePrint] Cryptography from Post-Quantum Assumptions, by Raza Ali Kazmi

  In this thesis we present our contributions to the field of post-quantum cryptography. We introduce a new notion of {\em weakly Random-Self-Reducible} public-key cryptosystem and show how it can be used to implement secure Oblivious Transfer. We also show that two recent (post-quantum) cryptosystems can be considered {\em weakly Random-Self-Reducible}. We introduce a new problem called the Isometric Lattice Problem and reduce graph isomorphism and linear code equivalence to this problem. We also show that this problem has a perfect zero-knowledge interactive proof with respect to a malicious verifier; this is the only hard problem in lattices that is known to have this property.
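
As background on the classical (non-weak) notion, the textbook example of random self-reducibility is the discrete logarithm problem: an instance can be re-randomized so that an algorithm solving random instances solves any instance. The Python sketch below illustrates that classical reduction only; it is not the thesis's weaker notion for public-key cryptosystems, and the brute-force oracle is a stand-in.

    # Classical random self-reduction for discrete log (illustration only).
    # Given y = g^x mod p, re-randomize to a uniformly random instance y',
    # solve it with an (assumed) oracle, and shift the answer back.
    import secrets

    def dlog_random_self_reduce(g: int, y: int, p: int, q: int, oracle) -> int:
        """Recover x with y = g^x mod p, using an oracle that solves a
        uniformly random instance; q is the order of g."""
        r = secrets.randbelow(q)              # fresh randomizer
        y_prime = (y * pow(g, r, p)) % p      # y' = g^(x + r), uniformly distributed
        x_prime = oracle(g, y_prime, p)       # oracle returns (x + r) mod q
        return (x_prime - r) % q

    # Tiny demo with p = 23, g = 5 (order q = 22) and a brute-force "oracle".
    p, g, q = 23, 5, 22
    brute_force = lambda g, y, p: next(e for e in range(q) if pow(g, e, p) == y)
    x = 13
    y = pow(g, x, p)
    print(dlog_random_self_reduce(g, y, p, q, brute_force))  # -> 13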



18:17 [Pub][ePrint] Bounds on surmising remixed keys, by Daniel R. L. Brown

  A remixed key is derived from a secret source key by applying a public but unpredictable random function to the source key. A remixed key models a key derived from a shared secret and a public unpredictable salt using a common, deterministic, pseudorandom function, somewhat like TLS record-layer keys. This report tries to validate the intuition that remixed keys are not easy to surmise, in other words, that remixing does not introduce an exploitable spike in the probability distribution of the remixed key. The report provides pencil-and-paper proofs of numerical bounds on the probability that an adversary can surmise a remixed key, assuming a uniformly random source key and remix function. The proofs are derived from a proof of an asymptotic result in probability theory in a textbook by Shoup.
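
One concrete way such remixing is commonly instantiated in practice (my illustration; the report itself models the remix function as uniformly random) is an HMAC-based extraction keyed with the public salt, in the style of HKDF-Extract:

    # Illustrative remixing of a secret source key with a public, unpredictable
    # salt, HKDF-Extract style: remixed = HMAC-SHA256(salt, source_key).
    # The report models the remix function as a uniformly random function;
    # this is just one common, deterministic, pseudorandom instantiation.
    import hashlib
    import hmac
    import secrets

    def remix_key(source_key: bytes, public_salt: bytes) -> bytes:
        return hmac.new(public_salt, source_key, hashlib.sha256).digest()

    source_key = secrets.token_bytes(32)   # shared secret (kept private)
    public_salt = secrets.token_bytes(32)  # public but unpredictable salt
    remixed = remix_key(source_key, public_salt)
    print(remixed.hex())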



18:00 [Job][New] Post-Doctoral Fellowships in Lattice-Based Cryptography, Ecole Normale Superieure de Lyon

  We are seeking candidates for 2 post-doctoral research positions, in the areas of lattice-based cryptography and lattice algorithms. The positions are for up to 3 years.

Interested applicants should provide a detailed resume and references before June 15th, 2015.



03:17 [Pub][ePrint] Security Analysis of PRINCE, by Jeremy Jean and Ivica Nikolic and Thomas Peyrin and Lei Wang and Shuang Wu

  In this article, we provide the first third-party security analysis of the PRINCE lightweight block cipher and the underlying PRINCE_core. First, while no claim was made by the authors regarding related-key attacks, we show that one can attack the full cipher with only a single pair of related keys, and then reuse the same idea to derive an attack in the single-key model for the full PRINCE_core for several instances of the $\alpha$ parameter (yet not the one randomly chosen by the designers). We also show how to exploit the structural linear relations that exist for PRINCE in order to obtain a key-recovery attack that slightly breaks the security claims for the full cipher. We analyze the application of integral attacks to get the best known key-recovery attack on a reduced version of the PRINCE cipher. Finally, we provide time-memory-data tradeoffs that require only known plaintext-ciphertext data and that can be applied to full PRINCE.
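
For context, the following recap of the PRINCE structure (recalled from the original PRINCE design and stated here from memory, so it should be checked against the specification) explains the roles of PRINCE_core and the $\alpha$ constant exploited by the attacks above: the cipher is an FX-style construction around PRINCE_core, and $\alpha$ links core decryption to core encryption under a related key:

\[
k = k_0 \| k_1, \qquad k_0' = (k_0 \ggg 1) \oplus (k_0 \gg 63),
\]
\[
\mathrm{PRINCE}_{k_0 \| k_1}(m) = k_0' \oplus \mathrm{PRINCE\_core}_{k_1}(m \oplus k_0),
\]
\[
\mathrm{PRINCE\_core}_{k_1}^{-1} = \mathrm{PRINCE\_core}_{k_1 \oplus \alpha},
\]

where $\ggg$ denotes a 64-bit rotation, $\gg$ a logical shift, and $\alpha$ is a fixed 64-bit constant.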



03:17 [Pub][ePrint] Publicly Verifiable Software Watermarking, by Aloni Cohen and Justin Holmgren and Vinod Vaikuntanathan

  Software Watermarking is the process of transforming a program into a functionally equivalent "marked" program in such a way that it is computationally hard to remove the mark without destroying functionality. Barak, Goldreich, Impagliazzo, Rudich, Sahai, Vadhan and Yang (CRYPTO 2001) defined software watermarking and showed that the existence of indistinguishability obfuscation implies that software watermarking is impossible. Given the recent candidate constructions of indistinguishability obfuscation, this result paints a bleak picture for the possibility of meaningful watermarking.

We show that slightly relaxing the functionality requirement gives us strong positive results for watermarking. Namely, instead of requiring the marked program to agree with the original unmarked program on all inputs, we require only that they agree on a large fraction of inputs. With this relaxation in mind, our contributions are as follows.

1. We define publicly verifiable watermarking where marking a program requires a secret key, but anyone can verify that a program is marked. The handful of existing watermarking schemes are secretly verifiable, and moreover, satisfy only a weak definition where the adversary is restricted in the type of unmarked programs it is allowed to produce (Naccache, Shamir and Stern, PKC 1999; Nishimaki, EUROCRYPT 2013). Moreover, our definition requires security against chosen program attacks, where an adversary has access to an oracle that marks programs of her choice.

2. We construct a publicly verifiable watermarking scheme for any family of puncturable pseudo-random functions (PPRF), assuming indistinguishability obfuscation and injective one-way functions.

We also give an indication of the limits of watermarking by showing that the existence of robust totally unobfuscatable families of functions rules out a general watermarking scheme for cryptographic functionalities such as signatures and MACs.
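
To make the secret-marking / public-verification split in item 1 concrete, a minimal interface sketch in Python might look as follows; the names, types, and structure are hypothetical illustrations of the definition, not the paper's construction.

    # Hypothetical interface for publicly verifiable watermarking (sketch only):
    # marking needs a secret key, verification uses a public key, and the
    # marked program must agree with the original on a large fraction of inputs.
    from typing import Callable, Tuple

    Program = Callable[[bytes], bytes]

    def setup(security_parameter: int) -> Tuple[bytes, bytes]:
        """Return (secret marking key, public verification key)."""
        raise NotImplementedError  # the paper builds this from iO and injective OWFs

    def mark(marking_key: bytes, program: Program) -> Program:
        """Produce a marked program approximately equivalent to `program`."""
        raise NotImplementedError

    def verify(verification_key: bytes, program: Program) -> bool:
        """Anyone holding the public key can check whether a program is marked."""
        raise NotImplementedError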



03:17 [Pub][ePrint] On the Impossibility of Tight Cryptographic Reductions, by Christoph Bader and Tibor Jager and Yong Li and Sven Schäge

  The existence of tight reductions in cryptographic security proofs is an important question, motivated by the theoretical search for cryptosystems whose security guarantees are truly independent of adversarial behavior and the practical necessity of concrete security bounds for the theoretically-sound selection of cryptographic parameters. At Eurocrypt 2002, Coron described a meta-reduction technique that allows one to prove the impossibility of tight reductions for certain digital signature schemes. This seminal result has found many further interesting applications. However, due to a technical subtlety in the argument, the applicability of this technique beyond digital signatures in the single-user setting has turned out to be rather limited.

We describe a new meta-reduction technique for proving such impossibility results, which improves on known ones in several ways. First, it enables interesting novel applications. This includes a formal proof that for certain cryptographic primitives (including public-key encryption/key encapsulation mechanisms and digital signatures), the security loss incurred when the primitive is transferred from an idealized single-user setting to the more realistic multi-user setting is impossible to avoid, as well as a lower tightness bound for non-interactive key exchange protocols. Second, the technique allows one to rule out tight reductions from a very general class of non-interactive complexity assumptions. Third, the provided bounds are quantitatively and qualitatively better, yet simpler, than the bounds derived from Coron's technique and its extensions.
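
For intuition, the single-user to multi-user security loss referred to above comes from the standard generic guessing reduction, which is not tight: the single-user adversary embeds its one challenge at a uniformly chosen user index and aborts if the multi-user adversary attacks a different user. Schematically (my paraphrase of the standard bound, not the paper's notation):

\[
\mathbf{Adv}^{n\text{-}\mathrm{user}}(\mathcal{A}) \;\le\; n \cdot \mathbf{Adv}^{1\text{-}\mathrm{user}}(\mathcal{B}),
\]

where $n$ is the number of users; the paper shows that, for the named primitives, this factor-$n$ loss is inherent.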





2015-04-23
15:17 [Pub][ePrint] Succinct Randomized Encodings and their Applications, by Nir Bitansky and Sanjam Garg and Huijia Lin and Rafael Pass and Sidharth Telang

  A {\em randomized encoding} allows one to express a ``complex'' computation, given by a function $f$ and input $x$, by a ``simple to compute'' randomized representation $\hat{f}(x)$ whose distribution encodes $f(x)$, while revealing nothing else regarding $f$ and $x$. Existing randomized encodings, geared mostly to allow encoding with low parallel complexity, have proven instrumental in various strong applications such as multiparty computation and parallel cryptography.
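
Spelled out (in a standard formulation, stated here for orientation rather than quoted from the paper), a randomized encoding $\hat{f}$ of $f$ must satisfy:

\[
\textbf{Correctness: } \exists\, \mathrm{Dec}\ \text{such that}\ \mathrm{Dec}\big(\hat{f}(x;r)\big) = f(x)\ \text{for all } x, r;
\]
\[
\textbf{Privacy: } \exists\, \text{efficient}\ \mathrm{Sim}\ \text{such that}\ \mathrm{Sim}\big(f(x)\big) \approx_c \hat{f}(x;r)\ \text{over uniform } r.
\]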

This work focuses on another natural complexity measure: {\em the time required to encode}. We construct {\em succinct randomized encodings} where the time to encode a computation, given by a program $\Pi$ and input $x$, is essentially independent of $\Pi$'s time complexity, and only depends on its space complexity, as well as the size of its input, output, and description. The scheme guarantees computational privacy of $(\Pi,x)$, and is based on indistinguishability obfuscation for a relatively simple circuit class, for which there exist instantiations based on polynomial hardness assumptions on multilinear maps.

We then invoke succinct randomized encodings to obtain several strong applications, including:

\begin{itemize}

\item Succinct indistinguishability obfuscation, where the obfuscated program $iO({\Pi})$ computes the same function as $\Pi$ for inputs $x$ of a priori bounded size. Obfuscating $\Pi$ is roughly as fast as encoding the computation of $\Pi$ on any such input $x$. Here we also require subexponentially-secure indistinguishability obfuscation for circuits.

\item Succinct functional encryption, where a functional decryption key corresponding to $\Pi$ allows decrypting $\Pi(x)$ from encryptions of any plaintext $x$ of a priori bounded size. Key derivation is as fast as encoding the corresponding computation.

\item Succinct reusable garbling, a stronger form of randomized encodings where any number of inputs $x$ can be encoded separately of $\Pi$, independently of $\Pi$'s time and space complexity.

\item Publicly verifiable 2-message delegation, where verifying the result of a long computation given by $\Pi$ and input $x$ is as fast as encoding the corresponding computation. We also show how to transform any 2-message delegation scheme into an essentially non-interactive system where the verifier message is reusable.

\end{itemize}

Previously, succinct randomized encodings or any of the above applications were only known based on various non-standard knowledge assumptions.

At the heart of our techniques is a generic method of compressing a piecemeal garbled computation, without revealing anything about the secret randomness utilized for garbling.