International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.

You can also access the full news archive.

Further sources for learning about changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2015-04-29
00:17 [Pub][ePrint] Financial Cryptography: Algorithmic Mechanisms for a Hedonic Game, by Sumit Chakraborty

  A selling agent (or a group of selling agents) wants to allocate and sell a parcel of land (or a set of parcels) optimally and fairly to a buying agent, within the capacity constraint of the selling agent and the budget constraint of the buying agent. This problem is solved by combining concepts from algorithmic cooperative game theory and financial cryptography. The approach lets a group of decision-making agents reach a mutually beneficial agreement through compromise and stable matching of preferences. The work presents a cooperative game and a set of algorithmic coordination mechanisms: SBSS, SBMS (for collective and non-collective bargaining in the holdout problem), and MBSS. The game is characterized by a set of agents, inputs, strategic moves, a revelation principle, a payment function, and outputs. The coordination mechanisms are designed around domain planning, rational fair data exchange, and compensation negotiation. These mechanisms preserve the privacy of strategic data through secure multi-party computation (SMC), specifically by solving Yao's millionaires problem. The mechanisms are analyzed from the perspectives of the revelation principle, computational intelligence, and communication complexity. The communication complexity depends on the time constraints of the negotiating agents, their information state, and the number of negotiation issues; the computational complexity depends on the valuation of the pricing plan, compensation estimation, and private comparison. It is a mixed-strategy game; both sequential and simultaneous moves can be applied intelligently to search a neighborhood space of core solutions.



00:17 [Pub][ePrint] Speed Records for Ideal Lattice-Based Cryptography on AVR, by Thomas Pöppelmann and Tobias Oder and Tim Güneysu

  Over the last years, lattice-based cryptography has received much attention due to versatile average-case problems like Ring-LWE or Ring-SIS that appear to be intractable for quantum computers. But despite promising constructions, few results have been published on implementation issues for very constrained platforms. In this work we therefore study and compare implementations of Ring-LWE encryption and the Bimodal Lattice Signature Scheme (BLISS) on an 8-bit Atmel ATxmega128 microcontroller. Since the number-theoretic transform (NTT) is one of the core components of implementations of lattice-based cryptosystems, we review the application of the NTT in previous implementations and present an improved approach that significantly lowers the runtime for polynomial multiplication. Our implementation of Ring-LWE encryption takes 41 ms for encryption and 12 ms for decryption. Our software takes 316 ms to compute a BLISS signature and 111 ms to verify one. These results outperform implementations on similar platforms and underline the feasibility of lattice-based cryptography on constrained devices.
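
As a concrete illustration of the NTT-based polynomial multiplication discussed above, here is a minimal Python sketch over Z_q[x]/(x^n + 1). The toy parameters (n = 4, q = 17, psi = 2) are chosen for readability only and are not the paper's, which targets an 8-bit AVR microcontroller.

    # Toy negacyclic NTT multiplication in Z_q[x]/(x^n + 1).
    # psi = 2 is a primitive 2n-th root of unity mod q (2^4 = -1 mod 17),
    # so omega = psi^2 is a primitive n-th root of unity.
    Q, N, PSI = 17, 4, 2
    OMEGA = pow(PSI, 2, Q)

    def ntt(a, omega, q):
        # Recursive radix-2 Cooley-Tukey transform (cyclic DFT over Z_q).
        n = len(a)
        if n == 1:
            return list(a)
        even = ntt(a[0::2], omega * omega % q, q)
        odd = ntt(a[1::2], omega * omega % q, q)
        out, w = [0] * n, 1
        for k in range(n // 2):
            t = w * odd[k] % q
            out[k] = (even[k] + t) % q
            out[k + n // 2] = (even[k] - t) % q
            w = w * omega % q
        return out

    def negacyclic_mul(f, g):
        # Twist by powers of psi so that a cyclic convolution computes the
        # negacyclic product; transform, multiply pointwise, invert.
        fp = [f[i] * pow(PSI, i, Q) % Q for i in range(N)]
        gp = [g[i] * pow(PSI, i, Q) % Q for i in range(N)]
        H = [x * y % Q for x, y in zip(ntt(fp, OMEGA, Q), ntt(gp, OMEGA, Q))]
        inv_n, inv_omega = pow(N, Q - 2, Q), pow(OMEGA, Q - 2, Q)
        h = [c * inv_n % Q for c in ntt(H, inv_omega, Q)]
        inv_psi = pow(PSI, Q - 2, Q)
        return [h[i] * pow(inv_psi, i, Q) % Q for i in range(N)]

    def schoolbook(f, g):
        # Reference O(n^2) multiplication modulo x^n + 1.
        res = [0] * N
        for i in range(N):
            for j in range(N):
                sign = 1 if i + j < N else -1
                res[(i + j) % N] = (res[(i + j) % N] + sign * f[i] * g[j]) % Q
        return res

    assert negacyclic_mul([1, 2, 3, 4], [5, 6, 7, 8]) == schoolbook([1, 2, 3, 4], [5, 6, 7, 8])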



00:17 [Pub][ePrint] Impossibility of VBB Obfuscation with Ideal Constant-Degree Graded Encodings, by Rafael Pass and abhi shelat

  A celebrated result by Barak et al. (JACM '12) shows the impossibility of general-purpose virtual black-box (VBB) obfuscation in the plain model. A recent work by Canetti, Kalai, and Paneth (TCC '15) extends this result to the random oracle model (assuming trapdoor permutations).

In contrast, Brakerski-Rothblum (TCC '15) and Barak et al. (EuroCrypt '14) show that in idealized graded encoding models, general-purpose VBB obfuscation is indeed possible; these constructions require graded encoding schemes that enable evaluating high-degree (polynomial in the size of the circuit to be obfuscated) polynomials on encodings.

We show a complementary impossibility of general-purpose VBB obfuscation in idealized graded encoding models that enable only evaluation of constant-degree polynomials (assuming trapdoor permutations).



00:17 [Pub][ePrint] Condensed Unpredictability, by Maciej Skorski and Alexander Golovnev and Krzysztof Pietrzak

  We consider the task of deriving a key with high HILL entropy (i.e., one that is computationally indistinguishable from a key with high min-entropy) from an unpredictable source. Prior to this work, the only known way to transform unpredictability into a key that is $\epsilon$-indistinguishable from having min-entropy was via pseudorandomness, for example via Goldreich-Levin (GL) hardcore bits. This approach has the inherent limitation that from a source with $k$ bits of unpredictability entropy one can derive a key of length (and thus HILL entropy) at most $k-2\log(1/\epsilon)$ bits. In many settings, e.g. when dealing with biometric data, such a $2\log(1/\epsilon)$ bit entropy loss is not an option.

Our main technical contribution is a theorem stating that in the high-entropy regime, unpredictability implies HILL entropy. Concretely, any variable $K$ with $|K|-d$ bits of unpredictability entropy has the same amount of so-called metric entropy (against real-valued, deterministic distinguishers), which is known to imply the same amount of HILL entropy. The loss in circuit size in this argument is exponential in the entropy gap $d$, so the result only applies for small $d$ (i.e., where the size of the distinguishers considered is exponential in $d$).

To overcome this restriction, we investigate whether it is possible to first "condense" unpredictability entropy and thereby make the entropy gap small. We show that any source with $k$ bits of unpredictability can be condensed into a source of length $k$ with $k-3$ bits of unpredictability entropy. Our condenser simply "abuses" the GL construction and derives a $k$-bit key from a source with $k$ bits of unpredictability. The original GL theorem implies nothing when extracting that many bits, but we show that in this regime GL still behaves like a "condenser" for unpredictability.

This result comes with two caveats: (1) the loss in circuit size is exponential in $k$, and (2) we require that the source we start with has *no* HILL entropy (equivalently, one can efficiently check whether a guess is correct). We leave it as an intriguing open problem to overcome these restrictions or to prove that they are inherent.
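
To make the GL construction above concrete, here is a small Python sketch (my illustration, not the authors' code) of deriving a key as Goldreich-Levin inner products of the source with public random vectors; taking the number of derived bits equal to the source length $k$ is exactly the "abuse" regime the condenser result analyzes.

    import secrets

    def gl_key(x_bits, k):
        # Derive a k-bit key from the source x as GL hardcore bits:
        # key[i] = <x, r_i> mod 2 for public uniformly random vectors r_i.
        n = len(x_bits)
        seeds = [[secrets.randbelow(2) for _ in range(n)] for _ in range(k)]
        key = [sum(xb & rb for xb, rb in zip(x_bits, r)) % 2 for r in seeds]
        return key, seeds  # seeds are public, key is secret

    # Extracting k = |x| bits: beyond what the original GL theorem covers,
    # but per the paper the output still has k-3 bits of unpredictability.
    source = [secrets.randbelow(2) for _ in range(128)]
    key, seeds = gl_key(source, 128)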



00:17 [Pub][ePrint] Feasibility and Completeness of Cryptographic Tasks in the Quantum World, by Serge Fehr and Jonathan Katz and Fang Song and Hong-Sheng Zhou and Vassilis Zikas

  It is known that cryptographic feasibility results can change by moving from the classical to the quantum world. With this in mind, we study the feasibility of realizing functionalities in the framework of universal composability, with respect to both computational and information-theoretic security. With respect to computational security, we show that existing feasibility results carry over unchanged from the classical to the quantum world; a functionality is "trivial" (i.e., can be realized without setup) in the quantum world if and only if it is trivial in the classical world. The same holds with regard to functionalities that are complete (i.e., can be used to realize arbitrary other functionalities).

In the information-theoretic setting, the quantum and classical worlds differ. In the quantum world, functionalities in the class we consider are either complete, trivial, or belong to a family of simultaneous-exchange functionalities (e.g., XOR). However, other results in the information-theoretic setting remain roughly unchanged.



00:17 [Pub][ePrint] Privately Evaluating Decision Trees and Random Forests, by David J. Wu and Tony Feng and Michael Naehrig and Kristin Lauter

  Decision trees and random forests are widely used classifiers. In this paper, we develop two protocols for privately evaluating decision trees and random forests. We operate in the standard two-party setting where the server holds a model (either a tree or a forest) and the client holds an input (a feature vector). At the conclusion of the protocol, the client learns only the model's output on its input and a few generic parameters concerning the model; the server learns nothing. The first protocol we develop provides security against semi-honest adversaries. Next, we show an extension of the semi-honest protocol that obtains one-sided security against malicious adversaries. We implement both protocols and show that both variants are able to process trees with several hundred decision nodes in just a few seconds and with a modest amount of bandwidth. Compared to previous semi-honest protocols for private decision tree evaluation, we demonstrate tenfold improvements in computation and bandwidth.
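
For orientation, here is the plaintext skeleton of the computation being protected; this toy contains no cryptography at all. In the protocols described above, each threshold comparison would be replaced by a secure comparison subprotocol so that the server learns nothing and the client learns only the resulting label.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        feature: int = 0                 # index into the client's feature vector
        threshold: float = 0.0
        left: Optional["Node"] = None    # taken when x[feature] <= threshold
        right: Optional["Node"] = None
        label: Optional[int] = None      # set on leaves only

    def evaluate(node: Node, x: list) -> int:
        # Plaintext tree evaluation; privately, each comparison is joint work.
        if node.label is not None:
            return node.label
        nxt = node.left if x[node.feature] <= node.threshold else node.right
        return evaluate(nxt, x)

    # Server's model (a depth-2 tree) and client's feature vector.
    tree = Node(feature=0, threshold=5.0,
                left=Node(label=0),
                right=Node(feature=1, threshold=2.5,
                           left=Node(label=1), right=Node(label=2)))
    assert evaluate(tree, [7.0, 1.0]) == 1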





2015-04-28
14:31 [Job][New] PhD student, University College London

  Applications are invited for a PhD position in the field of cryptography in the Information Security Group at the UCL Department of Computer Science, to be supervised by Dr. Sarah Meiklejohn. The position is funded partially by Microsoft Research (MSR) Cambridge and will be co-supervised by Dr. Markulf Kohlweiss, an MSR researcher.

The successful applicant will study the topic of controlled malleability in non-interactive zero-knowledge proofs, which has already been demonstrated to provide useful constructions such as compact verifiable shuffles, delegatable anonymous credentials, and more. The aim of this project is to improve on existing methods for constructing controlled-malleable primitives, both in terms of efficiency and in terms of the variety of cryptographic operations that can be performed, as well as to improve their usability and interoperability with existing cryptographic primitives.

We expect candidates to have a strong degree in Computer Science, Mathematics, or a related discipline at the MSc level. A good mathematics background and a willingness to become fluent in modern cryptographic constructions are necessary. The studentship is open to all, but fully covers only UK/EU fees.

05:20 [Event][New] SSR 2015: Security Standardisation Research 2015

  Submission: 26 June 2015
Notification: 10 September 2015
Dates: December 15 to December 16, 2015
Location: Tokyo, Japan
More Information: http://ssr2015.com


00:17 [Pub][ePrint] Cluster Computing in Zero Knowledge, by Alessandro Chiesa and Eran Tromer and Madars Virza

  Large computations, when amenable to distributed parallel execution, are often executed on computer clusters for scalability and cost reasons. Such computations are used in many applications including, to name but a few, machine learning, webgraph mining, and statistical machine translation. Oftentimes, though, the input data is private and only the result of the computation can be published. Zero-knowledge proofs would allow one, in such settings, to verify the correctness of the output without leaking (additional) information about the input.

In this work, we investigate theoretical and practical aspects of *zero-knowledge proofs for cluster computations*. We design, build, and evaluate zero-knowledge proof systems for which:

(i) a proof attests to the correct execution of a cluster computation; and

(ii) generating the proof is itself a cluster computation that is similar in structure and complexity to the original one.

Concretely, we focus on MapReduce, an elegant and popular form of cluster computing.

Previous zero-knowledge proof systems can in principle prove a MapReduce computation's correctness, via a monolithic NP statement that reasons about all mappers, all reducers, and shuffling. However, it is not clear how to generate the proof for such monolithic statements via parallel execution by a distributed system. Our work demonstrates, by theory and implementation, that proof generation can be similar in structure and complexity to the original cluster computation.

Our main technique is a bootstrapping theorem for succinct non-interactive arguments of knowledge (SNARKs) that shows how, via recursive proof composition and Proof-Carrying Data, it is possible to transform any SNARK into a *distributed SNARK for MapReduce* which proves, piecewise and in a distributed way, the correctness of every step in the original MapReduce computation as well as their global consistency.
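
To fix ideas about the structure the distributed SNARK mirrors, here is a minimal Python sketch of MapReduce (map, shuffle, reduce) on a word-count example; per the theorem above, the proof system proves each mapper, each reducer, and the consistency of the shuffle piecewise rather than as one monolithic statement.

    from collections import defaultdict

    def map_phase(records, mapper):
        # Mappers run independently, emitting (key, value) pairs.
        return [kv for rec in records for kv in mapper(rec)]

    def shuffle(pairs):
        # Group values by key; the global consistency of this routing is
        # what the distributed SNARK must also attest to.
        groups = defaultdict(list)
        for k, v in pairs:
            groups[k].append(v)
        return groups

    def reduce_phase(groups, reducer):
        # Reducers run independently, one per key group.
        return {k: reducer(vs) for k, vs in groups.items()}

    docs = ["a b a", "b c"]
    pairs = map_phase(docs, lambda d: [(w, 1) for w in d.split()])
    counts = reduce_phase(shuffle(pairs), sum)
    assert counts == {"a": 2, "b": 2, "c": 1}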





2015-04-24
21:17 [Pub][ePrint] Cryptography from Post-Quantum Assumptions, by Raza Ali Kazmi

  In this thesis we present our contributions to the field of post-quantum cryptography. We introduce a new notion of *weakly Random-Self-Reducible* public-key cryptosystem and show how it can be used to implement secure Oblivious Transfer. We also show that two recent (post-quantum) cryptosystems can be considered *weakly Random-Self-Reducible*. We introduce a new problem, called the Isometric Lattice Problem, and reduce graph isomorphism and linear code equivalence to it. We also show that this problem has a perfect zero-knowledge interactive proof with respect to a malicious verifier; this is the only hard problem in lattices known to have this property.
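
As background for the zero-knowledge claim: the reduction from graph isomorphism recalls the classic perfect zero-knowledge proof for graph isomorphism, one round of which is sketched below in Python. This is textbook material (Goldreich-Micali-Wigderson), not the thesis's protocol for the Isometric Lattice Problem.

    import random

    def apply_perm(edges, pi):
        # Relabel an undirected edge list by the vertex permutation pi.
        return frozenset(frozenset((pi[u], pi[v])) for u, v in edges)

    def zk_round(G, H, pi, n):
        # One honest round proving G ~ H; the prover's secret pi
        # satisfies H = pi(G).
        sigma = list(range(n))
        random.shuffle(sigma)            # fresh random mask
        K = apply_perm(H, sigma)         # prover commits to a random copy of H
        b = random.randrange(2)          # verifier's public coin
        if b == 0:
            return K == apply_perm(H, sigma)       # open K as sigma(H)
        tau = [sigma[pi[v]] for v in range(n)]     # compose sigma with pi
        return K == apply_perm(G, tau)             # open K as (sigma . pi)(G)

    G = [(0, 1), (1, 2), (2, 3), (3, 0)]
    pi = [2, 0, 3, 1]                    # secret isomorphism
    H = [(pi[u], pi[v]) for u, v in G]
    assert all(zk_round(G, H, pi, 4) for _ in range(32))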



18:17 [Pub][ePrint] Bounds on surmising remixed keys, by Daniel R. L. Brown

  A remixed key is derived from a secret source key by applying a public but unpredictable random function to the source key. A remixed key models a key derived from a shared secret and a public unpredictable salt using a common, deterministic, pseudorandom function, which is somewhat like TLS record-layer keys.

This report tries to validate the intuition that remixed keys are not easy to surmise, in other words, that remixing does not introduce an exploitable spike in the probability distribution of the remixed key. The report provides pencil-and-paper proofs of numerical bounds on the probability that an adversary can surmise a remixed key, assuming a uniformly random source key and remix function. The proofs are derived from a proof of an asymptotic result in probability theory from a textbook by Shoup.
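
A concrete instance of this model (my illustration; the report analyzes an idealized uniformly random remix function) derives the remixed key by applying a deterministic PRF, keyed with the secret source key, to the public salt, much as TLS-style key derivation does.

    import hashlib
    import hmac
    import os

    def remix(source_key: bytes, salt: bytes) -> bytes:
        # The public, unpredictable salt goes through a common deterministic
        # PRF keyed by the secret source key; HMAC-SHA-256 stands in for the
        # idealized random remix function analyzed in the report.
        return hmac.new(source_key, salt, hashlib.sha256).digest()

    source_key = os.urandom(32)   # uniformly random source key (shared secret)
    salt = os.urandom(16)         # public but unpredictable salt
    remixed = remix(source_key, salt)
    assert remix(source_key, salt) == remixed   # deterministic per (key, salt)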