International Association for Cryptologic Research

IACR News Central

Get updates on changes to the IACR web page here. For questions, contact newsletter (at) iacr.org.


You can also access the full news archive.

Further sources for finding out about changes are CryptoDB, ePrint RSS, ePrint Web, and the event calendar (iCal).

2013-06-25
18:17 [Pub][ePrint] Computational Fuzzy Extractors, by Benjamin Fuller and Xianrui Meng and Leonid Reyzin

  Fuzzy extractors derive strong keys from noisy sources. Their security is defined information-theoretically, which limits the length of the derived key, sometimes making it too short to be useful. We ask whether it is possible to obtain longer keys by considering computational security, and show the following.

-Negative Result: Noise tolerance in fuzzy extractors is usually achieved using an information reconciliation component called a "secure sketch." The security of this component, which directly affects the length of the resulting key, is subject to lower bounds from coding theory. We show that, even when defined computationally, secure sketches are still subject to lower bounds from coding theory. Specifically, we consider two computational relaxations of the information-theoretic security requirement of secure sketches, using conditional HILL entropy and unpredictability entropy. For both cases we show that computational secure sketches cannot outperform the best information-theoretic secure sketches in the case of high-entropy Hamming metric sources.

-Positive Result: We show that the negative result can be overcome by analyzing computational fuzzy extractors directly. Namely, we show how to build a computational fuzzy extractor whose output key length equals the entropy of the source (this is impossible in the information-theoretic setting). Our construction is based on the hardness of the Learning with Errors (LWE) problem, and is secure when the noisy source is uniform or symbol-fixing (that is, each dimension is either uniform or fixed). As part of the security proof, we show a result of independent interest, namely that the decision version of LWE is secure even when a small number of dimensions has no error.
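
The "secure sketch" component at the center of the negative result can be illustrated with the classic information-theoretic code-offset construction (due to Dodis et al.), shown here in Python with a toy 3x repetition code. This is an illustrative sketch of the baseline the paper's lower bounds apply to, not the authors' LWE-based extractor; all names and parameters are ours:

```python
import secrets

def enc(bits):
    # 3x repetition code: each data bit becomes three identical bits
    return [b for b in bits for _ in range(3)]

def dec(bits):
    # majority vote per 3-bit block corrects one flipped bit per block
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def sketch(w):
    # code-offset sketch: SS(w) = w XOR C(r) for a random message r
    r = [secrets.randbelow(2) for _ in range(len(w) // 3)]
    return [wi ^ ci for wi, ci in zip(w, enc(r))]

def recover(w_prime, ss):
    # shift the noisy reading back near a codeword, decode, re-encode,
    # then undo the offset to get the originally enrolled reading back
    c = enc(dec([a ^ b for a, b in zip(w_prime, ss)]))
    return [ci ^ si for ci, si in zip(c, ss)]

w = [1, 1, 1, 0, 0, 0, 1, 1, 1]    # enrolled noisy reading
ss = sketch(w)                      # public helper data
w_noisy = w[:]
w_noisy[4] ^= 1                     # one bit flips at reproduction time
assert recover(w_noisy, ss) == w    # exact original recovered
```

The entropy loss of a code-offset sketch is the redundancy of the underlying code; the paper's negative result says that, for high-entropy Hamming-metric sources, moving to computational security cannot beat such coding-theoretic bounds, which is why the positive result bypasses sketches entirely.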



18:17 [Pub][ePrint] The Improved Cube Attack on Grain-v1, by Yongjuan Wang and Liren Ding and Wenbao Han and Xiangyu Wang

  The crucial problem in a cube attack is the selection of the cube set, which is also the most time-consuming step. This paper designs a new search algorithm that generates several linear equations from one cube set and applies the cube attack to a simplified version of the Grain-v1 algorithm. Our attack directly recovers 14 bits of the secret key when the number of initialization rounds in Grain-v1 is 75, and finds 5 linear expressions involving another 28 bits of the key.
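
To illustrate the core idea of a cube attack (this is a toy example, not the authors' algorithm or Grain-v1): XOR-summing a black-box output polynomial over all assignments of a chosen "cube" of public variables cancels every term that misses a cube variable, leaving a low-degree "superpoly" in the key bits. Here is a minimal sketch with a made-up polynomial f over GF(2):

```python
from itertools import product

def f(v, k):
    # Toy "cipher" output bit: a polynomial over GF(2) in public
    # variables v[0..2] and secret key bits k[0..1].
    return (v[0] & v[1] & k[0]) ^ (v[0] & v[1] & v[2]) ^ (v[1] & k[1]) ^ (k[0] & k[1])

def cube_sum(f, cube, n_pub, key, fixed=0):
    # XOR f over all 2^|cube| assignments of the cube variables,
    # with the remaining public variables held at `fixed`.
    acc = 0
    for bits in product((0, 1), repeat=len(cube)):
        v = [fixed] * n_pub
        for i, b in zip(cube, bits):
            v[i] = b
        acc ^= f(v, key)
    return acc

# Offline phase: over the cube {v0, v1} the superpoly of f is k0 ^ v2
# (terms lacking v0 or v1 appear an even number of times and cancel).
# Online phase: query the black box under the real key and read off k0.
secret = (1, 0)
k0 = cube_sum(f, cube=(0, 1), n_pub=3, key=secret)  # v2 fixed to 0
assert k0 == secret[0]
```

Grain-v1's output bit is such a polynomial in IV and key bits; the expensive search the paper targets is finding cubes whose superpolys come out linear in the key.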



18:17 [Pub][ePrint] Unconditional Tightness Bounds for Generic Reductions: The Exact Security of Schnorr Signatures, Revisited, by Nils Fleischhacker and Tibor Jager and Dominique Schröder

  A long line of research investigates the existence of tight security reductions for the Schnorr signature scheme. Most of these works presented lower tightness bounds; most recently, Seurin (Eurocrypt 2012) showed that under certain assumptions the non-tight security proof for Schnorr signatures by Pointcheval and Stern (Eurocrypt 1996) is essentially optimal. All previous works in this direction share the same restrictions: the results hold only under the interactive one-more discrete logarithm assumption, they consider only algebraic reductions, and they rule out only tight reductions from the (one-more) discrete logarithm problem. The existence of a tight reduction from weaker computational problems, like CDH or DDH, remained open.

In this paper we introduce a new meta-reduction technique, which allows us to prove lower bounds for the large and very natural class of generic reductions. A generic reduction is independent of a particular representation of group elements. Most reductions in state-of-the-art security proofs have this desirable property. This new approach allows us to show unconditionally that there is no tight generic reduction from any natural computational problem Π defined over algebraic groups (including even interactive problems) to breaking Schnorr signatures, unless solving Π is easy.
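
For readers unfamiliar with the scheme under discussion, here is a minimal textbook Schnorr signature sketch in Python over a small safe-prime subgroup. All parameters and helper names are illustrative (and the group far too small for real security); this is not code or parameters from the paper:

```python
import hashlib
import secrets

def is_prime(n, bases=(2, 3, 5, 7, 11, 13, 17)):
    # Deterministic Miller-Rabin, valid for n < 3.3 * 10^14
    if n < 2:
        return False
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in bases:
        if a % n == 0:
            continue
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def make_group(start=1 << 32):
    # Find a safe prime p = 2q + 1; then g = 4 (a quadratic residue
    # other than 1) generates the subgroup of prime order q.
    q = start | 1
    while not (is_prime(q) and is_prime(2 * q + 1)):
        q += 2
    return 2 * q + 1, q, 4

def H(q, R, msg):
    h = hashlib.sha256(str(R).encode() + b"|" + msg).digest()
    return int.from_bytes(h, "big") % q

def keygen(p, q, g):
    x = secrets.randbelow(q - 1) + 1      # secret key
    return x, pow(g, x, p)                # (x, public key y = g^x)

def sign(p, q, g, x, msg):
    k = secrets.randbelow(q - 1) + 1      # per-signature nonce
    e = H(q, pow(g, k, p), msg)
    return e, (k + e * x) % q             # (e, s = k + e*x mod q)

def verify(p, q, g, y, msg, sig):
    e, s = sig
    # R' = g^s * y^(-e) mod p equals g^k iff the signature is valid
    R = pow(g, s, p) * pow(y, -e, p) % p
    return e == H(q, R, msg)
```

The `pow(y, -e, p)` modular inverse requires Python 3.8+. A tight reduction, in this vocabulary, would turn a forger for this scheme into a solver for the underlying problem with essentially no loss; the paper rules that out for generic reductions from any natural group problem.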



16:43 [Job][New] Assistant Professor (tenure track), Technische Universiteit Eindhoven

  We are looking for a candidate who meets the following requirements:

  • A PhD degree in Mathematics or Computer Science;

  • Research experience in coding theory or cryptography;

  • Outstanding research achievements and promise for the future;

  • Excellent track record of international publications in leading journals and high-ranked conferences;

  • High potential for the acquisition of external research funds;

  • Readiness to supervise PhD projects;

  • Teaching experience and good teaching skills;

  • Good English speaking and writing skills, and a willingness to learn Dutch (all Master's and some Bachelor courses are given in English);

  • Basic Teaching Qualification (BKO): if the candidate in question is not in possession of a BKO certificate, he or she is required to meet this requirement within a maximum period of three years.

    The Department of Mathematics and Computer Science of the Eindhoven University of Technology (TU/e) has a vacancy for a Tenure Track Assistant Professor position for five years in the Coding and Crypto group (section Discrete Mathematics, DM).

    The tenure-track nature of the position will be as follows. The successful candidate will first be appointed for a fixed period of five years. Before the start of the contract, the department and the candidate negotiate a list of conditions for successful conversion. If the candidate meets these conditions at the end of the five years, the position becomes permanent; if not, the temporary position is not continued.

16:36 [Event][New] PETShop'13: Workshop on Language Support for Privacy Enhancing Technologies

  Submission: 4 August 2013
Notification: 23 August 2013
From November 4 to November 4
Location: Berlin, Germany
More Information: http://forsyte.at/petshop-2013/


12:35 [Event][New] RISC '13: The 5th International Workshop on RFID/IoT Security and Cryptography

  Submission: 13 September 2013
Notification: 11 October 2013
From December 9 to December 11
Location: London, UK
More Information: http://www.icitst.org/Workshops.html




2013-06-24
16:47 [Event][New] SEC@SAC'14: 13th Computer Security track at the 29th ACM Symposium on Applied Computing

  Submission: 13 September 2013
Notification: 15 November 2013
From March 24 to March 28
Location: Gyeongju, Korea
More Information: http://www.dmi.unict.it/~giamp/sac/cfp2014.php


09:17 [Forum] [IACR Publication Reform] Re: two-stage review process by cbw

  We don't have space limitations anymore - we can accept any good paper from now on :-) Best, Christopher

Orr wrote: > you are susceptible to variance (Y should be O(X), IMHO) in the number of accepted papers, or some decent papers get rejected due to lack of space.

From: 2013-24-06 09:11:00 (UTC)



2013-06-23
21:17 [Forum] [IACR Publication Reform] Re: two-stage review process by Orr

  But in this case, if you accept X papers, you need to pass to Stage 2 X+Y papers (assuming some will fail the testing), and then you are susceptible to variance (Y should be O(X), IMHO) in the number of accepted papers, or some decent papers get rejected due to lack of space. In addition, why not just separate the submission into two parts: the abstract, and the rest. Then, interested committee members could immediately check the details if they wish. This is, btw, what happens now, but with a shorter abstract. From: 2013-23-06 18:32:14 (UTC)

15:17 [Forum] [IACR Publication Reform] Re: Testable change by cbw

  Thanks for the insight! My question would be a different one though: Does rebuttal/rebattle [1] change something from the perspective of the /reviewers/: Do you write your review more carefully (it could be questioned) or not? If the first happens, I guess it's worth the overhead. If no, you are right. In this case, it's not worth the bother. Best, Christopher [1] I do prefer the second term ;-) From: 2013-23-06 14:10:09 (UTC)

12:17 [Forum] [IACR Publication Reform] two-stage review process by Joan Daemen

  Dear all,

Here is a proposal that aims at reducing review workload. The idea is to split the review of a paper into two stages. It implies that each paper has two parts:

- an abstract aimed at the non-specialized reader, clearly stating the contribution of the paper (selling it, actually). The abstract should probably be longer than what we have now (say a 2-page limit);

- a technical part that will typically be more specialized.

Stage 1: the paper is reviewed by a relatively large number of people from different sub-disciplines, based on the abstract only. Reviewers should assume that the technical part will deliver what the abstract announces. If the paper survives this phase, it proceeds to stage 2.

Stage 2: a few specialized reviewers check in detail whether the technical part delivers on the promise made in the abstract. If so, the paper is accepted. This includes verification of proofs, claimed attack complexity, etc.

At least in theory this may reduce the workload, as most reviewers only have to read the abstract. Forcing the authors to write an abstract aimed at a wider audience has the additional benefit that papers may become more accessible to people working in other sub-disciplines.

I realise that whether this really works depends on how it is implemented. For example, something must be built in against overselling. This could be done by having a system with (negative) points, where each co-author gets a point when his paper passes stage 1 but not stage 2, and these points are somehow taken into account in stage 1. And of course there are many other details that may make this a success or a failure. But let's first see if there is support for the basic idea in the first place.

Joan

From: 2013-23-06 11:06:21 (UTC)