This thesis makes a humble attempt to bridge the well-known gap between the use of cryptography in practice and the theory, or lack thereof, behind it. Concentrating on digital signature schemes, we endeavour to fill in the gaps and, as far as possible, join the two edges of this chasm. To this end, we start from primitives that lie close to, if not at, one edge of the spectrum and push ourselves closer and closer to the other end.
For our first leap, we start from the side of practice. Our starting point is one of the most widely used and implemented signature schemes, RSA-FDH. Despite its wide acceptance, RSA-FDH has a loose security proof and is not as theoretically secure as it is assumed to be in practice. Here we have found our first gap to bridge. We present a tight security proof for RSA-FDH that meets the security expectations that have, until now, been assumed by practitioners.
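To fix ideas, a minimal sketch of RSA-FDH signing and verification is shown below. The parameters are toy values chosen for illustration only, and the full-domain hash is approximated by reducing a SHA-256 digest modulo N; a real implementation needs cryptographically sized parameters and a proper full-domain hash construction.

```python
# Toy sketch of RSA-FDH: sigma = H(m)^d mod N, verified via sigma^e == H(m) mod N.
# Parameters are tiny and NOT secure; for illustration only.
import hashlib

p, q = 61, 53
N = p * q                            # modulus (toy size)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def fdh(msg: bytes) -> int:
    """Approximate full-domain hash into Z_N (toy construction)."""
    digest = hashlib.sha256(msg).digest()
    return int.from_bytes(digest, "big") % N

def sign(msg: bytes) -> int:
    return pow(fdh(msg), d, N)       # sigma = H(m)^d mod N

def verify(msg: bytes, sigma: int) -> bool:
    return pow(sigma, e, N) == fdh(msg)

sig = sign(b"hello")
assert verify(b"hello", sig)
assert not verify(b"tampered", sig)
```

The tightness question the thesis addresses concerns the security reduction for exactly this scheme: how much security of the RSA assumption is lost when proving unforgeability of the signature.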
The next step sees us starting from the side of theory, looking towards practice. Despite our satisfactory results vis à vis RSA-FDH, they are in the Random Oracle Model, which is shaky ground at best. With this in mind, we look towards building tightly secure signatures in the standard model. Such signatures were known previously, but only from a limited number of assumptions. We examined these schemes more closely and were able to exhibit a generic framework that was implicitly used to construct all of them. Utilising this framework, we are able to construct tightly secure signatures from a multitude of assumptions. Sadly, our signatures fall a few hand spans short and are just gripping the edge of practicality.
The final hop we make does not take us all the way from theory to practice, but some headway is gained. This last step can be seen not only as a bridge between theory and practice, but also between our first two results. Recent results have shown that, using obfuscation, one can prove RSA-FD[...]
Our result provides a new pathway to iO. For example, by combining our result with the FE scheme of Garg et al. [ePrint 2014/666], we obtain a new construction of iO based on the sub-exponential GGHZ assumption over composite-order multilinear maps.
We also identify a "simple" function family for FE that suffices for our general result. We show that the function family F is complete, where every f in F consists of three evaluations of a weak PRF followed by finite operations. We believe that this may be useful for realizing iO from weaker assumptions in the future.
These new curves were selected for their good performance and security properties.
Cryptosystems based on elliptic curves in embedded devices can be vulnerable to Side-Channel Attacks (SCA), such as the Simple Power Analysis (SPA) or the Differential Power Analysis (DPA).
In this paper, we analyze the existence, for Edwards curves, of special points whose use in SCA is known as Same Value Analysis (SVA). These special points show up as internal collisions under power analysis. Our results indicate that no Edwards curve is safe from such attacks.
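For background, the unified Edwards addition law can be sketched as below; SVA-style attacks look for input points that force equal intermediate values (such as the shared product d·x1·x2·y1·y2), which then collide in the power trace. The toy curve parameters here are illustrative and not taken from the paper.

```python
# Unified Edwards addition over a toy field: x^2 + y^2 = 1 + d*x^2*y^2.
# With d a non-square mod p the curve is complete, so denominators never vanish.
p = 13   # toy prime field, illustration only
d = 2    # non-square mod 13

def add(P, Q):
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p                  # shared intermediate value
    x3 = (x1 * y2 + x2 * y1) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, p) % p
    return (x3, y3)

P = (1, 0)           # point of order 4; the neutral element is (0, 1)
assert add(P, P) == (0, 12)          # 2P = (0, -1)
assert add((0, 12), (0, 12)) == (0, 1)  # 4P = neutral element
```

The same formula handles doubling and general addition, which is precisely why Edwards curves were believed SPA-friendly; the paper's point is that value collisions inside these formulas still leak.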
was released. This algorithm has several features useful for hardware and software implementations, namely simple ARX operations, a non-S-box architecture, and a 32-bit word size. These features enable practical, high-performance, low-overhead implementations on several platforms. In this paper, we further improve 128-, 192- and 256-bit LEA encryption for low-end embedded processors. First, we present speed-optimization methods: we split each 32-bit word operation into four byte-wise operations and avoid several rotation operations by taking advantage of efficient byte-wise rotations. Second, we reduce the code size to a minimum. We find the minimal inner loops and optimize them at the instruction-set level; we then construct the whole algorithm in a partly unrolled fashion with reasonable speed. Finally, we achieve the fastest LEA implementations to date, improving performance by 10.9% over the previous best known results. For size optimization, our implementation occupies only 280 bytes for LEA encryption. After scaling, our implementation is the smallest ARX implementation so far, compared with other state-of-the-art ARX block ciphers such as SPECK and
The key size of this scheme and the complexity of enciphering/deciphering become small enough to handle.
In this work, we consider two very natural extensions of secret sharing. In the first, which we call distributed secret sharing, there is no trusted dealer at all, and instead the role of the dealer is distributed amongst the parties themselves. Distributed secret sharing can be thought of as combining the features of multiparty non-interactive key exchange and standard secret sharing, and may be useful in settings where the secret is so sensitive that no one individual dealer can be trusted with the secret. Our second notion is called functional secret sharing, which incorporates some of the features of functional encryption into secret sharing by providing more fine-grained access to the secret. Qualified subsets of parties do not learn the secret, but instead learn some function applied to the secret, with each set of parties potentially learning a different function.
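Both extensions build on standard secret sharing, which for reference can be sketched as a minimal Shamir (t, n) scheme over a small prime field. The dealer role that distributed secret sharing removes is exactly the `share` function below; the field size and parameters are illustrative.

```python
# Minimal Shamir (t, n) secret sharing: the secret is the constant term of a
# random degree-(t-1) polynomial; any t shares recover it by interpolation.
import random

P = 2**13 - 1  # 8191, a small prime; illustration only

def share(secret, t, n):
    """Trusted-dealer step: split secret into n shares, threshold t."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234
```

Distributed secret sharing asks that no single party ever run `share` on its own, while functional secret sharing asks that qualified sets recover f(secret) rather than the secret itself.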
Our main result is that both of the extensions above are equivalent to several recent cutting-edge primitives. In particular, general-purpose distributed secret sharing is equivalent to witness PRFs, and general-purpose functional secret sharing is equivalent to indistinguishability obfuscation. Thus, our work shows that it is possible to view some of the recent developments in cryptography through a secret sharing lens, yielding new insights about both these cutting-edge primitives and secret sharing.
In this paper, we apply the list decoding method to solve the search version of LWE. Our algorithm runs in probabilistic polynomial time and yields specific security estimates for a large range of parameters. To our knowledge, this is the first application of the list decoding method to recovering the key of LWE.
Our algorithm improves on Laine and Lauter's result.