*10:17* [Pub][ePrint]
Studying Potential Side Channel Leakages on an Embedded Biometric Comparison System, by Maël Berthier and Yves Bocktaels and Julien Bringer and Hervé Chabanne and Taoufik Chouta and Jean-Luc Danger
We study in this work the potential side-channel leakages of a hardware biometric comparison system designed for fingerprints. An embedded biometric comparison system aims to compare stored biometric data with freshly acquired data without the need to send the stored data outside the system. Here one may try to retrieve the stored data via side channels, much as one may try to exploit side channels to attack embedded cryptographic modules.

On the one hand, we show that simple side-channel analysis can reveal partial information that may help retrieve the stored fingerprint. On the other hand, we illustrate that reconstructing the fingerprint remains non-trivial, and we give some simple countermeasures to further protect the comparison algorithm.

*19:17* [Pub][ePrint]
(De-)Constructing TLS, by Markulf Kohlweiss and Ueli Maurer and Cristina Onete and Bjoern Tackmann and Daniele Venturi
One of the most important applications of cryptography is the establishment of secure communication channels between two entities (e.g. a client and a server), and the protocol most widely used for this purpose is TLS. A key goal of research in cryptography is to provide security proofs for cryptographic protocols. This task is particularly difficult if the protocol under consideration was not designed with provable security in mind, as is the case for TLS.

Results on provable security differ with respect to (1) the assumptions made and (2) the statement that is proved to follow from the assumptions. It is important that the proved statement is of a form that allows both for comparisons of protocol performance and for direct use in the proof of a higher-level protocol. Security statements should thus be exact (as opposed to asymptotic), giving precise upper bounds for the security level guaranteed by a protocol. Furthermore, a key to analyzing and designing cryptographic protocols is a modularization in which the role of each cryptographic primitive (e.g. encryption) or mechanism (e.g. nonce exchange) is made explicit, and the security of its application is proved in isolation, once and for all. The constructive cryptography framework provides a sound instantiation of this approach. A modular step constructs a specific resource from certain (assumed) resources, and the overall protocol is the composition of several such construction steps. The security proof for the overall protocol follows directly from the composition theorem together with the individual (reasonably simple) security proofs for the modules. Moreover, the actual security statement for the overall protocol is of a standardized form, in terms of a resource, which makes it straightforward to use the protocol in a higher-level context, with the overall security proof again following from the composition theorem.

In this paper, we provide such a constructive treatment of TLS. We provide a deconstruction of TLS into modular steps and a security proof for each step which, compared to previous work, yields the above-mentioned advantages. For the key-exchange step in particular, we analyze the RSA-based and both Diffie-Hellman-based variants (with static and ephemeral server key) under a non-randomizability assumption for RSA-PKCS and the Gap Diffie-Hellman assumption, respectively; in all cases we make use of random oracles. In general, since the design of TLS is not modular, the constructive decomposition is less fine-grained than one might wish and than it would be for a modular design. This paper therefore also offers new insights into the intrinsic problems incurred by a non-modular protocol design such as that of TLS.

*19:17* [Pub][ePrint]
Online/Offline Attribute-Based Encryption, by Susan Hohenberger and Brent Waters
Attribute-based encryption (ABE) is a type of public key encryption that allows users to encrypt and decrypt messages based on user attributes. For instance, one can encrypt a message to any user satisfying the boolean formula (``crypto conference attendee'' AND ``PhD student'') OR ``IACR member''. One drawback is that encryption and key-generation computational costs scale with the complexity of the access policy or the number of attributes. In practice, this makes encryption and user key generation a possible bottleneck for some applications.

To address this problem, we develop new techniques for ABE that split the computation for these algorithms into two phases: a preparation phase that does the vast majority of the work to encrypt a message or create a secret key *before* it knows the message or the attribute list/access control policy that will be used (or even the size of the list or policy). A second phase can then rapidly assemble an ABE ciphertext or key once the specifics become known. This concept is sometimes called ``online/offline'' encryption when only the message is unknown during the preparation phase; we note that the addition of unknown attribute lists and access policies makes ABE significantly more challenging.

One motivating application for this technology is mobile devices: the preparation work can be performed while the phone is plugged into a power source, then it can later rapidly perform ABE operations on the move without significantly draining the battery.
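The two-phase split described above can be illustrated in miniature. The sketch below is *not* the Hohenberger-Waters ABE construction; it is a toy ElGamal-style example (with hypothetical parameters) of the simpler ``online/offline'' setting where only the message is unknown offline: all modular exponentiations happen in the preparation phase, and the online phase is a single modular multiplication.

```python
# Toy ElGamal-style sketch of an online/offline encryption split.
# Hypothetical parameters for illustration; NOT the paper's ABE scheme.
import secrets

p = 2**127 - 1   # Mersenne prime used as a toy modulus
g = 3            # illustrative group element

sk = secrets.randbelow(p - 2) + 1   # secret key
pk = pow(g, sk, p)                  # public key

def offline_prepare(pk):
    """Heavy work done before the message is known:
    pick randomness and precompute both exponentiations."""
    r = secrets.randbelow(p - 2) + 1
    return {"c1": pow(g, r, p), "mask": pow(pk, r, p)}

def online_encrypt(prep, m):
    """Cheap work once the message arrives: one modular multiplication."""
    return (prep["c1"], (m * prep["mask"]) % p)

def decrypt(sk, ct):
    c1, c2 = ct
    # c2 * c1^{-sk} mod p, with the inverse via Fermat's little theorem
    return (c2 * pow(c1, p - 1 - sk, p)) % p

prep = offline_prepare(pk)      # e.g. while the phone is charging
ct = online_encrypt(prep, 42)   # later, on the move
assert decrypt(sk, ct) == 42
```

The paper's contribution is much stronger than this sketch: there, the attribute list or access policy (and even its size) is also unknown during preparation, which is what makes the ABE setting significantly harder.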

*19:17* [Pub][ePrint]
Solving Random Subset Sum Problem by $l_{p}$-norm SVP Oracle, by Gengran Hu and Yanbin Pan and Feng Zhang
It is well known from Lagarias and Odlyzko that almost all random subset sum instances with density less than 0.6463... can be solved with an $l_{2}$-norm SVP oracle. Later, Coster \emph{et al.} improved the bound to 0.9408... by using a different lattice. In this paper, we generalize this classical result to the $l_p$-norm. More precisely, we show that for $p\in \mathbb{Z}^{+}$, an $l_p$-norm SVP oracle can be used to solve almost all random subset sum instances with density bounded by $\delta_p$, where $\delta_1=0.5761$ and $\delta_p = 1/(\frac{1}{2^p}\log_2(2^{p+1}-2)+\log_2(1+\frac{1}{(2^p-1)(1-(\frac{1}{2^{p+1}-2})^{2^p-1})}))$ for $p\geq 3$ (asymptotically, $\delta_p\approx 2^p/(p+2)$). Since $\delta_p$ increases to infinity as $p$ tends to infinity, an $l_p$-norm SVP oracle with bigger $p$ can solve more subset sum instances. An interesting phenomenon is that an $l_p$-norm SVP oracle with $p\geq 3$ can help solve almost all random subset sum instances with density one, which are thought to be the most difficult instances.
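The density bound $\delta_p$ can be evaluated numerically. A minimal sketch, assuming the parenthesization of the formula as stated above, checks the abstract's claims: $\delta_3$ already exceeds density one, $\delta_p$ increases with $p$, and it tracks the asymptotic $2^p/(p+2)$.

```python
# Evaluate the density bound delta_p (valid for p >= 3) and compare it
# with the stated asymptotic 2^p / (p + 2).
from math import log2

def delta(p):
    a = log2(2**(p + 1) - 2) / 2**p
    b = log2(1 + 1 / ((2**p - 1)
                      * (1 - (1 / (2**(p + 1) - 2))**(2**p - 1))))
    return 1 / (a + b)

for p in (3, 5, 8, 12):
    print(p, delta(p), 2**p / (p + 2))

# delta_3 exceeds 1: an l_3-norm SVP oracle already covers almost all
# density-one instances.
assert delta(3) > 1
# delta_p grows monotonically with p ...
assert all(delta(p + 1) > delta(p) for p in range(3, 20))
# ... and stays close to the asymptotic 2^p/(p+2) for larger p.
assert abs(delta(20) * 22 / 2**20 - 1) < 0.15
```

For $p=3$ the formula gives roughly $\delta_3 \approx 1.5$, consistent with the asymptotic estimate $2^3/5 = 1.6$.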