## IACR News

Here you can see all recent updates to the IACR webpage.

#### 27 January 2020

###### University of Lyon, CNRS, Saint-Etienne, France - Laboratoire Hubert Curien
Job Posting
The Hubert Curien laboratory is a joint research unit of the University of Lyon, Saint-Etienne, and the French National Centre for Scientific Research (CNRS). Its Secure Embedded Systems & Hardware Architectures (SESAM) group is one of the leading European research groups in hardware security. The SESAM group explores three main aspects of hardware security:
• random number generation and physical unclonable function implementation in logic devices, including design, characterization, test and security evaluation;
• the design of hardware architectures resistant to passive and active physical attacks;
• the security of systems on chip (SoC).

The group offers a post-doc research position to work on one of these three aspects of hardware security. We are looking for an excellent candidate with a PhD and a track record in hardware security. The position will start as soon as possible and run until the end of March 2021 (fixed end date); it may be extended for one more year depending on the scientific output of the first year.

Closing date for applications:

Contact: To apply please send to Prof. L. Bossuet your detailed CV (with publication list), motivation for applying (1 page) and names of at least two people who can provide reference letters (e-mail).

#### 26 January 2020

###### Eman Salem Alashwali, Pawel Szalachowski, Andrew Martin
ePrint Report
If two or more identical HTTPS clients, located at different geographic locations (regions), make an HTTPS request to the same domain (e.g. example.com), on the same day, will they receive the same HTTPS security guarantees in response? Our results give evidence that this is not always the case. We conduct scans for the top 250,000 most visited domains on the Internet, from clients located in five different regions: Australia, Brazil, India, the UK, and the US. Our scans gather data from both the application (URLs and HTTP headers) and transport (servers' selected TLS version, ciphersuite, and certificate) layers. Overall, we find that HTTPS inconsistencies at the application layer are higher than those at the transport layer. We also find that HTTPS security inconsistencies are strongly related to URL and IP diversity among regions, and to a lesser extent to the presence of redirections. Further manual inspection shows that there are several reasons behind URL diversity among regions, such as downgrading to the plain-HTTP protocol, using different subdomains, different TLDs, or different home page documents. Furthermore, we find that downgrading to plain-HTTP is related to websites' regional blocking. We also provide attack scenarios that show how an attacker can benefit from HTTPS security inconsistencies, and introduce a new attack scenario which we call the "region confusion" attack. Finally, based on our observations, we draw some recommendations, including the need for testing tools that help domain administrators and users mitigate and detect regional domain inconsistencies, standardising the format of regional domains with the (domain) same-origin policy in mind, standardising secure URL redirections, and avoiding redirections whenever possible.
###### Kentaro Tamura, Yutaka Shikano
ePrint Report
Quantum random number generators (QRNGs) produce theoretically unpredictable random numbers. A typical QRNG is implemented in quantum optics [Herrero-Collantes, M., Garcia-Escartin, J. C.: Quantum Random Number Generators. Rev. Mod. Phys. \textbf{89}, 015004 (2017)]. Quantum computers become QRNGs when given certain programs. The simplest example of such a program applies the Hadamard gate on all qubits and performs measurement. As a result of repeatedly running this program on a 20-qubit superconducting quantum computer (IBM 20Q Tokyo), we obtained a sample with a length of 43,560. However, statistical analysis showed that this sample was biased and correlated. One of the post-processed samples passed statistical tests. To show the effectiveness of post-processing, a larger sample size is required. The present study of quantum random number generation and statistical testing may provide a potential candidate for benchmarking tests of actual quantum computing devices.
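The quantum program described above (a Hadamard gate on every qubit, then measurement) is easy to model classically. The following Python sketch is a minimal, idealized statevector simulation, not IBM's hardware or any real quantum SDK; all names are illustrative. On a noiseless device every n-bit outcome is equally likely, which is exactly the property the hardware sample failed to exhibit.

```python
import random
from math import sqrt

def hadamard_all_statevector(n_qubits: int) -> list:
    """State after applying H to every qubit of |0...0>: the uniform superposition."""
    dim = 2 ** n_qubits
    return [1 / sqrt(dim)] * dim

def measure(state, rng=random):
    """Sample one basis state with probability |amplitude|^2 (ideal measurement)."""
    probs = [a * a for a in state]
    return rng.choices(range(len(state)), weights=probs, k=1)[0]

# 20 ideal qubits would yield 20 uniformly random bits per run; real devices
# (like the IBM 20Q Tokyo sample above) deviate from uniformity due to noise.
state = hadamard_all_statevector(4)
bits = format(measure(state), "04b")
```

Comparing this ideal uniform distribution against a device's empirical one is, in essence, what the paper's statistical testing does.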
###### Thomas Häner, Samuel Jaques, Michael Naehrig, Martin Roetteler, Mathias Soeken
ePrint Report
We present improved quantum circuits for elliptic curve scalar multiplication, the most costly component in Shor's algorithm to compute discrete logarithms in elliptic curve groups. We optimize low-level components such as reversible integer and modular arithmetic through windowing techniques and more adaptive placement of uncomputing steps, and improve over previous quantum circuits for modular inversion by reformulating the binary Euclidean algorithm. Overall, we obtain an affine Weierstrass point addition circuit that has lower depth and uses fewer T gates than previous circuits. While previous work mostly focuses on minimizing the total number of qubits, we present various trade-offs between different cost metrics including the number of qubits, circuit depth and T-gate count. Finally, we provide a full implementation of point addition in the Q# quantum programming language that allows unit tests and automatic quantum resource estimation for all components.
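Windowing, one of the low-level optimizations mentioned above, trades a small precomputed table for fewer group additions. The Python sketch below shows the classical fixed-window method over an abstract additive group (instantiated here with plain integers, so `add` and `double` are ordinary arithmetic); it illustrates the technique only and is not the paper's reversible quantum circuit.

```python
def windowed_scalar_mult(k, P, add, double, identity, w=4):
    """Compute [k]P with a fixed window of w bits.

    Precomputes [0]P .. [2^w - 1]P, then processes k one w-bit
    digit at a time from the most significant window down.
    """
    table = [identity]
    for _ in range(2 ** w - 1):
        table.append(add(table[-1], P))

    result = identity
    nbits = max(k.bit_length(), 1)
    nwindows = -(-nbits // w)  # ceiling division
    for i in reversed(range(nwindows)):
        for _ in range(w):               # shift accumulator by w bits
            result = double(result)
        digit = (k >> (i * w)) & (2 ** w - 1)
        result = add(result, table[digit])
    return result
```

With integers standing in for curve points, `windowed_scalar_mult(k, P, ...)` simply computes `k * P`; on a curve, `add`/`double` would be point operations and the table would hold small multiples of the base point.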
###### Charbel Saliba, Laura Luzzi, Cong Ling
ePrint Report
We consider a key encapsulation mechanism (KEM) based on ring-LWE where reconciliation is performed on an $N$-dimensional lattice using Wyner-Ziv coding. More precisely, we consider Barnes-Wall lattices and use Micciancio and Nicolosi's bounded distance decoder with polynomial complexity $\mathcal{O}(N \log^2(N))$. We show that in the asymptotic regime for large $N$, the achievable key rate is $\Theta(\log N)$ bits per dimension, while the error probability $P_e$ vanishes exponentially in $N$. Unlike previous works, our scheme does not require a dither.
###### Rishiraj Bhattacharyya
ePrint Report
The efficiency of a black-box reduction is an important goal of modern cryptography. Traditionally, time complexity and success probability were considered the main measures of efficiency. In CRYPTO 2017, Auerbach et al. introduced the notion of memory-tightness in cryptographic reductions and showed a memory-tight reduction of the existential unforgeability of the RSA-FDH signature scheme. Unfortunately, their techniques do not extend directly to reductions involving intricate RO-programming. The problem seems to be inherent, as all other existing results on memory-tightness are lower bounds and impossibility results. In fact, Auerbach et al. conjectured that a memory-tight reduction for the IND-CCA security of Hashed-ElGamal KEM is impossible.

- We refute the above conjecture. Using a simple RO simulation technique, we provide memory-tight reductions of the IND-CCA security of the Cramer-Shoup and the ECIES versions of Hashed-ElGamal KEM.

- We prove memory-tight reductions for different variants of the Fujisaki-Okamoto transformation. We analyze the modular transformations introduced by Hofheinz, Hövelmanns and Kiltz (TCC 2017). In addition to the constructions involving implicit rejection, we present a memory-tight reduction for the IND-CCA security of the transformation ${\mbox{QFO}_m^\perp}$. Our techniques can withstand correctness errors and are applicable to several lattice-based KEM candidates.
###### Daniel R. L. Brown
ePrint Report
A nothing-up-my-sleeve number is a cryptographic constant, such as a field size in elliptic curve cryptography, with qualities to assure users against subversion of the number by the system designer. A number with low Kolmogorov descriptional complexity resists being subverted to the extent that the speculated subversion would leave a trace that cannot be hidden within the short description. The roll programming language, a version of Gödel's 1930s definition of computability, can somewhat objectively quantify low descriptional complexity, a nothing-up-my-sleeve quality, of a number. For example, $(2^{127}-1)^2$ and $2^{255}-19$ can be described with roll programs of 58 and 84 words, respectively.
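For concreteness, the two constants from the abstract can be reproduced from their short arithmetic descriptions in a couple of lines of Python; the word counts quoted above refer to the roll language, not to this sketch.

```python
# Each constant is built from a very short expression,
# yet is a full-size cryptographic number.
p_mersenne_sq = (2**127 - 1) ** 2   # square of the Mersenne prime 2^127 - 1 (254 bits)
p_25519 = 2**255 - 19               # the Curve25519 field size (255 bits)
```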
###### Fabio Banfi, Ueli Maurer
ePrint Report
We study anonymity of probabilistic encryption (pE) and probabilistic authenticated encryption (pAE). We start by providing concise game-based security definitions capturing anonymity for both pE and pAE, and then show that the commonly used notion of indistinguishability from random ciphertexts (IND\$) indeed implies the anonymity notions for both pE and pAE. This is in contrast to a recent work of Chan and Rogaway (Asiacrypt 2019), where it is shown that IND\$-secure nonce-based authenticated encryption can only achieve anonymity if a sophisticated transformation is applied. Moreover, we also show that the Encrypt-then-MAC paradigm is anonymity-preserving, in the sense that if both the underlying probabilistic MAC (pMAC) and pE schemes are anonymous, then also the resulting pAE scheme is. Finally, we provide a composable treatment of anonymity using the constructive cryptography framework of Maurer and Renner (ICS 2011). We introduce adequate abstractions modeling various kinds of anonymous communication channels for many senders and one receiver in the presence of an active man-in-the-middle adversary. Then we show that the game-based notions indeed are anonymity-preserving, in the sense that they imply constructions between such anonymous channels, thus generating authenticity and/or confidentiality as expected, but crucially retaining anonymity if present.
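The Encrypt-then-MAC composition discussed above is easy to state in code. The Python sketch below composes a toy hash-based stream cipher with HMAC-SHA256; it illustrates only the order of operations (encrypt, then MAC the ciphertext, and verify the MAC before decrypting) and is not a vetted scheme or the paper's formal model.

```python
import hmac, hashlib, os

def _keystream(key: bytes, nonce: bytes, length: bytes) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustrative only)."""
    out, ctr = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:length]

def encrypt_then_mac(ke: bytes, km: bytes, msg: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(msg, _keystream(ke, nonce, len(msg))))
    body = nonce + ct
    tag = hmac.new(km, body, hashlib.sha256).digest()   # MAC the ciphertext
    return body + tag

def decrypt(ke: bytes, km: bytes, blob: bytes) -> bytes:
    body, tag = blob[:-32], blob[-32:]
    if not hmac.compare_digest(hmac.new(km, body, hashlib.sha256).digest(), tag):
        raise ValueError("MAC verification failed")      # verify before decrypting
    nonce, ct = body[:16], body[16:]
    return bytes(a ^ b for a, b in zip(ct, _keystream(ke, nonce, len(ct))))
```

In the paper's terms, anonymity preservation means that if neither the ciphertext nor the tag leaks which key pair (and hence which sender) produced them, the composition leaks nothing either.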

#### 23 January 2020

###### Ben Kreuter, Tancrede Lepoint, Michele Orru, Mariana Raykova
ePrint Report
We present a cryptographic construction for anonymous tokens with a private metadata bit, a primitive that enables an issuer to provide a user with anonymous trust tokens that can embed a single private metadata bit, which is accessible only to the party who holds the secret authority key and is private with respect to anyone else. Our construction provides the properties of unforgeability, unlinkability, and privacy of the metadata bit.
###### Dimitrios Sikeridis, Panos Kampanakis, Michael Devetsikiotis
ePrint Report
The potential development of large-scale quantum computers is raising concerns among IT and security research professionals due to their ability to solve (elliptic curve) discrete logarithm and integer factorization problems in polynomial time. All currently used public key algorithms would be deemed insecure in a post-quantum (PQ) setting. In response, the National Institute of Standards and Technology (NIST) has initiated a process to standardize quantum-resistant crypto algorithms, focusing primarily on their security guarantees. Since PQ algorithms differ significantly from classical ones, their overall evaluation should not be performed out-of-context. This work presents a detailed performance evaluation of the NIST signature algorithm candidates and investigates the imposed latency on TLS 1.3 connection establishment under realistic network conditions. In addition, we investigate their impact on TLS session throughput and analyze the trade-off between lengthy PQ signatures and computationally heavy PQ cryptographic operations.

Our results demonstrate that the adoption of at least two PQ signature algorithms would be viable with little additional overhead over current signature algorithms. Also, we argue that many NIST PQ candidates can effectively be used for less time-sensitive applications, and provide an in-depth discussion on the integration of PQ authentication in encrypted tunneling protocols, along with the related challenges, improvements, and alternatives. Finally, we propose and evaluate the combination of different PQ signature algorithms across the same certificate chain in TLS. Results show a reduction in TLS handshake time and a significant increase in a server's TLS tunnel connection rate compared to using a single PQ signature scheme.
###### Thomas Agrikola, Dennis Hofheinz, Julia Kastner
ePrint Report
We provide a standard-model implementation (of a relaxation) of the algebraic group model (AGM, [Fuchsbauer, Kiltz, Loss, CRYPTO 2018]). Specifically, we show that every algorithm that uses our group is algebraic, and hence "must know" a representation of its output group elements in terms of its input group elements. Here, "must know" means that a suitable extractor can extract such a representation efficiently. We stress that our implementation relies only on falsifiable assumptions in the standard model, and in particular does not use any knowledge assumptions.

As a consequence, our group allows us to transport a number of results obtained in the AGM into the standard model, under falsifiable assumptions. For instance, we show that in our group, several Diffie-Hellman-like assumptions (including computational Diffie-Hellman) are equivalent to the discrete logarithm assumption. Furthermore, we show that our group allows us to prove the Schnorr signature scheme tightly secure in the random oracle model.

Our construction relies on indistinguishability obfuscation, and hence should not be considered as a practical group itself. However, our results show that the AGM is a realistic computational model (since it can be instantiated in the standard model), and that results obtained in the AGM are also possible with standard-model groups.
ePrint Report
A blockchain is redactable if a private key holder (e.g. a central authority) can change any single block without violating integrity of the whole blockchain, but no other party can do that. In this paper, we offer a simple method of constructing redactable blockchains inspired by the ideas underlying the well-known RSA encryption scheme. Notably, our method can be used in conjunction with any reasonable hash function that is used to build a blockchain. Public immutability of a blockchain in our construction is based on the computational hardness of the RSA problem and not on properties of the underlying hash function. Corruption resistance is based on the computational hardness of the discrete logarithm problem.
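The abstract does not give the construction, but the general flavor of RSA-based redaction can be illustrated with a chameleon hash in the style of Ateniese and de Medeiros: anyone can verify the hash, while only the holder of the RSA trapdoor can find a colliding randomizer for a modified block. The Python sketch below uses insecure toy parameters, and the function names are hypothetical; it is not the paper's scheme.

```python
import hashlib
from math import gcd

# Toy RSA parameters (insecure sizes; illustration only)
p, q = 10007, 10009                  # two small known primes
N = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))    # the trapdoor (Python 3.8+ modular inverse)

def msg_to_group(m: bytes) -> int:
    """Map a message into Z_N^* (toy mapping)."""
    h = int.from_bytes(hashlib.sha256(m).digest(), "big") % N
    while gcd(h, N) != 1:
        h += 1
    return h

def chameleon_hash(m: bytes, r: int) -> int:
    """Publicly computable: H(m, r) = J(m) * r^e mod N."""
    return (msg_to_group(m) * pow(r, e, N)) % N

def redact(m_old: bytes, r_old: int, m_new: bytes) -> int:
    """Trapdoor holder computes r_new so the hash value is unchanged."""
    v = chameleon_hash(m_old, r_old)
    target = (v * pow(msg_to_group(m_new), -1, N)) % N
    return pow(target, d, N)         # an e-th root mod N: hard without d
```

Embedding such a hash in each block link yields the redaction property sketched above: the authority can rewrite a block and repair the link, while for everyone else finding a collision is as hard as the RSA problem.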
###### Pranab Chakraborty, Subhamoy Maitra
ePrint Report
In this note we provide a theoretical argument towards an unsolved question related to Mantin's Digraph Repetition Bias (2005), which is observed in the keystream of RC4. The open question, which stems from the observation that the arrival of four consecutive identical bytes in the RC4 keystream is slightly negatively biased, was posed by Bricout et al. [Des. Codes Cryptogr. (2018) 86:743-770] in 2016.
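For readers who want to observe the keystream in question, RC4 itself is only a few lines of Python. The sketch below implements the standard key-scheduling algorithm (KSA) and pseudo-random generation algorithm (PRGA); Mantin's digraph bias and the four-identical-bytes statistic are properties of the bytes this generator outputs.

```python
def rc4_keystream(key: bytes, n: int) -> list:
    """Standard RC4: KSA followed by n bytes of PRGA output."""
    S = list(range(256))
    j = 0
    for i in range(256):                          # KSA: key-dependent shuffle
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = []
    for _ in range(n):                            # PRGA: one output byte per step
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out
```

The output matches the well-known test vector for the key "Key"; counting repeated digraphs (or runs of four equal bytes) over a long keystream reproduces the biases the note discusses.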
###### Taylor R Campbell
ePrint Report
We present Daence, a deterministic authenticated cipher based on a pseudorandom function family and a universal hash family, similar to SIV. We recommend instances with Salsa20 or ChaCha, and Poly1305, for high performance, high security, and easy deployment.
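The SIV-style composition of a PRF and an encryption layer can be sketched generically. The Python below uses HMAC-SHA256 for both the PRF and the keystream; it mirrors the synthetic-IV structure (a tag derived deterministically from the message, then used as the nonce) but is not Daence itself, whose recommended instances use Salsa20/ChaCha and Poly1305.

```python
import hmac, hashlib

def _prf(key: bytes, data: bytes, n: int = 32) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()[:n]

def siv_encrypt(k_mac: bytes, k_enc: bytes, msg: bytes) -> bytes:
    siv = _prf(k_mac, msg, 16)          # synthetic IV: a PRF over the message
    ks, ctr = b"", 0
    while len(ks) < len(msg):           # keystream derived from the SIV
        ks += _prf(k_enc, siv + ctr.to_bytes(8, "big"))
        ctr += 1
    ct = bytes(a ^ b for a, b in zip(msg, ks))
    return siv + ct

def siv_decrypt(k_mac: bytes, k_enc: bytes, blob: bytes) -> bytes:
    siv, ct = blob[:16], blob[16:]
    ks, ctr = b"", 0
    while len(ks) < len(ct):
        ks += _prf(k_enc, siv + ctr.to_bytes(8, "big"))
        ctr += 1
    msg = bytes(a ^ b for a, b in zip(ct, ks))
    if not hmac.compare_digest(_prf(k_mac, msg, 16), siv):
        raise ValueError("authentication failed")
    return msg
```

Because the IV is a deterministic function of the message, encrypting the same message twice yields the same ciphertext, which is exactly the deterministic-AE behavior (and the reason SIV-style schemes tolerate nonce misuse).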
###### Raymond Cheng, William Scott, Elisaweta Masserova, Irene Zhang, Vipul Goyal, Thomas Anderson, Arvind Krishnamurthy, Bryan Parno
ePrint Report
Talek is a private group messaging system that sends messages through potentially untrustworthy servers, while hiding both data content and the communication patterns among its users. Talek explores a new point in the design space of private messaging; it guarantees access sequence indistinguishability, which is among the strongest guarantees in the space, while assuming an anytrust threat model, which is only slightly weaker than the strongest threat model currently found in related work. Our results suggest that this is a pragmatic point in the design space, since it supports strong privacy and good performance: we demonstrate a 3-server Talek cluster that achieves throughput of 9,433 messages/second for 32,000 active users with 1.7-second end-to-end latency. To achieve its security goals without coordination between clients, Talek relies on information-theoretic private information retrieval. To achieve good performance and minimize server-side storage, Talek introduces new techniques and optimizations that may be of independent interest, e.g., a novel use of blocked cuckoo hashing and support for private notifications. The latter provide a private, efficient mechanism for users to learn, without polling, which logs have new messages.
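Cuckoo hashing, which Talek adapts in blocked form, gives each key two candidate slots and relocates entries on collision. The Python sketch below is the textbook two-choice variant, not Talek's blocked construction; the class and method names are illustrative.

```python
import hashlib

class CuckooTable:
    """Minimal two-choice cuckoo hash table (illustrative only)."""

    def __init__(self, size: int = 64, max_kicks: int = 32):
        self.size = size
        self.max_kicks = max_kicks
        self.slots = [None] * size

    def _h(self, key: str, which: int) -> int:
        """Two independent slot choices per key, derived from SHA-256."""
        digest = hashlib.sha256(bytes([which]) + key.encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.size

    def insert(self, key: str, value) -> bool:
        item = (key, value)
        for _ in range(self.max_kicks):
            for which in (0, 1):
                idx = self._h(item[0], which)
                if self.slots[idx] is None:
                    self.slots[idx] = item
                    return True
            # Both slots full: evict the occupant of the first choice
            # and try to relocate it on the next iteration.
            idx = self._h(item[0], 0)
            self.slots[idx], item = item, self.slots[idx]
        return False  # give up: a real table would resize and rehash

    def get(self, key: str):
        for which in (0, 1):
            entry = self.slots[self._h(key, which)]
            if entry and entry[0] == key:
                return entry[1]
        return None
```

The appeal for a PIR-backed store is that every lookup touches at most two fixed locations, independent of the table's contents.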

#### 21 January 2020

###### Queen's University Belfast, Centre for Secure Information Technologies, Belfast, UK
Job Posting
The Centre for Secure Information Technologies (CSIT) at Queen's University Belfast is seeking motivated PhD students to work on the following research topics:
• Secure IoT devices using Digital Fingerprint
• Investigating Security Vulnerabilities of Approximate Computing
• Practical and Post-quantum IoT security
• Practical Privacy-preserving homomorphic analytics
• Post Quantum Anonymous Credential
• Functional Encryption and its Application

For further information and how to apply, please visit the QUB website for PhD study: http://www.qub.ac.uk/schools/eeecs/Research/PhDStudy/

Closing date for applications:

Contact: Ciara Rafferty: c.m.rafferty@qub.ac.uk

###### Jake Massimo, Kenneth G. Paterson
ePrint Report
Primality testing is a basic cryptographic task. But developers today are faced with complex APIs for primality testing, along with documentation that fails to clearly state the reliability of the tests being performed. This leads to the APIs being incorrectly used in practice, with potentially disastrous consequences. In an effort to overcome this, we present a primality test having a simplest-possible API: the test accepts a number to be tested and returns a Boolean indicating whether the input was composite or probably prime. For all inputs, the output is guaranteed to be correct with probability at least $1 - 2^{-128}$. The test is performant: on random, odd, 1024-bit inputs, it is faster than the default test used in OpenSSL by 17%. We investigate the impact of our new test on the cost of random prime generation, a key use case for primality testing. The OpenSSL developers have adopted our suggestions in full; our new API and primality test are scheduled for release in OpenSSL 3.0.
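A test with the described one-Boolean API can be sketched as Miller-Rabin with enough rounds that the error probability is at most $2^{-128}$ for any input (64 rounds, since each round detects a composite with probability at least 3/4). This Python sketch illustrates only the API shape; the test actually adopted in OpenSSL differs in its internals and performance tuning.

```python
import random

SMALL_PRIMES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_probably_prime(n: int, rounds: int = 64) -> bool:
    """Boolean API: True means 'probably prime' (error < 4^-rounds),
    False means 'definitely composite'."""
    if n < 2:
        return False
    for sp in SMALL_PRIMES:          # trial division screens easy composites
        if n % sp == 0:
            return n == sp
    # Write n - 1 = 2^r * d with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False             # a is a witness: n is composite
    return True
```

The caller never sees base counts, witness lists, or trial-division parameters, which is precisely the misuse-resistance argument of the abstract.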
###### Geng Wang, Ming Wan, Zhen Liu, Dawu Gu
ePrint Report
Dual system encryption is an important method used in pairing-based cryptography for constructing fully secure IBE, ABE and FE schemes. A long-standing open question is whether there is an analogue of the dual system method in lattice-based cryptography that can be used to prove the full security of lattice-based ABE or FE schemes. We solve this problem in this paper.

We do this by introducing a new primitive called approximate inner product encryption (aIPE), which is the approximate version of the well-known inner product encryption. We show that a fully secure ABE supporting CNF formulas as its access policy can be constructed from a selectively secure aIPE and the LWE assumption. We also point out that the functionality of aIPE is included in FE for arbitrary circuits, which can be constructed from the LWE assumption; hence the full security of our scheme can be based entirely on the hardness of LWE.
###### Aurelien Greuet, Simon Montoya, Guenael Renault
ePrint Report
LAC is a Ring Learning With Errors based cryptosystem that has been proposed to the NIST call for post-quantum standardization and passed the first round of the submission process. The particularity of LAC is its use of an error-correcting code, ensuring a high security level with small key sizes and small ciphertext sizes. The LAC team proposes a CPA-secure cryptosystem, LAC.CPA, and a CCA-secure one, LAC.CCA, obtained by applying the Fujisaki-Okamoto transformation to LAC.CPA. In this paper, we study the security of the LAC Key Exchange (KE) mechanism, using LAC.CPA, in a misuse context: when the same secret key is reused for several key exchanges and an active adversary has access to a mismatch oracle. This oracle provides information about a possible mismatch at the end of the KE protocol. In this context, we show that an attacker needs at most $8$ queries to the oracle to retrieve one coefficient of a static secret key. This result has been experimentally confirmed using the reference and optimized implementations of LAC. Since our attack can break the CPA version in a misuse context, the Authenticated KE protocol, based on the CCA version, is not impacted. However, this research provides a tight estimation of LAC's resilience against this type of attack.
###### Behzad Abdolmaleki, Sebastian Ramacher, Daniel Slamanig
ePrint Report
Zero-knowledge proofs and in particular succinct non-interactive zero-knowledge proofs (so-called zk-SNARKs) are getting increasingly used in real-world applications, with cryptocurrencies being the prime example. Simulation extractability (SE) is a strong security notion of zk-SNARKs which informally ensures non-malleability of proofs. This property is acknowledged as being highly important by leading companies in this field such as Zcash, and is supported by various attacks against the malleability of cryptographic primitives in the past. Another problematic issue for the practical use of zk-SNARKs is the requirement of a fully trusted setup, as especially for large-scale decentralized applications finding a trusted party that runs the setup is practically impossible. Quite recently, the study of approaches to relax or even remove the trust in the setup procedure, and in particular subversion as well as updatable zk-SNARKs (with the latter being the most promising approach), has been initiated and has received considerable attention since then. Unfortunately, so far SE-SNARKs with the aforementioned properties have only been constructed in an ad-hoc manner and no generic techniques are available.

In this paper we are interested in such generic techniques and therefore firstly revisit the only available lifting technique due to Kosba et al. (called COCO) to generically obtain SE-SNARKs. By exploring the design space of many recently proposed SNARK- and STARK-friendly symmetric-key primitives, we thereby achieve significant improvements in the prover computation and proof size. Unfortunately, neither the COCO framework nor our improved version (called OCOCO) is compatible with updatable SNARKs. Consequently, we propose a novel generic lifting transformation called Lamassu. It is built using different underlying ideas compared to COCO (and OCOCO). In contrast to COCO, it only requires key-homomorphic signatures (which allow keys to be shifted), covering well-studied schemes such as Schnorr or ECDSA. This makes Lamassu highly interesting, as by using the novel concept of so-called updatable signatures, which we introduce in this paper, we can prove that Lamassu preserves the subversion and in particular updatable properties of the underlying zk-SNARK. This makes Lamassu the first technique to also generically obtain SE subversion and updatable SNARKs. As its performance compares favorably to OCOCO, Lamassu is an attractive alternative that, in contrast to OCOCO, is based only on well-established cryptographic assumptions.