On the Limits of Computational Fuzzy Extractors, by Kenji Yasunaga and Kosuke Yuzawa
Fuller et al. (Asiacrypt 2013) studied computational fuzzy extractors,
and showed, as a negative result, that the existence of a computational ``secure sketch''
implies the existence of an information-theoretically secure sketch with slightly weaker parameters.
In this work, we show a similar negative result: under a certain computational assumption,
the existence of a computational fuzzy extractor also implies the existence of
an information-theoretic fuzzy extractor with slightly weaker parameters.
The assumption is that the generation procedure of the fuzzy extractor can be efficiently inverted.
This result implies that, to circumvent the limitations of information-theoretic fuzzy extractors,
we need to employ computational fuzzy extractors whose generation procedure cannot be efficiently inverted.
A Multi-Function Provable Data Possession Scheme in Cloud Computing, by Xiaojun Yu and Qiaoyan Wen
In order to satisfy the different requirements of provable data possession in cloud computing, a multi-function provable data possession (MF-PDP) scheme is proposed, which supports public verification, dynamic data, an unlimited number of verifications, and sampling verification. Moreover, it is secure in the random oracle model, preserves verification privacy under the semi-trusted model, and can prevent replacing attacks and replay attacks. A detailed design is provided, and theoretical analyses of correctness, security, and performance are also described. Experimental simulation and comparative analysis suggest the scheme's feasibility and advantages.
Adding Controllable Linkability to Pairing-Based Group Signatures For Free, by Daniel Slamanig and Raphael Spreitzer and Thomas Unterluggauer
Group signatures, which allow users of a group to anonymously produce signatures on behalf of the group, are an important cryptographic primitive for privacy-enhancing applications. Over the years, various approaches to enhanced anonymity management mechanisms, which extend the standard feature of opening of group signatures, have been proposed.
In this paper we show how pairing-based group signature schemes (PB-GSSs) based on the sign-and-encrypt-and-prove (SEP) paradigm can be generically transformed in order to support one particular enhanced anonymity management mechanism, i.e., we propose a transformation that turns every such PB-GSS into a PB-GSS with controllable linkability. Basically, this transformation replaces the public key encryption scheme used for identity escrow within a group signature scheme with a modified all-or-nothing public key encryption with equality tests scheme (denoted AoN-PKEET$^*$) instantiated from the respective public key encryption scheme. Thereby, the respective trapdoor is given to the linking authority as a linking key. The appealing benefit of this approach in contrast to other anonymity management mechanisms (such as those provided by traceable signatures) is that controllable linkability can be added to PB-GSSs based on the SEP paradigm for free, i.e., it neither influences the signature size nor the computational costs for signers and verifiers in comparison to the scheme without this feature.
DTKI: a new formalized PKI with no trusted parties, by Jiangshan Yu and Vincent Cheval and Mark Ryan
The security of public key validation protocols for web-based applications has recently attracted attention because of weaknesses in the certificate authority model, and consequent attacks.
Recent proposals using public logs have succeeded in making certificate management more transparent and verifiable. However, those proposals involve a fixed set of authorities which create a monopoly, and they have heavy reliance on trusted parties that monitor the logs.
We propose a distributed transparent key infrastructure (DTKI), which greatly reduces the monopoly of service providers and removes the reliance on trusted parties. In addition, this paper formalises the public log data structure and provides a formal analysis of the security that DTKI guarantees.
Adaptive versus Static Security in the UC Model, by Ivan Damgård and Jesper Buus Nielsen
We show that for a certain class of unconditionally secure protocols and
target functionalities, static security implies adaptive security in the UC
model. Similar results were previously only known for models with
weaker security and/or composition guarantees. The result is, for
instance, applicable to a wide range of protocols based on secret
sharing. It ``explains'' why an often-used proof technique for such
protocols works, namely where the simulator runs in its head a copy of
the honest players using dummy inputs and generates a protocol
execution by letting the dummy players interact with the
adversary. When a new player $P_i$ is corrupted, the simulator
adjusts the state of its dummy copy of $P_i$ to be consistent with
the real inputs and outputs of $P_i$ and gives the state to the
adversary. Our result gives a characterisation of the cases where this
idea will work to prove adaptive security. As a special case,
we use our framework to give the first proof of adaptive security
of the seminal BGW protocol in the UC framework.
A Cryptographic Study of Tokenization Systems, by Sandra Díaz-Santiago and Lil María Rodríguez-Henríquez and Debrup Chakraborty
Payments through cards have become very popular in today's world. All businesses now have options to receive payments through this instrument; moreover, most organizations store the card information of their customers in
some way to enable easy payments in the future. Credit card data is very sensitive information, and theft of this data is a serious threat to any company. Any organization that stores credit card data needs to achieve payment card industry (PCI) compliance, which is an intricate process in which the organization needs to demonstrate that the data it stores is safe. Recently there has been a paradigm shift in the treatment of the problem of storing payment card information. In this new paradigm, instead of the real credit card data, a token is stored; this process is called ``tokenization''. The token resembles the
credit/debit card number but is in no way related to it. This solution relieves the merchant from the burden of PCI compliance in several ways.
Though tokenization systems are widely in use, to our knowledge a formal cryptographic study of this problem has not yet been done. In this paper we initiate a study in this direction. We formally define the syntax of a tokenization system, and several notions of security for such systems. Finally, we provide some constructions of tokenizers and analyze their security in the light of our definitions.
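To make the idea concrete, the following is a minimal vault-style tokenizer sketch: each card number is replaced by a fresh random digit string of the same length, and the mapping is kept in a secret table so the token is in no way derived from the card number itself. The class name and structure here are illustrative assumptions, not one of the paper's analyzed constructions, and a real system would protect the vault and enforce format rules such as Luhn validity.

```python
import secrets

class ToyTokenizer:
    """Toy vault-style tokenizer for illustration only.

    Tokens are uniformly random digit strings of the same length as the
    card number (PAN), stored in a secret lookup table; nothing about the
    token is computable from the PAN without the table.
    """

    def __init__(self):
        self._vault = {}    # token -> PAN (must be stored securely)
        self._issued = {}   # PAN -> token, so the same PAN reuses its token

    def tokenize(self, pan: str) -> str:
        if pan in self._issued:
            return self._issued[pan]
        while True:
            # Random token matching the PAN's length and digit format.
            token = "".join(secrets.choice("0123456789") for _ in pan)
            if token not in self._vault and token != pan:
                break
        self._vault[token] = pan
        self._issued[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a party holding the vault can recover the original PAN.
        return self._vault[token]
```

A merchant would store and process only the token; the vault (and thus PCI scope) is confined to the tokenization service that can run `detokenize`.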