Self-healing key distribution schemes allow group managers to broadcast session keys to dynamic groups of users over unreliable channels. The main property of such schemes is that, if during a certain session some broadcast packet gets lost, users can still recover the session key for that session simply by using the packets they received during a previous session and the packets they receive at the beginning of a subsequent one, without requesting additional transmissions from the group manager. Such schemes are quite suitable for supporting secure communication in wireless networks and mobile wireless ad-hoc networks. In this paper we present a lower bound on the randomness required for implementing self-healing key distribution schemes. We also show that the lower bound is tight by describing a self-healing scheme meeting it.
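The self-healing property can be illustrated with a minimal toy construction (not the paper's actual scheme, and ignoring access control): each session key K_j is split as a_j XOR b_j, the broadcast in session i carries the "forward" shares a_1..a_i and the "backward" shares b_i..b_m, so a user who received any earlier and any later broadcast can reassemble a missed key.

```python
import secrets

def make_sessions(m):
    # Group manager pre-generates per-session shares; session key
    # K_j = a_j XOR b_j (toy construction for illustration only).
    a = [secrets.token_bytes(16) for _ in range(m)]
    b = [secrets.token_bytes(16) for _ in range(m)]
    return a, b

def broadcast(a, b, j):
    # Session j's broadcast: forward shares a_0..a_j, backward shares b_j..b_{m-1}.
    return a[:j + 1], b[j:]

def xor(x, y):
    return bytes(p ^ q for p, q in zip(x, y))

m = 5
a, b = make_sessions(m)
# User misses session 2 but received sessions 1 and 3.
a1, b1 = broadcast(a, b, 1)        # earlier broadcast: contains b_2 at offset 2-1
a3, b3 = broadcast(a, b, 3)        # later broadcast: contains a_2
recovered = xor(a3[2], b1[2 - 1])  # self-heal the missed session key
assert recovered == xor(a[2], b[2])
```

The cost of this naive version is broadcasts that grow linearly with the number of sessions; bounding such overhead (and the manager's randomness) is exactly what the paper's lower bound addresses.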
We address authenticated encryption, the cryptographic technique that simultaneously provides both confidentiality and authenticity, and steganography, which hides the very existence of messages. We focus on the security of these techniques against adversaries with unbounded computational resources. First, we present a strong security notion of unconditionally secure authenticated encryption and show how to achieve it. Second, we study unconditionally secure stegosystems under active attacks and show how to construct them from the above-mentioned unconditionally secure authenticated encryption schemes.
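A standard way to obtain unconditional confidentiality and authenticity together, sketched below, is to combine a one-time pad with a one-time (Carter–Wegman style) MAC over a prime field. This is a generic textbook construction offered for intuition, not necessarily the scheme the paper constructs; the prime and block size are arbitrary choices.

```python
import secrets

P = 2**61 - 1  # Mersenne prime; messages are integers mod P

def keygen():
    pad = secrets.randbelow(P)        # one-time pad (confidentiality)
    mac_a = secrets.randbelow(P - 1) + 1  # one-time MAC keys (authenticity)
    mac_b = secrets.randbelow(P)
    return pad, mac_a, mac_b

def encrypt(key, m):
    pad, a, b = key
    c = (m + pad) % P          # perfectly hiding: c is uniform
    tag = (a * c + b) % P      # any forged (c', tag') passes with prob 1/P
    return c, tag

def decrypt(key, c, tag):
    pad, a, b = key
    if (a * c + b) % P != tag:
        raise ValueError("forgery detected")
    return (c - pad) % P

key = keygen()
c, t = encrypt(key, 42)
assert decrypt(key, c, t) == 42
```

Both guarantees hold against computationally unbounded adversaries, at the usual price: the key is as long as the message and must never be reused.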
Randomness extractors (Nisan and Zuckerman, 1996) allow one to obtain nearly perfect randomness from highly imperfect sources of randomness, which are only known to contain "scattered" entropy. Not surprisingly, such extractors have found numerous applications in many areas of computer science, including cryptography. Aside from extracting randomness, a less well-known usage of extractors comes from the fact that they hide all deterministic functions of their (high-entropy) input (Dodis and Smith, 2005): in other words, extractors provide a certain level of privacy for the imperfect source that they use. In the latter kind of application, one typically needs extra properties of extractors, such as invertibility, collision resistance or error correction. In this abstract we survey some such usages of extractors, concentrating on several recent results by the author (Dodis et al., 2004 and Dodis and Smith, 2005). The primitives we survey include several flavors of randomness extractors, entropically secure encryption and perfect one-way hash functions. The main technical tools include several variants of the leftover hash lemma, error-correcting codes, and the connection between randomness extraction and hiding all partial information. Due to space constraints, many important references and results are not mentioned here; the interested reader can find them in the works of Dodis et al. (2004) and Dodis and Smith (2005).
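As a concrete instance of the leftover hash lemma mentioned above: hashing a weak sample with a randomly seeded pairwise-independent function h(x) = (a·x + b) mod p, then truncating the output, yields bits that are statistically close to uniform whenever the source has enough min-entropy. The sketch below is illustrative; the modulus, output length, and sample are arbitrary choices.

```python
import secrets

P = 2**127 - 1  # prime modulus; h(x) = (a*x + b) mod P is pairwise independent

def extract(x, seed, out_bits=64):
    # Seeded extractor: apply the universal hash, keep out_bits of the result.
    # By the leftover hash lemma, the output is near-uniform if the
    # min-entropy of x exceeds out_bits by a sufficient margin.
    a, b = seed
    return ((a * x + b) % P) & ((1 << out_bits) - 1)

seed = (secrets.randbelow(P - 1) + 1, secrets.randbelow(P))
weak_sample = 0xDEADBEEF_00000000_00000000  # stands in for a high-entropy source
r = extract(weak_sample, seed)
assert 0 <= r < 2**64
```

The privacy-style applications surveyed in the abstract rely on the dual reading of the same lemma: the extractor's output reveals essentially nothing about any fixed function of the input.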
Long-term security is achieved by protocols which use computational assumptions only during the execution of the protocol and become information-theoretically secure afterwards. Coin flipping protocols and zero-knowledge arguments are examples of protocols achieving long-term security. In this work we consider this regime between computational security and information-theoretic security. A security model for long-term security is sketched, and the class K of all two-argument functions which can be computed with long-term security is characterised. Furthermore, it is shown that the class Q of all two-argument functions which can be computed using quantum cryptography is strictly contained in K. The characterisation of K is a generalisation of a result of Kushilevitz (1992), who characterised the two-argument functions which can be securely computed in the presence of an unbounded passive adversary. The result in the quantum case additionally relies on the impossibility results of D. Mayers (1997) and H.-K. Lo and H. F. Chau (1996), and on Kitaev's impossibility result for quantum coin flipping, published in A. Ambainis et al. (2004).
Encryption is a fundamental building block for computer and communications technologies. Existing encryption methods depend for their security on unproven assumptions. We propose a new model, the limited access model, enabling a simple and practical provably unbreakable encryption scheme. A voluntary network of tens of thousands of computers, each maintaining and updating random pages, acts as a set of page server nodes (PSNs). A sender S and a receiver R share a random key K. They use K to select the same PSNs and download the same random pages. These are combined in groups of, say, 30 pages to extract one-time pads common to S and R. Under reasonable assumptions about an adversary's inability to monitor all PSNs, and given easy ways for S and R to evade monitoring while downloading pages, hyper-encryption is clearly unbreakable. The system has been completely implemented.
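The pad-extraction step can be sketched as follows, under the simplifying assumption (not stated in the abstract) that a group of pages is combined by XOR, so the resulting pad stays secret as long as the adversary missed at least one page in the group. Page size and count are illustrative.

```python
import secrets

PAGE_SIZE = 64  # bytes per random page (illustrative)

def xor_pages(pages):
    # Fold a group of downloaded random pages into one one-time pad.
    pad = bytes(PAGE_SIZE)
    for page in pages:
        pad = bytes(x ^ y for x, y in zip(pad, page))
    return pad

# S and R download the same 30 pages, selected via their shared key K.
pages = [secrets.token_bytes(PAGE_SIZE) for _ in range(30)]
pad = xor_pages(pages)

message = b"meet at noon".ljust(PAGE_SIZE, b"\x00")
ciphertext = bytes(m ^ p for m, p in zip(message, pad))
assert bytes(c ^ p for c, p in zip(ciphertext, pad)) == message
```

The security argument then reduces to the adversary's limited access: monitoring fewer than all 30 source pages leaves the pad information-theoretically hidden.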
While searching documents on the Web, users often refine their query terms to generate new result lists and explore different pages; thus they end up with several result sets. Experience suggests that users compare these different sets, either implicitly or explicitly. Comparison between these sets (a) allows users to observe similarities and differences that are not apparent when viewing the sets separately, (b) reduces the cognitive effort of switching from one result set to another and (c) enables them to browse more effectively. In this paper we investigate comparison visualization and describe a prototype search-engine similarity tool (SES), which visualizes the textual difference of multiple Web searches using a combination of multiple views and visual bracketing.
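The underlying set comparison that such a tool must compute can be sketched as below; this is a generic partition into shared and exclusive results (the function name and the Jaccard score are illustrative additions, not features claimed for SES).

```python
def compare_result_sets(results_a, results_b):
    # Partition two result lists into the regions a bracketing
    # visualisation would render: common, only-A, only-B.
    a, b = set(results_a), set(results_b)
    return {
        "common": sorted(a & b),
        "only_a": sorted(a - b),
        "only_b": sorted(b - a),
        "jaccard": len(a & b) / len(a | b) if a | b else 1.0,
    }

r = compare_result_sets(
    ["url1", "url2", "url3"],
    ["url2", "url3", "url4"],
)
assert r["common"] == ["url2", "url3"]
assert abs(r["jaccard"] - 0.5) < 1e-9
```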
This paper demonstrates a software and hardware implementation for creating and visualising a three-dimensional voxel-based model of any arbitrary polynomial function. Preliminary experiments demonstrate its use end to end, from model creation within a cluster computing engine to a large real-time visualisation system driven by a small PC rendering farm. This includes the choice of data representation and of transmission across a university campus. A fully automated and on-demand model modification process is planned for the next phase of implementation, together with improvements in each of the stages of computation, translation, visualisation and human interaction.
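The core voxelisation step can be sketched as sampling the polynomial on a regular grid and marking cells where it is non-positive as "inside". This is a minimal serial sketch for intuition; the paper's system distributes this work across a compute cluster, and the grid size and sphere example are arbitrary choices.

```python
def voxelise(f, n=16, lo=-1.0, hi=1.0):
    # Sample an implicit polynomial f(x, y, z) on an n^3 grid;
    # a voxel (i, j, k) is inside where f <= 0 at its sample point.
    step = (hi - lo) / (n - 1)
    voxels = set()
    for i in range(n):
        for j in range(n):
            for k in range(n):
                x, y, z = lo + i * step, lo + j * step, lo + k * step
                if f(x, y, z) <= 0.0:
                    voxels.add((i, j, k))
    return voxels

# Unit-ball example: the polynomial x^2 + y^2 + z^2 - r^2 with r = 0.8
sphere = voxelise(lambda x, y, z: x*x + y*y + z*z - 0.64)
assert (8, 8, 8) in sphere      # near the grid centre: inside
assert (0, 0, 0) not in sphere  # grid corner: outside
```

Because each voxel is evaluated independently, the loop parallelises trivially, which is what makes the cluster-based model creation stage natural.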
Vascular surgery is a technically demanding surgical speciality, one component of which is the accurate placement of sutures through a diseased vessel wall. Minor errors can result in thrombosis and failure of the procedure. Developing the necessary skills takes many hours of practice, which, in the past, have been acquired at the operating table. Recreating a surgical environment using virtual tools presents a number of research challenges. Conventional collision detection methods fail for deformable bodies and do not provide a mechanism to scale the response over different regions at the same time. We have developed a threaded collision test allowing the mesh to be updated whilst guaranteeing a smooth force response for haptic devices. The finite element method (FEM) is adapted to allow multiple points of contact, and the validity of this model is discussed with respect to known tissue behaviour. Following a preliminary examination by clinicians, a novel scheme is introduced for simulating more realistic force responses.
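One standard way to keep the haptic force smooth while the deformable mesh is updated on another thread, sketched below, is to compute a penalty force from penetration depth and blend the rendered force toward each new target rather than jumping to it. This is a generic illustration of the smoothing requirement, not the paper's actual collision test; the stiffness, clamp, and blend factor are invented values.

```python
def contact_force(penetration_depth, stiffness=300.0, max_force=8.0):
    # Penalty-based contact: force grows linearly with penetration
    # into the vessel wall, clamped to the haptic device limit (N).
    if penetration_depth <= 0.0:
        return 0.0
    return min(stiffness * penetration_depth, max_force)

def smoothed_force(prev, target, alpha=0.2):
    # Blend toward the newest target so the 1 kHz haptic loop never
    # sees a discontinuity when the mesh thread publishes an update.
    return prev + alpha * (target - prev)

f = 0.0
for depth in (0.0, 0.005, 0.01, 0.01):  # metres of penetration per tick
    f = smoothed_force(f, contact_force(depth))
assert 0.0 < f < contact_force(0.01)
```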
This paper examines a virtual prototyping system for electronic devices which incorporates visualisation using a novel integrated development environment combining user interaction with photorealistic 2D and 3D models. Full system-level hardware simulation is also supported within this framework, which offers electronic simulation in a virtual environment. This helps to link product development specialists through a unified and coherent modelling environment. Virtual prototyping is a novel design methodology that aims to decrease time-to-market and increase product reliability, quality and fulfilment of user requirements. This paper uses the example of a remotely controlled domestic cooking system to illustrate this process.