Details
ISBN (e-book): 9783319527895
ISBN (print): 9783319527871
This book provides a practically oriented introduction to high-level programming language implementation. It demystifies what goes on within a compiler and stimulates the reader's interest in compiler design, an essential aspect of computer science. Programming language analysis and translation techniques are used in many software application areas. A Practical Approach to Compiler Construction covers the fundamental principles of the subject in an accessible way. It presents the necessary background theory and shows how it can be applied to implement complete compilers. A step-by-step approach, based on a standard compiler structure, is adopted, presenting up-to-date techniques and examples. Strategies and designs are described in detail to guide the reader in implementing a translator for a programming language. A simple high-level language, loosely based on C, is used to illustrate aspects of the compilation process. Code examples in C are included, together with discussion and illustration of how this code can be extended to cover the compilation of more complex languages. Examples are also given of the use of the flex and bison compiler construction tools. Lexical and syntax analysis is covered in detail, together with comprehensive coverage of semantic analysis, intermediate representations, optimisation and code generation. Introductory material on parallelisation is also included. Intended for personal study as well as for use in introductory undergraduate and postgraduate courses in compiler design, the author assumes that readers have a reasonable competence in programming in any high-level language.
Details
ISBN (print): 9781509043149
The H.264 Advanced Video Coding (H.264/AVC) standard is considered the most commonly used format for video compression. Although it supports a very broad range of applications covering all forms of digital compressed video, the confidentiality of the encoded content needs enhancement. The present work proposes a new selective encryption scheme with multiple security levels: depending on the nature of the encrypted coefficients, five different cryptographic scenarios are proposed. They provide effective confidentiality and meet real-time processing requirements. Hardware/software co-simulation and experimental results demonstrate the efficiency of the proposed crypto-system, making it suitable for real-time video applications and resource-limited systems.
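The core idea of selective encryption, scrambling only a small but perceptually important subset of the syntax elements so the bitstream stays format-compliant, can be illustrated with a minimal sketch. This is a generic illustration, not any of the paper's five scenarios: it flips the signs of nonzero transform coefficients using a hash-based keystream (a real deployment would use a proper stream cipher such as AES-CTR or ChaCha20):

```python
import hashlib

def keystream(key, nbytes):
    """Hash-counter keystream, for illustration only; use a real
    stream cipher (AES-CTR, ChaCha20) in practice."""
    out = b''
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return out[:nbytes]

def encrypt_signs(coeffs, key):
    """Selectively encrypt a coefficient block: flip the sign of each
    nonzero coefficient according to one keystream bit, leaving zeros
    and magnitudes untouched so the coded structure is preserved.
    The operation is an involution, so decryption is the same call."""
    nz = [i for i, c in enumerate(coeffs) if c != 0]
    ks = keystream(key, (len(nz) + 7) // 8)
    out = list(coeffs)
    for j, i in enumerate(nz):
        if (ks[j // 8] >> (j % 8)) & 1:
            out[i] = -out[i]
    return out
```

Because only sign bits change, the encrypted block still parses as valid coefficient data, which is what keeps such schemes compatible with real-time decoding pipelines.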
Details
The access control problem in a hierarchy can be solved by using a hierarchical key assignment scheme, where each class is assigned an encryption key and some private information. A formal security analysis for hierarchical key assignment schemes has been traditionally considered in two different settings, i.e., the unconditionally secure and the computationally secure setting, and with respect to two different notions: security against key recovery (KR-security) and security with respect to key indistinguishability (KI-security), with the latter notion being cryptographically stronger. Recently, Freire, Paterson and Poettering proposed strong key indistinguishability (SKI-security) as a new security notion in the computationally secure setting, arguing that SKI-security is strictly stronger than KI-security in such a setting. In this paper we consider the unconditionally secure setting for hierarchical key assignment schemes. In such a setting the security of the schemes is not based on specific unproven computational assumptions, i.e., it relies on the theoretical impossibility of breaking them, despite the computational power of an adversary coalition. We prove that, in this setting, SKI-security is not stronger than KI-security, i.e., the two notions are fully equivalent from an information-theoretic point of view.
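The basic mechanics of a hierarchical key assignment scheme can be sketched with a toy top-down construction: each class's key is derived by hashing its parent's key together with the class name, so an ancestor can recompute every descendant's key but not the reverse. This is a computational, hash-based illustration of the access-control idea only, not one of the unconditionally secure schemes the paper analyses:

```python
import hashlib

def derive_keys(root_key, hierarchy, top):
    """Toy top-down key assignment for a tree hierarchy.
    `hierarchy` maps a class name to its list of child class names;
    `top` is the root class holding `root_key`. Every class's key is
    sha256(parent_key || class_name), so possession of any class's
    key suffices to derive all keys below it."""
    keys = {top: root_key}
    stack = [top]
    while stack:
        node = stack.pop()
        for child in hierarchy.get(node, []):
            keys[child] = hashlib.sha256(keys[node] + child.encode()).digest()
            stack.append(child)
    return keys
```

The defining property is visible in use: re-running the derivation from an intermediate class reproduces exactly the keys of its own subtree and nothing above it.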
Details
ISBN (e-book): 9789812877871
ISBN (print): 9789812877864; 9789812877871
This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years now, since neither attack has yet been successfully implemented in practice, they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution, but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two independent precise faults in a single pairing computation, it has not been taken seriously either. Based on these two attacks, the book demonstrates that the assessment of physical attack complexity is error-prone, and as such cryptography should not rely on it. Cryptographic technologies have to be protected against all physical attacks, whether they have already been successfully implemented or not. The development of countermeasures does not require the successful execution of an attack but can already be carried out as soon as the principle of a side channel or a fault attack is sufficiently understood.
Details
ISBN (print): 9781479962884
Classical Huffman codes have very good compression performance over traditional systems. Yet more efficient encoding is possible by applying techniques that treat the two binary bits differently, considering requirements of storage space, energy consumption, speed of execution and so on. Future transmission systems are likely to be more efficient in many respects; such systems will consume fewer resources to transmit or store one of the two binary bits. Hence, an unequal bit cost necessitates a different approach to producing an optimal encoding scheme. This work proposes an algorithm that accounts for the unequal bit-cost contribution to a message. Our experiments show that the proposed algorithm reduces overall communication cost and improves the compression ratio considerably in comparison to classical Huffman codes. This unequal-bit-cost technique produces a variant of Huffman code that reduces the total cost of the compressed message.
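One simple greedy heuristic in this spirit (a sketch, not necessarily the paper's algorithm) builds a classical Huffman tree and then, at every internal node, assigns the cheaper of the two bits to the heavier subtree, so the bit that costs less to transmit appears more often in the encoded stream. The cost values below are hypothetical:

```python
import heapq

def huffman_tree(freqs):
    """Build a classical Huffman tree. Each node is a (weight, payload)
    pair; a leaf's payload is its symbol, an internal node's payload is
    a (left_node, right_node) pair."""
    heap = [(w, i, s) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    tick = len(heap)                     # tiebreaker for equal weights
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, tick, ((w1, a), (w2, b))))
        tick += 1
    weight, _, payload = heap[0]
    return (weight, payload)

def cost_aware_codes(node, bit_cost, prefix="", table=None):
    """Walk the tree, giving the cheaper bit to the heavier subtree.
    Assumes at least two symbols. `bit_cost` maps '0'/'1' to a cost."""
    if table is None:
        table = {}
    weight, payload = node
    if isinstance(payload, str):         # leaf: record the codeword
        table[payload] = prefix
        return table
    left, right = payload
    heavy, light = (left, right) if left[0] >= right[0] else (right, left)
    cheap, dear = sorted(bit_cost, key=bit_cost.get)
    cost_aware_codes(heavy, bit_cost, prefix + cheap, table)
    cost_aware_codes(light, bit_cost, prefix + dear, table)
    return table

def total_cost(freqs, table, bit_cost):
    """Total transmission cost of the encoded message."""
    return sum(w * sum(bit_cost[b] for b in table[s])
               for s, w in freqs.items())
```

For frequencies {a: 5, b: 2, c: 1} and assumed costs {'0': 1.0, '1': 1.6}, the heaviest symbol receives the single cheap bit, which yields a lower total cost than the reverse branch assignment while keeping the code prefix-free.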
Details
ISBN (e-book): 9783642319716
ISBN (print): 9783642319709
Since the mid-1990s, data hiding has been proposed as an enabling technology for securing multimedia communication, and is now used in various applications including broadcast monitoring, movie fingerprinting, steganography, video indexing and retrieval, and image authentication. Data hiding and cryptographic techniques are often combined to complement each other, thus triggering the development of a new research field of multimedia security. In addition, two related disciplines, steganalysis and data forensics, are increasingly attracting researchers and becoming another new research field of multimedia security. This journal, LNCS Transactions on Data Hiding and Multimedia Security, aims to be a forum for all researchers in these emerging fields, publishing both original and archival research results. This special issue contains five selected papers that were presented at the Workshop on Pattern Recognition for IT Security, held in Darmstadt, Germany, in September 2010, in conjunction with the 32nd Annual Symposium of the German Association for Pattern Recognition, DAGM 2010. It demonstrates the broad range of security-related topics that utilize graphical data. The contributions explore the security and reliability of biometric data, the power of machine learning methods to differentiate forged images from originals, the effectiveness of modern watermark embedding schemes, and the use of information fusion in steganalysis.
Details
The move toward automatic data integration from autonomous and heterogeneous sources is viewed as a transition from a closed to an open system, which is in essence an adaptive information processing system. Data definition languages from various computing eras, spanning almost 50 years to date, are examined to assess whether they have moved from the closed-systems to the open-systems paradigm. The study proves that contemporary data definition languages are indistinguishable from older ones using measurements of Variety, Tension and Entropy, three characteristics of complex adaptive systems (CAS). The conclusion is that even contemporary data definition languages designed for such integration exhibit closed-systems characteristics alongside merely aspirational open-systems features. Plenty of good will is insufficient to make them more suitable for automatic data integration than their oldest predecessors. A previous report and these new findings set the stage for the development and proposal of a mathematically sound data definition language based on CAS, potentially making it better suited for automatic data integration from autonomous heterogeneous sources.
Details
ISBN (e-book): 9781441993830
ISBN (print): 9781441993823; 9781489996978
Identity-Based Encryption (IBE) is a type of public key encryption that has been intensely researched in the past decade. Identity-Based Encryption summarizes the available research on IBE and the main ideas that would enable users to pursue further work in this area. The book also covers brief background on elliptic curves and pairings, security against chosen-ciphertext attacks, standards, and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security, will find Identity-Based Encryption a useful book. Practitioners and engineers who work with real-world IBE schemes and need a proper understanding of the basic IBE techniques will also find this book a valuable asset.
Details
We examine data definition languages (DDLs) from various computing eras spanning almost 50 years to date. We prove that contemporary DDLs are indistinguishable from older ones using the Zipf distribution of words, the Zipf distribution of meanings, and information theory. None addresses the Law of Requisite Variety, which is necessary for enabling automatic data integration from autonomous heterogeneous data sources and for the realization of the Semantic Web. The growth of the entire computing industry is hampered by the lack of progress in the development of DDLs suitable for these two goals. Our findings set the stage for the future development of a mathematically sound DDL better suited for the aforementioned purposes.
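The rank-frequency measurement such a study relies on can be sketched as follows: count token frequencies, sort them in descending order, and fit a line to the log-log rank-frequency curve; a slope near -1 is the classic Zipfian signature. This sketches the measurement technique only, not the paper's corpora or exact statistics:

```python
import math
from collections import Counter

def zipf_slope(tokens):
    """Least-squares slope of log(frequency) against log(rank) for a
    token stream; a slope near -1 indicates a Zipfian distribution."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

Applied to the keywords of a DDL corpus, comparing this slope (and the corresponding curve for meanings) across language generations is one way to make "indistinguishable from older ones" quantitative.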
The LZ-index is a theoretical proposal of a lightweight data structure for text indexing, based on the Ziv-Lempel trie. If a text of u characters over an alphabet of size σ is compressible to n symbols using the LZ78...
Details
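The LZ78 parsing that the LZ-index trie is built from can be sketched as follows: the text is split greedily into phrases, each extending a previously seen phrase by one character, and each phrase becomes a node of the trie. A minimal sketch of the parse itself:

```python
def lz78_parse(text):
    """Greedy LZ78 parse. Returns a list of (phrase_index, char)
    pairs, where phrase_index refers to an earlier phrase (0 is the
    empty phrase) and char is the extending character. The number of
    pairs is the n that LZ-index space bounds are stated in terms of."""
    dictionary = {'': 0}
    phrases = []
    cur = ''
    for ch in text:
        if cur + ch in dictionary:
            cur += ch                     # keep extending a known phrase
        else:
            phrases.append((dictionary[cur], ch))
            dictionary[cur + ch] = len(dictionary)
            cur = ''
    if cur:
        # trailing input matched an existing phrase; emit it anyway
        phrases.append((dictionary[cur[:-1]], cur[-1]))
    return phrases
```

On "ababab" the parse is a, b, ab, ab: four phrases for six characters, and it is this phrase count, rather than the raw text length, that governs the size of the Ziv-Lempel trie underlying the LZ-index.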