The concept of the algorithmic complexity of crystals is developed for a particular class of minerals and inorganic materials based on orthogonal networks, which are defined as networks derived from the primitive cubic net (pcu) by the removal of some vertices and/or edges. Orthogonal networks are an important class of networks that dominate topologies of inorganic oxysalts, framework silicates and aluminosilicate minerals, zeolites and coordination polymers. The growth of periodic orthogonal networks may be modelled using structural automata, which are finite automata with states corresponding to vertex configurations and transition symbols corresponding to the edges linking the respective vertices. The model proposed describes possible relations between theoretical crystallography and theoretical computer science through the theory of networks and the theory of deterministic finite automata.
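To make the structural-automaton idea concrete, here is a minimal sketch of a deterministic finite automaton whose states stand for vertex configurations and whose transition symbols stand for edge directions in the pcu net. The two configurations and the transition table below are invented for illustration; they are not taken from the paper.

# Toy structural automaton: states are hypothetical vertex configurations
# ('6c' = fully 6-coordinated vertex, '5c' = vertex with one edge removed),
# transition symbols are edge directions of the primitive cubic net.
TRANSITIONS = {
    ("6c", "+x"): "6c",
    ("6c", "+y"): "5c",
    ("5c", "+x"): "6c",
}

def run(automaton, start, word):
    """Follow a growth path (a word over the edge alphabet); return the
    final vertex configuration, or None if the path is not realizable."""
    state = start
    for symbol in word:
        state = automaton.get((state, symbol))
        if state is None:
            return None
    return state

print(run(TRANSITIONS, "6c", ["+x", "+y", "+x"]))  # -> '6c'
print(run(TRANSITIONS, "6c", ["+y", "+y"]))        # -> None (not realizable)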
We propose a new concept for the organization of computing such that the number of sequential concurrent clock operations (or, equivalently, the number of vector operations) is independent of the problem size n. Here, the architecture of the computing environment is adapted to the problem to be solved, and the computation is performed without data exchange between the elementary processors, whose number depends on n. We describe an algorithm implementing this idea for the multiextremal optimization problem (the search for the maximum of n given numbers) and an algorithm for solving the traveling salesman problem as examples [1-4].
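The abstract does not spell out the algorithm, but one standard way to make the step count independent of n is to spend n^2 elementary comparison processors on a single round of pairwise comparisons. The NumPy sketch below stands in for such hardware and finds the maximum of n numbers in a fixed number of whole-array (vector) operations; it is our illustration, not the authors' construction.

import numpy as np

def vector_max(a):
    """Maximum of n numbers in a fixed number of vector operations:
    one n x n comparison, one row reduction, one selection. The n^2
    pairwise comparators model the 'elementary processors' whose count
    grows with n while the number of steps does not."""
    a = np.asarray(a)
    dominated = (a[:, None] < a[None, :]).any(axis=1)  # is a[i] < some a[j]?
    return a[~dominated][0]  # any non-dominated element is the maximum

print(vector_max([3, 9, 4, 9, 1]))  # -> 9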
This paper uses techniques from formal language theory to describe the linear spatial patterns in urban freeway traffic flows, in order to understand and analyze the "hidden order" in such high-volume systems. A method for measuring randomness based on algorithmic entropy is introduced and developed. These concepts are operationalized using Pincus' approximate entropy formulation in an appropriate illustration. These measures, which may appear counterintuitive, are believed to offer robust and rigorous guidance for enhancing the overall understanding of efficiency in urban freeway traffic systems. Their use should be facilitated by information generated by real-time intelligent transportation systems (ITS) technologies and may prove helpful in real-time traffic flow management.
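Pincus' approximate entropy has a standard formulation, ApEn(m, r) = Phi^m(r) - Phi^{m+1}(r), where Phi^m(r) averages the log-fraction of length-m templates lying within tolerance r of each other. A direct, unoptimized rendering of that formula (the function and parameter names are ours):

import numpy as np

def approx_entropy(series, m=2, r=None):
    """Pincus' approximate entropy ApEn(m, r) of a 1-D series.
    m: template length; r: tolerance, conventionally a fraction of the
    series' standard deviation."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()

    def phi(mm):
        # All length-mm templates x[i:i+mm], compared under the max norm.
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        dist = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        c = (dist <= r).mean(axis=1)  # fraction of templates within r
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print(approx_entropy(np.sin(0.5 * np.arange(300))))  # low: regular signal
print(approx_entropy(rng.standard_normal(300)))      # higher: irregular

On this reading a periodic series scores low and an irregular one high, which is what makes ApEn usable as a randomness proxy for traffic patterns.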
We discuss implementations of Adaptive Resonance Theory (ART) on a serial machine. The standard formulation of ART, which was inspired by recurrent brain structures, corresponds to a recursive algorithm. This induces an algorithmic complexity of order O(N^2) + O(MN) in both the worst and average cases, where N is the number of categories and M is the input dimension. It is possible, however, to formulate ART as a non-recursive algorithm such that the complexity is only of order O(MN).
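A sketch of how the recursion can be avoided, assuming binary ART-1 with the usual choice parameter beta and vigilance rho (our illustration, not the paper's code): instead of the classic choose/test/reset loop, which in the worst case re-evaluates categories O(N^2) times, compute the choice value and the vigilance match for all N categories in a single O(MN) pass and select the best category that already satisfies vigilance.

import numpy as np

def art1_select(I, W, rho=0.7, beta=0.5):
    """Single-pass (non-recursive) ART-1 category selection in O(M*N).
    I: binary input of length M; W: N x M binary weight matrix.
    Uses the common ART-1 choice function T_j = |I AND w_j| / (beta + |w_j|)
    and vigilance test |I AND w_j| / |I| >= rho."""
    overlap = (W * I).sum(axis=1)              # |I AND w_j| for every j
    choice = overlap / (beta + W.sum(axis=1))  # choice values T_j
    match = overlap / I.sum()                  # vigilance ratios
    eligible = match >= rho
    if not eligible.any():
        return None                            # would create a new category
    return int(np.argmax(np.where(eligible, choice, -np.inf)))

I = np.array([1, 0, 1, 1])
W = np.array([[1, 0, 1, 0], [1, 0, 1, 1]])
print(art1_select(I, W))  # -> 1

The result matches the recursive search: the winner is the highest-choice category among those passing vigilance, found here without any reset cycle.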
We analyze the notion of quantum capacity from the perspective of algorithmic (descriptive) complexity. To this end, we resort to the concept of semi-computability in order to describe quantum states and quantum channel maps. We introduce algorithmic entropies (such as the algorithmic quantum coherent information) and derive relevant properties for them. We then show that the quantum capacity based on these semi-computable concepts equals the entropy rate of the algorithmic coherent information, which in turn equals the standard quantum capacity. Thanks to this, we finally prove that the quantum capacity of a given semi-computable channel is limit computable.
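For reference, the standard (non-algorithmic) quantities the abstract builds on are the coherent information and the regularized quantum capacity; the paper's algorithmic versions replace the von Neumann entropies below with semi-computable algorithmic entropies:

\[
I_c(\rho, \mathcal{N}) = S\left(\mathcal{N}(\rho)\right) - S\left((\mathcal{I} \otimes \mathcal{N})\,|\psi_\rho\rangle\langle\psi_\rho|\right),
\qquad
Q(\mathcal{N}) = \lim_{n \to \infty} \frac{1}{n} \max_{\rho} I_c\left(\rho, \mathcal{N}^{\otimes n}\right),
\]

where \(|\psi_\rho\rangle\) is a purification of \(\rho\) and \(S\) is the von Neumann entropy.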
ISBN (print): 9781509038497
Algorithmic complexity (AC) vulnerabilities can be exploited to cause a denial-of-service attack. Specifically, an adversary can design an input to trigger excessive (space/time) resource consumption. It is not possible to build a fully automated tool to detect AC vulnerabilities. Since this is an open-ended problem, human-in-the-loop exploration is required to find the program loops that could have AC vulnerabilities. Ascertaining whether an arbitrary loop has an AC vulnerability is itself difficult; in general, it is equivalent to the halting problem. This paper is about a pragmatic engineering approach to detecting AC vulnerabilities. It presents a statically-informed dynamic (SID) analysis and two tools that provide critical capabilities for detecting AC vulnerabilities. The first is a static analysis tool for exploring the software to find loops that are potential candidates for AC vulnerabilities. The second is a dynamic analysis tool that can try many different inputs to evaluate the selected loops for excessive resource consumption. The two tools are built and integrated using the interactive software analysis, transformation, and visualization capabilities provided by the Atlas platform. The paper describes two use cases for the tools: one to detect AC vulnerabilities in Java bytecode, and another for students in an undergraduate algorithms class to perform experiments to learn different aspects of algorithmic complexity.
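The paper's tools target Java bytecode through Atlas, but the shape of an AC vulnerability is easy to show in a few lines of Python. The classic example below (a regular expression with catastrophic backtracking, i.e. ReDoS) is our toy illustration of what the dynamic step hunts for: inputs whose processing cost grows explosively while their size grows only linearly.

import re
import time

# Toy algorithmic-complexity vulnerability: the nested quantifier in
# '(a+)+$' forces exponential backtracking when the input almost
# matches but fails at the last character.
PATTERN = re.compile(r'^(a+)+$')

def probe(payload):
    """Time one match attempt against the vulnerable pattern."""
    t0 = time.perf_counter()
    PATTERN.match(payload)
    return time.perf_counter() - t0

for n in (18, 20, 22, 24):
    # 'a'*n + '!' never matches, so the engine explores ~2^n split points.
    print(n, f"{probe('a' * n + '!'):.3f}s")  # time roughly doubles with n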
ISBN (print): 9781450362597
Theoretical computing is a difficult area to teach in university courses, for several reasons. Many students who begin computing subjects have little mathematical or theoretical background. It is important that students acquire an intuitive grasp of these theoretical concepts before they finish their secondary education. In this work we describe how to bring computability and complexity questions into secondary education to address classic issues raised in the curriculum about the limits of mathematics and its formal systems and, subsequently, their algorithmic and algebraic complexity. We report a complete educational experience for enhancing the algorithmic curriculum of pre-university computing and mathematics courses, aimed at determining which computability and algorithmic complexity questions may be introduced into secondary education, how to teach these concepts, and how to train teachers to do so. The good experimental results obtained are compared with the results in standard high school courses in which these questions about theoretical computing are not addressed. The conclusions are presented, along with the pros and cons of the educational experience carried out, so that they can be taken into account in the future design of the curriculum of an official computing subject on a global scale, or included in the curriculum of pre-university courses.
ISBN (print): 9781467394888
For pure three-qubit states the classification of entanglement is both non-trivial and well understood. In this work, we study the quantum algorithmic complexity introduced in [1] for three-qubit pure states belonging to the most general class of entanglement. Contrary to expectations, we find that the degree of entanglement of states in this class, quantified by the 3-tangle measure, does not correlate with the quantum algorithmic complexity, defined as the length of the shortest circuit needed to prepare the state. For a given entangled state, the evaluation of its quantum complexity is done via a pseudo-random evolutionary algorithm. This algorithm allows us not only to determine the complexity of a quantum circuit in terms of the number of required quantum gates, but also to estimate another type of complexity, related to the time required to obtain the correct answer.
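The 3-tangle referred to here is the Coffman-Kundu-Wootters residual entanglement, computable directly from the eight amplitudes of a pure three-qubit state. A small sketch (our code, not the paper's evolutionary algorithm) that reproduces the textbook values, 1 for the GHZ state and 0 for the W state:

import numpy as np

def three_tangle(amps):
    """Coffman-Kundu-Wootters 3-tangle of a pure 3-qubit state.
    amps: length-8 amplitude vector with a[i,j,k] = amps[4*i + 2*j + k].
    Returns tau = 4 * |d1 - 2*d2 + 4*d3| (Cayley hyperdeterminant form)."""
    a = np.asarray(amps, dtype=complex).reshape(2, 2, 2)
    d1 = (a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
          + a[0,1,0]**2 * a[1,0,1]**2 + a[1,0,0]**2 * a[0,1,1]**2)
    d2 = (a[0,0,0]*a[1,1,1]*a[0,1,1]*a[1,0,0]
          + a[0,0,0]*a[1,1,1]*a[1,0,1]*a[0,1,0]
          + a[0,0,0]*a[1,1,1]*a[1,1,0]*a[0,0,1]
          + a[0,1,1]*a[1,0,0]*a[1,0,1]*a[0,1,0]
          + a[0,1,1]*a[1,0,0]*a[1,1,0]*a[0,0,1]
          + a[1,0,1]*a[0,1,0]*a[1,1,0]*a[0,0,1])
    d3 = (a[0,0,0]*a[1,1,0]*a[1,0,1]*a[0,1,1]
          + a[1,1,1]*a[0,0,1]*a[0,1,0]*a[1,0,0])
    return float(4 * abs(d1 - 2*d2 + 4*d3))

ghz = np.zeros(8); ghz[0] = ghz[7] = 1/np.sqrt(2)
w = np.zeros(8); w[1] = w[2] = w[4] = 1/np.sqrt(3)
print(three_tangle(ghz))  # -> 1.0 (maximally 3-tangled)
print(three_tangle(w))    # -> 0.0 (W class)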
ISBN (print): 9789897583797
Algorithm complexity is an important concept in computer science, concerned with the efficiency of algorithms. Understanding and improving the performance of a software system is a major concern throughout the lifetime of the system, especially in the maintenance and evolution phase. Identifying certain performance-related issues before they actually affect the deployed system is desirable, and possible if developers know the algorithmic complexity of the methods in the software system. In many software projects, information related to algorithmic complexity is missing, so it is hard for a developer to reason about the performance of the system for different input data sizes. The goal of this paper is to propose a novel method for automatically determining algorithmic complexity based on runtime measurements. We evaluate the proposed approach on synthetic data and on actual runtime measurements of several algorithms in order to assess its potential and weaknesses.
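The paper does not fix a particular fitting procedure, but a minimal version of the idea is to regress the measured runtimes against a few candidate complexity classes and keep the best fit. In the sketch below, the candidate set, function names, and synthetic measurements are all our own assumptions:

import numpy as np

# Candidate complexity classes to fit measured runtimes against.
CANDIDATES = {
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * np.log(n),
    "O(n^2)":     lambda n: n ** 2,
}

def classify(ns, times):
    """Pick the candidate g minimizing the relative residual of the
    one-parameter model t ~ c * g(n), with c fitted by least squares."""
    ns = np.asarray(ns, dtype=float)
    times = np.asarray(times, dtype=float)
    best = None
    for label, g in CANDIDATES.items():
        f = g(ns)
        c = (f @ times) / (f @ f)  # least-squares scale factor
        resid = np.linalg.norm(times - c * f) / np.linalg.norm(times)
        if best is None or resid < best[1]:
            best = (label, resid)
    return best

# Hypothetical measurements of a quadratic routine, with 5% noise:
ns = np.array([100, 200, 400, 800, 1600])
rng = np.random.default_rng(1)
times = 1e-7 * ns**2 * (1 + 0.05 * rng.standard_normal(5))
print(classify(ns, times))  # -> ('O(n^2)', small residual)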
A theorem of Brudno says that the entropy production of classical ergodic information sources equals the algorithmic complexity per symbol of almost every sequence emitted by such sources. The recent advances in the t...