We give a definition for the Kolmogorov complexity of a pure quantum state. In classical information theory, the algorithmic complexity of a string is a measure of the information needed by a universal machine to reproduce the string itself. We define the complexity of a quantum state by means of the classical description complexity of an (abstract) experimental procedure that allows us to prepare the state with a given fidelity. We argue that our definition satisfies the intuitive idea of complexity as a measure of "how difficult" it is to prepare a state. We apply this definition to give an upper bound on the algorithmic complexity of a number of known states. Furthermore, we establish a connection between the entanglement of a quantum state and its algorithmic complexity.
Computer-based shape grammar implementations aim to support creative design exploration by automating rule-application. This paper reviews existing shape grammar implementations in terms of their algorithmic complexity, extends the definition of shape grammars with sets of transformations for rule application, categorizes (parametric and non-parametric) sets of transformations, and analyses these categories in terms of the resulting algorithmic complexity. Specifically, it describes how different sets of transformations admit different numbers of targets (i.e., potential inputs) for rule application. In the non-parametric case, this number is quadratic or cubic, while in the parametric case, it can be non-polynomial, depending on the size of the target shape. The analysis thus yields lower bounds for the algorithmic complexity of shape grammar implementations that hold independently of the employed algorithm or data structure. Based on these bounds, we propose novel matching algorithms for non-parametric and parametric shape grammar implementation and analyze their complexity. The results provide guidance for future, general-purpose shape grammar implementations for design exploration.
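The quadratic and cubic target counts in the non-parametric case can be made concrete with a small sketch. Assuming (this is an illustration, not the paper's algorithm) that each candidate transformation is determined by an ordered pair or triple of a target shape's registration points, the number of candidates to test grows as O(n^2) or O(n^3) in the number of points n:

```python
from itertools import permutations

def candidate_targets(points, arity):
    """Enumerate ordered tuples of registration points; each tuple
    could determine one candidate transformation for non-parametric
    rule matching (pairs for similarities, triples for affinities)."""
    return list(permutations(points, arity))

points = list(range(10))                # 10 hypothetical registration points
pairs = candidate_targets(points, 2)    # O(n^2) candidates
triples = candidate_targets(points, 3)  # O(n^3) candidates
print(len(pairs), len(triples))         # 90 720
```

The point of the lower-bound argument is that any matching algorithm must in the worst case consider candidates at this rate, regardless of data structure.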
We analyze the notion of quantum capacity from the perspective of algorithmic (descriptive) complexity. To this end, we resort to the concept of semi-computability in order to describe quantum states and quantum channel maps. We introduce algorithmic entropies (like algorithmic quantum coherent information) and derive relevant properties for them. Then we show that the quantum capacity based on these semi-computable concepts equals the entropy rate of algorithmic coherent information, which in turn equals the standard quantum capacity. From this, we finally prove that the quantum capacity of a given semi-computable channel is limit computable.
Idiopathic rapid eye movement sleep behavior disorder (RBD) is a serious risk factor for neurodegenerative processes such as Parkinson's disease (PD). We investigate the use of metrics derived from the algorithmic complexity of the EEG for its prognosis. We analyzed resting-state EEG data collected from 114 idiopathic RBD patients and 83 healthy controls in a longitudinal study forming a cohort in which several RBD patients developed PD or dementia with Lewy bodies. Multichannel data from approximately 3-minute recordings was converted to spectrograms and their algorithmic complexity estimated using Lempel-Ziv-Welch compression. Complexity measures and entropy rate displayed statistically significant differences between groups. Results are compared to those using the ratio of slow-to-fast frequency power, which they are seen to complement by displaying increased sensitivity even when using only a few EEG channels. Poor prognosis in RBD appears to be associated with decreased complexity of EEG spectrograms, stemming in part from frequency power imbalances and cross-frequency amplitude coupling. Algorithmic complexity metrics provide a robust, powerful and complementary way to quantify the dynamics of EEG signals in RBD, with links to emerging theories of brain function stemming from algorithmic information theory.
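The core idea of Lempel-Ziv-style complexity estimation can be sketched in a few lines. The following toy function (an illustration, not the study's pipeline, which works on spectrograms rather than raw symbol strings) counts the phrases LZW emits for a symbol sequence; structured sequences yield fewer phrases than noisy ones:

```python
import random

def lzw_complexity(s):
    """Count the phrases emitted by LZW compression of string s.
    The phrase count serves as a crude algorithmic-complexity estimate:
    regular inputs reuse long dictionary entries and emit few phrases."""
    dictionary = {chr(i): i for i in range(256)}
    w, phrases = "", 0
    for c in s:
        if w + c in dictionary:
            w += c
        else:
            phrases += 1
            dictionary[w + c] = len(dictionary)
            w = c
    if w:
        phrases += 1
    return phrases

regular = "01" * 64                 # highly structured binary sequence
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(128))
print(lzw_complexity(regular), lzw_complexity(noisy))
```

The regular sequence yields far fewer phrases than the random one of the same length, which is the contrast the study's complexity metrics exploit.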
Properties of the idempotently convex hull of a two-point set in a free semimodule over the idempotent semiring R_{max,min} and in a free semimodule over a linearly ordered idempotent semifield are studied. Construction algorithms for this hull are proposed.
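For intuition, an idempotent segment between two points can be enumerated numerically. The sketch below (an illustration under the standard definition of idempotent convexity, using the max-min semiring on [0,1] as a stand-in for R_{max,min}, not the paper's construction algorithm) forms combinations lambda (x) a (+) mu (x) b with (+) = max, (x) = min, and max(lambda, mu) equal to the unit 1:

```python
def maxmin_segment(a, b, grid=11):
    """Sample the idempotent segment between points a and b in the
    max-min semiring on [0,1]: combinations max(min(lam, a_i), min(mu, b_i))
    with max(lam, mu) = 1, the multiplicative unit of (max, min)."""
    def combine(lam, mu):
        return tuple(max(min(lam, ai), min(mu, bi)) for ai, bi in zip(a, b))
    pts = set()
    for k in range(grid):
        t = k / (grid - 1)
        pts.add(combine(t, 1.0))  # mu fixed at the unit
        pts.add(combine(1.0, t))  # lambda fixed at the unit
    return sorted(pts)

seg = maxmin_segment((0.2, 0.9), (0.7, 0.4))
print(seg[0], seg[-1])
```

Both endpoints belong to the sampled segment (take (lambda, mu) = (1, 0) or (0, 1)), and the intermediate points trace the characteristic staircase shape of idempotent hulls.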
ISBN (print): 9781450364638
Previously, a lot of research has been done on scheduling a finite set of mixed-criticality jobs with two levels of criticality on a single processor, which is also the subject of this work. It has been claimed that testing the correctness of solutions for this scheduling problem can be done in polynomial time. In this paper, we give a counterexample to one of the lemmas used in that proof, reopening the question of whether the scheduling problem is in class NP. Taking into account our counterexample, the authors who initially proved that correctness testing can be done in polynomial time published a fix to their proof. In the past, we proved that a previously existing correctness test is applicable to a quite general class of policies. Building on these results, in this work we derive, for essentially the same class of policies, another correctness test, which transforms the policy into a new policy having a set of time-triggered tables, one for each criticality level. We show that the two policies are equivalent, in the sense that if one successfully schedules a job instance then so does the other. Thus the new transformed policy can be used for testing correctness of the original policy. We show that this correctness test has a lower algorithmic complexity than the existing test, because the testing is done on only two static tables.
This paper analyzes the algorithmic complexity (also known as Kolmogorov complexity or descriptive complexity) of the lemma corpus included in the Wortfamilienwörterbuch der deutschen Gegenwartssprache (WfWG; Augst 2009) as a function of its macrostructural arrangement. The results show that, compared to alphabetical order, the WfWG word-family arrangement produces an algorithmically more compressible, and therefore less complex, version of the lemma corpus. This observation points to a higher degree of learnability and cognitive accessibility of the lemma corpus arranged in word families.
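The compressibility comparison behind this result can be sketched with an off-the-shelf compressor. The lemma list below is a tiny invented toy, not the actual WfWG data, and on a list this small the ordering effect need not appear; the sketch only shows the measurement procedure of comparing compressed sizes of two arrangements of the same corpus:

```python
import zlib

def compressed_size(lemmas):
    """Compressed byte length of a newline-joined lemma list,
    used as a proxy for the list's algorithmic complexity."""
    return len(zlib.compress("\n".join(lemmas).encode("utf-8"), 9))

# Hypothetical toy lemma corpus grouped by word family:
families = ["fahr", "fahren", "Fahrer", "Fahrt", "Abfahrt",
            "geh", "gehen", "Gehweg", "ausgehen", "Ausgang"]
alphabetical = sorted(families)
print(compressed_size(families), compressed_size(alphabetical))
```

On a corpus of WfWG scale, the paper's finding is that the family-grouped ordering, which places morphologically related (and thus textually similar) lemmas adjacently, compresses better than the alphabetical one.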
As human randomness production has come to be more closely studied and used to assess executive functions (especially inhibition), many normative measures for assessing the degree to which a sequence is random-like have been suggested. However, each of these measures focuses on one feature of randomness, leading researchers to have to use multiple measures. Although algorithmic complexity has been suggested as a means for overcoming this inconvenience, it has never been used, because standard Kolmogorov complexity is inapplicable to short strings (e.g., of length l <= 50), due to both computational and theoretical limitations. Here, we describe a novel technique (the coding theorem method) based on the calculation of a universal distribution, which yields an objective and universal measure of algorithmic complexity for short strings that approximates Kolmogorov-Chaitin complexity.
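One computational limitation for short strings is easy to demonstrate: off-the-shelf compressors carry fixed header overhead, so the "compressed" size of a short string exceeds the string itself and cannot serve as a complexity estimate. A minimal check using Python's zlib:

```python
import zlib

# For short inputs, stream headers and checksums dominate, so the
# compressed output is longer than the input regardless of structure.
for s in [b"01", b"0101010101", b"0110100110"]:
    print(len(s), len(zlib.compress(s, 9)))
```

Every compressed size printed exceeds the original length, which is why compression-based estimators break down exactly in the short-string regime the coding theorem method targets.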
Kolmogorov-Chaitin complexity has long been believed to be impossible to approximate when it comes to short sequences (e.g., of length 5-50). However, with the newly developed coding theorem method, the complexity of strings of length 2-11 can now be numerically estimated. We present the theoretical basis of algorithmic complexity for short strings (ACSS) and describe an R package providing functions based on ACSS that will cover psychologists' needs and improve upon previous methods in three ways: (1) ACSS is now available not only for binary strings, but for strings based on up to 9 different symbols, (2) ACSS no longer requires time-consuming computing, and (3) a new approach based on ACSS gives access to an estimation of the complexity of strings of any length. Finally, three illustrative examples show how these tools can be applied to psychology.
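The estimate behind ACSS rests on the coding theorem of algorithmic information theory, which ties the complexity of a string to its probability under a universal distribution:

```latex
K(s) = -\log_2 m(s) + O(1), \qquad
m(s) = \sum_{p \,:\, U(p) = s} 2^{-|p|}
```

where $U$ is a prefix-free universal machine. The coding theorem method approximates $m(s)$ by the empirical output frequency $D(s)$ of exhaustively enumerated small Turing machines, yielding the numerical estimate $K(s) \approx -\log_2 D(s)$ for the short strings where compression fails.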
The concept of the algorithmic complexity of crystals is developed for a particular class of minerals and inorganic materials based on orthogonal networks, which are defined as networks derived from the primitive cubic net (pcu) by the removal of some vertices and/or edges. Orthogonal networks are an important class of networks that dominate topologies of inorganic oxysalts, framework silicates and aluminosilicate minerals, zeolites and coordination polymers. The growth of periodic orthogonal networks may be modelled using structural automata, which are finite automata with states corresponding to vertex configurations and transition symbols corresponding to the edges linking the respective vertices. The model proposed describes possible relations between theoretical crystallography and theoretical computer science through the theory of networks and the theory of deterministic finite automata.
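A structural automaton in this sense is an ordinary deterministic finite automaton. The toy below (invented configurations and edge labels, purely for illustration; it is not derived from any actual crystal structure) shows the mechanics: states stand for vertex configurations, transition symbols for edge types, and accepted words describe admissible growth sequences:

```python
# Hypothetical structural automaton: states "A" and "B" stand for
# vertex configurations; symbols "x" and "y" for edge types linking them.
transitions = {
    ("A", "x"): "B",  # an x-edge from configuration A leads to B
    ("B", "x"): "B",  # B can extend along x-edges indefinitely
    ("B", "y"): "A",  # a y-edge returns to configuration A
}

def accepts(start, word):
    """Run the deterministic automaton; a word is admissible iff
    every transition along it is defined."""
    state = start
    for edge in word:
        key = (state, edge)
        if key not in transitions:
            return False
        state = transitions[key]
    return True

print(accepts("A", "xxy"))  # True: A -x-> B -x-> B -y-> A
print(accepts("A", "y"))    # False: no y-edge leaves configuration A
```

For a periodic orthogonal network, the analogous automaton would have one state per distinct vertex configuration in the pcu-derived net, so its size gives a complexity measure for the crystal's growth description.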