ISBN (print): 3540626166
The study of the computational power of randomized computations is one of the central tasks of complexity theory. The main aim of this paper is to compare the power of Las Vegas computation with that of deterministic and nondeterministic computation, respectively. An at most polynomial gap has been established for the combinational complexity of circuits and for the communication complexity of two-party protocols. We investigate the power of Las Vegas computation for the complexity measures of one-way communication, finite automata, and polynomial-time relativized Turing machine computation. (i) For the one-way communication complexity of two-party protocols we show that Las Vegas communication can save at most one half of the deterministic one-way communication complexity, and we present a language for which this gap is tight. (ii) For the size (i.e., the number of states) of finite automata we show that the size of Las Vegas finite automata recognizing a language L is at least the square root of the size of the minimal deterministic finite automaton recognizing L. Using a specific language we verify the optimality of this lower bound. Note that this result establishes for the first time an at most polynomial gap between Las Vegas and determinism for a uniform computing model. (iii) For relativized polynomial computations we show that Las Vegas can be even more powerful than nondeterminism with a polynomial restriction on the number of nondeterministic guesses. On the other hand, superlogarithmically many advice bits in nondeterministic computations can be more powerful than Las Vegas (even Monte Carlo) computations in a relativized world.
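In symbols (the shorthand $\mathrm{D}^{1}$, $\mathrm{LV}^{1}$, $s_{\mathrm{D}}$, and $s_{\mathrm{LV}}$ below is ours, not the paper's), the two quantitative bounds stated in (i) and (ii) read:

\[
\mathrm{LV}^{1}(L) \;\ge\; \tfrac{1}{2}\,\mathrm{D}^{1}(L),
\qquad
s_{\mathrm{LV}}(L) \;\ge\; \sqrt{s_{\mathrm{D}}(L)},
\]

where $\mathrm{D}^{1}(L)$ and $\mathrm{LV}^{1}(L)$ denote the deterministic and Las Vegas one-way communication complexity of $L$, and $s_{\mathrm{D}}(L)$, $s_{\mathrm{LV}}(L)$ denote the number of states of the minimal deterministic and Las Vegas finite automata recognizing $L$. Per the abstract, both bounds are tight for suitable witness languages.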
ISBN (print): 3540006230
This paper investigates the relation between immunity and hardness in exponential time. The idea that these concepts are related originated in computability theory, where it led to Post's program. It has been continued successfully in complexity theory [10,14,20]. We study three notions of immunity for exponential time. An infinite set $A$ is called
- EXP-immune, if it does not contain an infinite subset in EXP;
- EXP-hyperimmune, if for every infinite sparse set $B \in \mathrm{EXP}$ and every polynomial $p$ there is an $x \in B$ such that $\{\, y \in B : p^{-1}(|x|) \le |y| \le p(|x|) \,\}$ is disjoint from $A$;
- EXP-avoiding, if the intersection $A \cap B$ is finite for every sparse set $B \in \mathrm{EXP}$.
EXP-avoiding sets are always EXP-hyperimmune, and EXP-hyperimmune sets are always EXP-immune, but not vice versa. We analyze with respect to which polynomial-time reducibilities these sets can be hard for EXP. EXP-immune sets cannot be conjunctively hard for EXP, although they can be hard for EXP with respect to disjunctive and parity reducibility. EXP-hyperimmune sets cannot be hard for EXP with respect to any of these three reducibilities. There is a relativized world in which there is an EXP-avoiding set that is hard with respect to positive truth-table reducibility. Furthermore, in every relativized world there is some EXP-avoiding set that is Turing-hard for EXP.
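As a compact summary (notation ours, not the paper's), the abstract's hierarchy of the three immunity notions is

\[
\text{EXP-avoiding} \;\Longrightarrow\; \text{EXP-hyperimmune} \;\Longrightarrow\; \text{EXP-immune},
\]

with neither implication reversible. The hardness results then separate these notions along the standard polynomial-time reducibilities, conjunctive ($\le^{p}_{c}$), disjunctive ($\le^{p}_{d}$), parity ($\le^{p}_{\oplus}$), positive truth-table ($\le^{p}_{ptt}$), and Turing ($\le^{p}_{T}$), where the reducibility symbols are the usual ones and are assumed here rather than taken from the abstract.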
Deadlock prevention for routing messages has a central role in communication networks, since it directly influences the correctness of parallel and distributed systems. In this paper, we extend some of the computation...