ISBN (print): 9781467362740
We investigate, through simulations, the performance of multi-channel digital back-propagation-based nonlinearity compensation (NLC) for digital Nyquist WDM transmission over uncompensated standard SMF links, and review recently developed approaches to reduce NLC algorithm complexity.
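As background for the compensation scheme discussed above, the sketch below shows single-channel digital back-propagation via the split-step Fourier method. The sign convention, the lossless-fiber simplification, and all parameter names are this sketch's assumptions, not the paper's settings.

    import numpy as np

    def back_propagate(rx, fs, span_len, beta2, gamma, steps):
        """Run the received signal through an inverse fiber model (lossless sketch)."""
        w = 2 * np.pi * np.fft.fftfreq(rx.size, d=1.0 / fs)   # angular frequency grid
        dz = span_len / steps
        inv_disp = np.exp(1j * (beta2 / 2.0) * w**2 * dz)     # dispersion with negated sign
        x = rx.astype(complex)
        for _ in range(steps):
            x = np.fft.ifft(np.fft.fft(x) * inv_disp)         # undo dispersion (linear half)
            x *= np.exp(-1j * gamma * np.abs(x)**2 * dz)      # undo Kerr nonlinearity
        return x

Multi-channel DBP as studied in the paper jointly back-propagates several WDM channels; the per-step structure is the same.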
ISBN (print): 0769500781
In this paper, a new and efficient BIST methodology and BIST hardware insertion algorithms are presented for RTL data paths obtained from high-level synthesis. The methodology is based on concurrent testing of modules with identical physical information by sharing the test pattern generators in a partial-intrusion BIST environment. Furthermore, to reduce the number of signature analysis registers and the test application time, modules of the same type are grouped into test compatibility classes and n-input k-bit comparators are used to check the results. The test application time is computed using an incremental test scheduling approach. An existing test scheduling algorithm is modified to obtain an efficient trade-off between algorithm complexity and testable design space exploration. A cost function based on both test application time and area overhead is defined, and a tabu search-based heuristic capable of exploring the solution space very rapidly is presented. To reduce the computational time, testable design space exploration is carried out in two phases: a test application time reduction phase and a BIST area reduction phase. Experimental results are included confirming the efficiency of the proposed methodology.
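The tabu-search component lends itself to a compact illustration. Below is a generic skeleton of this kind of heuristic, minimizing a cost over candidate BIST configurations; the neighborhood function, tenure, and cost weights are hypothetical placeholders, not the paper's algorithm.

    def tabu_search(initial, neighbors, cost, iters=500, tenure=15):
        """Generic tabu search: move to the best non-tabu neighbor each step."""
        best = current = initial
        tabu = []                                    # recently visited solutions
        for _ in range(iters):
            candidates = [s for s in neighbors(current) if s not in tabu]
            if not candidates:
                break
            current = min(candidates, key=cost)      # best admissible move
            tabu.append(current)
            if len(tabu) > tenure:
                tabu.pop(0)                          # expire the oldest entry
            if cost(current) < cost(best):
                best = current
        return best

    # Per the abstract, the cost would weigh test time against BIST area, e.g.:
    # cost(c) = w_time * test_time(c) + w_area * area_overhead(c)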
ISBN (print): 9781450312059
Traditional profilers identify where a program spends most of its resources. They do not provide information about why the program spends those resources or about how resource consumption would change for different program inputs. In this paper we introduce the idea of algorithmic profiling. While a traditional profiler determines a set of measured cost values, an algorithmic profiler determines a cost function. It does that by automatically determining the "inputs" of a program, by measuring the program's "cost" for any given input, and by inferring an empirical cost function.
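A toy version of this idea: measure a cost over inputs of growing size and fit candidate cost models to the measurements. The candidate model set and the stand-in workload below are assumptions made for illustration.

    import numpy as np

    sizes = np.array([2**k for k in range(6, 14)], dtype=float)
    costs = sizes * np.log2(sizes)          # stand-in for measured cost values

    models = {
        "n": sizes,
        "n log n": sizes * np.log2(sizes),
        "n^2": sizes**2,
    }

    def residual(basis):
        """Least-squares fit cost ~ a * basis; return the squared error."""
        a = np.linalg.lstsq(basis[:, None], costs, rcond=None)[0][0]
        return float(np.sum((costs - a * basis)**2))

    print("inferred cost function:",
          min(models, key=lambda k: residual(models[k])))   # -> "n log n"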
In this paper we study the performance of rate-1/2 convolutional encoders with adaptive states developed in chaotic and hyperchaotic regions. These states are generated by varying the control parameters in a feedback-controlled system. Several sets of closed-loop simulations are performed to demonstrate the benefit of an information-based chaos system. In particular, it is demonstrated that two varieties of information-based systems provide improved performance over all the encoder choices when hyperchaotic states are utilized. Special attention was paid to the algorithmic complexity of the systems for an entire class of rate-1/2 encoders. The decoder is able to recover the encrypted data and to reasonably estimate the bit error rate for different signal strengths over a noisy AWGN channel. This indicates that the encoder can update the information map in real time to compensate for changing data for both chaotic and hyperchaotic states, and it is evidence that occasional changes in the data stream can be handled by the decoder in a real-time application. Numerical evidence indicates that the algorithmic complexity associated with hyperchaotically encrypted and convolutionally encoded data provides better security along with an increase in the error-correcting capacity of the decoder. (C) 2010 Published by Elsevier B.V.
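For reference, a minimal rate-1/2 convolutional encoder using the textbook constraint-length-3, (7,5)-octal generators; this is a standard assumed example, and the paper's adaptive chaos-driven state selection is not modeled here.

    def conv_encode(bits, generators=(0b111, 0b101)):
        """Rate-1/2 encoder: two parity bits out per input bit."""
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & 0b111               # 3-bit shift register
            out.extend(bin(state & g).count("1") & 1 for g in generators)
        return out

    print(conv_encode([1, 0, 1, 1]))   # 8 output bits for 4 input bits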
ISBN (print): 9798400700392
Grigoriev and Podolskii (2018) have established a tropical analog of the effective Nullstellensatz, showing that a system of tropical polynomial equations is solvable if and only if a linearized system obtained from a truncated Macaulay matrix is solvable. They provided an upper bound of the minimal admissible truncation degree, as a function of the degrees of the tropical polynomials. We establish a tropical Nullstellensatz adapted to sparse tropical polynomial systems. Our approach is inspired by a construction of Canny-Emiris (1993), refined by Sturmfels (1994). This leads to an improved bound of the truncation degree, which coincides with the classical Macaulay degree in the case of n + 1 equations in n unknowns. We also establish a tropical Positivstellensatz, allowing one to decide the inclusion of tropical basic semialgebraic sets. This allows one to reduce decision problems for tropical semialgebraic sets to the solution of systems of tropical linear equalities and inequalities. The latter systems are known to be reducible to mean payoff games, which can be solved in practice, in a scalable way, by value iteration methods. We illustrate this approach by examples.
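To make the tropical setting concrete: in the min-plus semiring, a point is a root of a tropical polynomial exactly when the minimum is attained by at least two monomials. The check below, with illustrative monomial data, sketches this definition; it is not the Macaulay-matrix machinery of the paper.

    def is_tropical_root(monomials, x, tol=1e-9):
        """monomials: list of (coeff, exponents); min-plus evaluation at point x."""
        values = [c + sum(a * xi for a, xi in zip(exps, x))
                  for c, exps in monomials]
        m = min(values)
        return sum(abs(v - m) < tol for v in values) >= 2   # min attained twice

    # p(x, y) = min(2x, 1 + x + y, 3): at (1, 0) the first two terms tie at 2.
    p = [(0.0, (2, 0)), (1.0, (1, 1)), (3.0, (0, 0))]
    print(is_tropical_root(p, (1.0, 0.0)))   # True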
ISBN (print): 9781665460873
In smart manufacturing, data management systems are built with a multi-layer architecture in which the most significant layers are the edge and the cloud. The edge layer supports data analysis that genuinely demands low latency, while cloud platforms store vast amounts of data and perform extensive computations such as machine learning and big data analysis. This type of data management system has a limitation rooted in the fact that all data must be transferred from the equipment layer to the edge layer in order to perform any data analysis; worse, data transfer adds delays to computation processes in smart manufacturing. We investigate an offloading strategy that shifts a selection of computation tasks to the equipment layer. Our computation offloading mechanism selects smart manufacturing tasks that are not only lightweight but also have no need to store data at the edge/cloud end. In our empirical study, we demonstrate that the edge layer can judiciously offload computing tasks to the equipment layer, which curtails computing latency and reduces the amount of transferred data during the smart manufacturing process. Our experimental results confirm that our offloading strategy enables real-time data analysis at the equipment level, where an array of smart devices can speed up the data analysis process in semiconductor manufacturing.
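A hypothetical sketch of such a placement rule: a task stays on the equipment layer only when it is lightweight and needs no edge/cloud storage. The task fields and the compute budget are illustrative assumptions, not the paper's mechanism.

    from dataclasses import dataclass

    @dataclass
    class Task:
        flops: float            # estimated compute cost of the analysis task
        needs_storage: bool     # must results persist at the edge/cloud?

    EQUIPMENT_FLOPS_BUDGET = 1e8   # illustrative capability of a smart device

    def place(task: Task) -> str:
        if task.flops <= EQUIPMENT_FLOPS_BUDGET and not task.needs_storage:
            return "equipment"     # offload: avoids data-transfer latency
        return "edge"              # heavyweight or stateful work stays at the edge

    print(place(Task(flops=5e7, needs_storage=False)))   # -> equipment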
ISBN (print): 9783031645280; 9783031645297
We develop an analogue of eigenvalue methods to construct solutions of systems of tropical polynomial equalities and inequalities. We show that solutions can be obtained by solving parametric mean payoff games arising from appropriate linearizations of the systems using tropical Macaulay matrices. We implemented specific algorithms adapted to the large-scale parametric games that arise in this way, and we present numerical experiments.
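The mean payoff games produced by such linearizations can be approached by value iteration; the toy game and iteration count below are assumptions for illustration, not the paper's implementation.

    import numpy as np

    def mean_payoff(weights, min_nodes, iters=2000):
        """weights[i][j]: payoff of edge i->j, or None; Min plays at min_nodes."""
        n = len(weights)
        v = np.zeros(n)
        for _ in range(iters):
            nxt = np.empty(n)
            for i in range(n):
                vals = [w + v[j] for j, w in enumerate(weights[i]) if w is not None]
                nxt[i] = min(vals) if i in min_nodes else max(vals)
            v = nxt
        return v / iters        # per-move mean payoff estimate

    w = [[None, 1, 0], [2, None, -1], [0, 3, None]]
    print(mean_payoff(w, min_nodes={0}))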
Belief in conspiracy theories has often been associated with a biased perception of randomness, akin to a nothing-happens-by-accident heuristic. Indeed, a low prior for randomness (i.e., believing that randomness is a priori unlikely) could plausibly explain the tendency to believe that a planned deception lies behind many events, as well as the tendency to perceive meaningful information in scattered and irrelevant details; both of these tendencies are traits diagnostic of conspiracist ideation. In three studies, we investigated this hypothesis and failed to find the predicted association between low prior for randomness and conspiracist ideation, even when randomness was explicitly opposed to malevolent human intervention. Conspiracy believers' and nonbelievers' perceptions of randomness were not only indistinguishable from each other but also accurate compared with the normative view arising from the algorithmic information framework. Thus, the motto nothing happens by accident, taken at face value, does not explain belief in conspiracy theories.
Whether and how well people can behave randomly is of interest in many areas of psychological research. The ability to generate randomness is often investigated using random number generation (RNG) tasks, in which participants are asked to generate a sequence of numbers that is as random as possible. However, there is no consensus on how best to quantify the randomness of responses in human-generated sequences. Traditionally, psychologists have used measures of randomness that directly assess specific features of human behavior in RNG tasks, such as the tendency to avoid repetition or to systematically generate numbers that have not been generated in the recent choice history, a behavior known as cycling. Other disciplines have proposed measures of randomness that are based on a more rigorous mathematical foundation and are less restricted to specific features of randomness, such as algorithmic complexity. More recently, variants of these measures have been proposed to assess systematic patterns in short sequences. We report the first large-scale integrative study to compare measures of specific aspects of randomness with entropy-derived measures based on information theory and measures based on algorithmic complexity. We compare the ability of the different measures to discriminate between human-generated sequences and truly random sequences based on atmospheric noise, and provide a systematic analysis of how the usefulness of randomness measures is affected by sequence length. We conclude with recommendations that can guide the selection of appropriate measures of randomness in psychological research.
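The two families of measures can be contrasted in a few lines: Shannon block entropy against a compression-based proxy for algorithmic complexity. Using zlib as the compressor is this sketch's choice, not the study's protocol.

    import math, zlib
    from collections import Counter

    def block_entropy(seq, k=1):
        """Shannon entropy (bits) over the sequence's length-k blocks."""
        blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
        n = len(blocks)
        return -sum(c / n * math.log2(c / n) for c in Counter(blocks).values())

    def compression_ratio(seq):
        """Compressed size / raw size; lower means more regular."""
        raw = bytes(seq)
        return len(zlib.compress(raw, 9)) / len(raw)

    cycling = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] * 10   # a human-like cycling pattern
    print(block_entropy(cycling, k=2), compression_ratio(cycling))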
ISBN (print): 0780370805
Propositional knowledge base revision is intractable in the general case. This paper proposes an algorithm for propositional knowledge base revision. It is shown that the computational complexity is governed by the relationship between the number of formulas and the number of variables. A sufficient condition guaranteeing that the algorithm runs in polynomial time is provided and proved.
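A brute-force sketch of the revision task on tiny knowledge bases, showing why cost scales with both the number of formulas (subsets tried) and the number of variables (assignments checked); encoding formulas as Python predicates is an assumption for illustration.

    from itertools import combinations, product

    def consistent(formulas, vars_):
        """Exhaustive satisfiability check: exponential in len(vars_)."""
        return any(all(f(dict(zip(vars_, vals))) for f in formulas)
                   for vals in product([False, True], repeat=len(vars_)))

    def revise(kb, new, vars_):
        """Keep as many KB formulas as possible while staying consistent with new."""
        for keep in range(len(kb), -1, -1):          # exponential in len(kb)
            for subset in combinations(kb, keep):
                if consistent(list(subset) + [new], vars_):
                    return list(subset) + [new]

    kb = [lambda a: a["p"], lambda a: a["p"] or a["q"]]
    new = lambda a: not a["p"]
    print(len(revise(kb, new, ["p", "q"])))   # keeps the compatible formula + new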