In this paper we present a method for parameterized free space redistribution of a fragmented placement. The fragmentation problem arises in different contexts within physical design automation, including post-physical-synthesis filler cell insertion, incremental placement, timing optimization, and late-mode ECO fix-ups. To address this problem, we apply a post-placement parameterized defragmentation method. This method involves capturing a view of a given placement and modeling a dynamic programming problem to optimally maximize the amount of so-called useful free space as defined by a given set of parameters. The parameters act as constraints that preserve the row placement and order of the cells while minimizing the perturbation of the whole design, enabling successful timing and design closure. Experimental results demonstrate that the proposed technique achieves, on average, a 9.7% increase in the number of inserted filler cells and a 5.7% improvement in the success rate of incremental placement requests, with minimal or no impact on timing and wirelength. Moreover, when deployed in early-mode buffering for timing optimization, the technique yields a 3% reduction in the number of paths with negative slack.
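As an illustration of the kind of formulation this abstract describes, the sketch below is a toy dynamic program that re-spaces the cells of a single row (order fixed, displacement bounded by a parameter) to maximize the number of gaps wide enough to be useful. The function name, the parameters min_gap and max_disp, and the exact objective are assumptions for illustration, not the paper's actual model.

```python
import functools

def defragment_row(cells, row_width, min_gap=2, max_disp=3):
    """Toy DP: cells is a list of (orig_x, width) tuples in left-to-right
    order; positions are in placement sites.  Returns the maximum number of
    gaps of at least min_gap sites achievable while keeping cell order and
    limiting each cell's displacement to max_disp sites."""

    @functools.lru_cache(maxsize=None)
    def best_from(i, free_from):
        # Best count of useful gaps for cells[i:], given that all sites
        # before 'free_from' are already occupied.
        if i == len(cells):
            # Gap between the last cell and the row boundary.
            return 1 if row_width - free_from >= min_gap else 0
        orig_x, w = cells[i]
        lo = max(free_from, orig_x - max_disp)
        hi = min(row_width - w, orig_x + max_disp)
        best = None
        for pos in range(lo, hi + 1):
            tail = best_from(i + 1, pos + w)
            if tail is None:
                continue
            gain = 1 if pos - free_from >= min_gap else 0
            best = gain + tail if best is None else max(best, gain + tail)
        return best  # None means no legal re-spacing exists

    return best_from(0, 0)

# Example: three 2-site cells in a 12-site row -> 3 useful gaps.
print(defragment_row([(0, 2), (3, 2), (6, 2)], row_width=12))
```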
ISBN (print): 9781450317863
Motivation - To shed light on the conceptions of human network operators about autonomic functionalities and the determinants of trust regarding future autonomic networks. Research approach - An interview study with 33 human operators at the non-managerial level was performed. The interview consisted of 19 questions related to various aspects of autonomic functionalities. Findings/Design - Several components of trust and distrust in autonomic functionalities were found. Distrust based on assumed failure of the functionalities was the most prominent negative feature. The possibility to concentrate on more demanding tasks was the most surprising positive feature. Research limitations/Implications - Only 33 human operators from two telecom companies participated in the study, which limits the possible generalisation of the findings. Originality/Value - The research contributes to the scarce body of publications on human aspects in the telecommunication domain. Take-away message - Trust-related factors can be used for identifying and extracting general requirements for future tools from the user point of view. Copyright 2012 ACM.
As a result of supply voltage reduction and process variation effects, the error-free margin for dynamic voltage scaling has been drastically reduced. This paper presents an error-aware model for arithmetic and logic circuits that accurately and rapidly estimates the propagation delays of the output bits in a digital block operating under voltage scaling, in order to identify circuit-level failures (timing violations) within the block. These failure models are then used to examine how circuit-level failures affect system-level reliability. A case study of a CORDIC DSP unit employing the proposed model provides trade-offs between power, performance, and reliability.
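The abstract does not detail the model, but the general idea of flagging timing violations per output bit under voltage scaling can be sketched with the standard alpha-power-law delay dependence on supply voltage. Everything below (the scaling law as a stand-in, the bit-level nominal delays, and the threshold and alpha values) is an illustrative assumption, not the paper's model.

```python
def scaled_delay(nominal_delay, vdd, vdd_nom=1.0, vth=0.3, alpha=1.3):
    """Scale a nominal bit delay with supply voltage using the
    Sakurai-Newton alpha-power law: delay ~ Vdd / (Vdd - Vth)^alpha."""
    factor = (vdd / (vdd - vth) ** alpha) / (vdd_nom / (vdd_nom - vth) ** alpha)
    return nominal_delay * factor

def failing_bits(bit_delays_ns, vdd, clock_period_ns):
    """Return indices of output bits whose scaled propagation delay
    exceeds the clock period (i.e. circuit-level timing violations)."""
    return [i for i, d in enumerate(bit_delays_ns)
            if scaled_delay(d, vdd) > clock_period_ns]

# Example: LSB-to-MSB delays of a ripple-style adder at nominal voltage.
delays = [0.2 + 0.15 * i for i in range(8)]
print(failing_bits(delays, vdd=0.8, clock_period_ns=1.5))  # only the MSB fails
```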
A Monte Carlo-based approach capable of identifying the probability distributions that describe the delay of every sensitizable path, in a path-implicit manner, is proposed. It is shown experimentally that the statistical information for all paths is generated as fast as a traditional Monte Carlo simulation that identifies only the probability density function of the circuit delay.
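As a rough illustration of what "path-implicit" means here, the sketch below runs a plain Monte Carlo timing simulation on a small gate graph: each sample draws gate delays and propagates arrival times node by node, so delay statistics through each endpoint are captured without enumerating paths. The netlist, delay distributions, and sample count are illustrative assumptions, not the paper's benchmark.

```python
import random

# Toy combinational netlist: node -> (fanin nodes, mean delay, sigma).
GATES = {
    "g1": ([],           0.0, 0.0),   # primary input
    "g2": ([],           0.0, 0.0),   # primary input
    "g3": (["g1", "g2"], 1.0, 0.10),
    "g4": (["g1"],       0.8, 0.08),
    "g5": (["g3", "g4"], 1.2, 0.12),  # primary output
}

def sample_arrival_times():
    """One Monte Carlo sample: draw each gate delay and propagate the
    latest arrival time in topological order (path-implicit)."""
    at = {}
    for node, (fanins, mu, sigma) in GATES.items():  # insertion order is topological here
        delay = random.gauss(mu, sigma)
        at[node] = delay + max((at[f] for f in fanins), default=0.0)
    return at

samples = [sample_arrival_times()["g5"] for _ in range(10000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"output delay: mean={mean:.3f}, std={var ** 0.5:.3f}")
```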
Computation performed on stochastic bit streams is less efficient than computation based on a binary radix because of its long latency. However, for certain complex arithmetic operations, computation on stochastic bit streams can consume less energy and tolerate more soft errors. In addition, the latency issue can be mitigated by using a faster clock frequency or by combining the approach with parallel processing. To take advantage of this computing technique, previous work proposed a combinational-logic-based reconfigurable architecture to perform complex arithmetic operations on stochastic bit streams. In this paper, we enhance and extend this reconfigurable architecture using sequential logic. Compared to the previous approach, the proposed reconfigurable architecture takes less hardware area and consumes less energy, while achieving the same performance in terms of processing time and fault tolerance.
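For readers unfamiliar with the computing model, the following sketch shows the textbook example of stochastic computation: two values encoded as the fraction of 1s in random bit streams are multiplied by a single AND gate per bit. The stream length and helper names are illustrative; the reconfigurable architectures discussed in the abstract build on this same encoding.

```python
import random

def to_stream(p, n):
    """Encode a value p in [0, 1] as a stochastic bit stream of length n."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def from_stream(bits):
    """Decode a stream back to a value: the fraction of 1s."""
    return sum(bits) / len(bits)

n = 4096
a, b = to_stream(0.6, n), to_stream(0.5, n)
product = [x & y for x, y in zip(a, b)]   # one AND gate per bit pair
print(from_stream(product))               # approximately 0.6 * 0.5 = 0.3
```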
ISBN (print): 9781450317863
Semi-naturalistic research designs allow for studying behaviour in a realistic setting, achieving a fair degree of ecological validity without the disadvantages of purely naturalistic designs. A semi-naturalistic study sets boundaries for the behaviour under investigation, within which respondents still act freely. In order to allow for between-subjects comparisons, the raw data obtained must be structured, either through pre-structuring or through post-structuring. Motivation - There exists little methodological guidance in the field of documentation design. Qualitative and quantitative studies alike are carried out using an amalgam of methods that were developed for other disciplines. This paper contributes to awareness of the pitfalls (but also the benefits) of doing so. Research approach - This paper considers two semi-naturalistic studies into interaction with software and documentation from a methodological point of view. In the first, the data was collected in the respondents' workplace and then post-structured. In the second, the data was collected in a laboratory setting and pre-structured through the use of an observation tool. Findings/Design - Both methods are described in some detail, followed by a discussion of methodological issues discovered after the design had been executed. Finally, the relative advantages and disadvantages of the two approaches are highlighted. Take-away message - Documentation design can fruitfully combine methodological approaches originally developed for other disciplines, provided these are adapted for the purpose with care and discretion. Copyright 2012 ACM.
ISBN (print): 9783642332029
Data exchange is a field of database theory that deals with transferring data between differently structured databases, with motivation coming from industry [21,17]. The starting point of intensive investigation of the data exchange problem was given in [14], where it was defined as follows: given data structured under a source schema and a mapping specifying how it should be translated to a target schema, transform the source data into data structured under the target schema such that it accurately reflects the source data with respect to the mapping. This problem has been studied for different combinations of the languages used to specify the source and target schemas and the mappings [8]. Most of the results in the literature consider tuple-generating dependencies (tgds) as the language for specifying mappings. Tgds allow one to express containment of conjunctive queries, and have been widely employed in other areas of database theory. Furthermore, once a target instance is materialized, one might want to perform query answering over it.
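To make the notion of a tgd concrete, a standard illustrative source-to-target tuple-generating dependency (the relation names Emp and Dept are made up here, not taken from the cited papers) has the form below: whenever the source records that employee x works in department y, the target must contain a Dept tuple for y with some, possibly unknown, manager z.

```latex
\forall x\,\forall y\;\bigl(\mathrm{Emp}(x, y)\;\rightarrow\;\exists z\;\mathrm{Dept}(y, z)\bigr)
```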
Stochastic arithmetic provides several benefits over traditional computing methods, such as high fault tolerance, simple hardware implementation, and low hardware area. In order to increase the accuracy of error analysis and improve performance evaluation methods for stochastic computing systems, this work proposes a new variance transfer function for stochastic computing systems based on combinational logic. The transfer function is proven using a new mathematical method, hypergeometric decomposition, which makes stochastic computing theory more complete and reliable. Based on the variance transfer function, several variance-based measurements are developed to evaluate and compare different stochastic computing algorithms. Compared with the traditional bit-level simulation method, the variance measurements are shown to be less time-consuming, more comprehensive, and more effective for evaluating and understanding stochastic computing systems.
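The paper's variance transfer function itself is not reproduced here, but the comparison it draws against bit-level simulation can be illustrated on the simplest combinational element, an AND-gate multiplier: for independent Bernoulli input streams of length N, the decoded output has variance p1*p2*(1 - p1*p2)/N, which a bit-level simulation only recovers by averaging many runs. The closed-form expression and the simulation below are standard textbook facts, not the paper's derivation.

```python
import random

def simulate_and_variance(p1, p2, n, runs=2000):
    """Bit-level simulation: empirical variance of the decoded output of a
    stochastic AND multiplier over many independent runs."""
    outputs = []
    for _ in range(runs):
        ones = sum(1 for _ in range(n)
                   if random.random() < p1 and random.random() < p2)
        outputs.append(ones / n)
    mean = sum(outputs) / runs
    return sum((o - mean) ** 2 for o in outputs) / runs

p1, p2, n = 0.7, 0.4, 256
analytic = p1 * p2 * (1 - p1 * p2) / n   # Bernoulli variance of the estimate
print(f"analytic={analytic:.6f}  simulated={simulate_and_variance(p1, p2, n):.6f}")
```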
In modern VLSI designs, assertions play an important role in understanding design intent and ensuring the correctness of designs. In this paper, we consider generating assertions from simulation results. This assertion extraction is performed by examining whether a logical relation is satisfied among a set of signals. We propose to accelerate it by utilizing the highly parallelized computation offered by GPGPUs. Through experiments with industrial designs, our GPGPU implementation runs 30 times faster than a software implementation.
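The core check described here (does a candidate logical relation hold over all observed simulation vectors?) is naturally data-parallel; the NumPy sketch below stands in for the GPGPU kernel the paper implements, with made-up signal names and a made-up candidate implication.

```python
import numpy as np

# Simulated waveforms: one 0/1 value per signal per cycle (illustrative data).
rng = np.random.default_rng(0)
trace = {
    "req":   rng.integers(0, 2, size=100_000, dtype=np.uint8),
    "grant": rng.integers(0, 2, size=100_000, dtype=np.uint8),
}
trace["busy"] = trace["req"] & trace["grant"]   # one relation true by construction

def holds(antecedent, consequent):
    """Check the candidate assertion 'antecedent -> consequent' on every
    cycle at once; on a GPU this would be an element-wise kernel plus a
    reduction."""
    return bool(np.all(~antecedent.astype(bool) | consequent.astype(bool)))

print(holds(trace["req"] & trace["grant"], trace["busy"]))   # True
print(holds(trace["req"], trace["grant"]))                   # almost surely False
```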
The hot carrier injection (HCI) effect is one of the major reliability concerns in VLSI circuits. This paper presents a scalable reliability simulation flow, including a logic cell characterization method and an efficient full-chip simulation method, to analyze HCI-induced transistor aging with fast run time and high accuracy. The transistor-level HCI effect is modeled based on the Reaction-Diffusion (R-D) framework. The gate-level HCI impact characterization method combines HSpice simulation and piecewise linear curve fitting. The proposed characterization method reveals that, depending on the logic cell structure, the HCI effect on some transistors is much more significant than on others. Additionally, during circuit simulation, pertinent transitions are identified and all cells in the circuit are classified into two groups: critical and non-critical. The proposed method reduces simulation time while maintaining high accuracy by applying fine-granularity simulation time steps to the critical cells and coarse-granularity steps to the non-critical cells.
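The R-D framework generally predicts a power-law growth of the threshold-voltage shift with stress time, dVth ≈ A·t^n, and the abstract mentions fitting characterization data; the sketch below fits such a power law to sample degradation points with a log-log linear regression. The sample data, coefficient, and exponent are illustrative assumptions, not characterization results from the paper.

```python
import numpy as np

# Illustrative (made-up) stress times in seconds and measured delta-Vth in mV.
t = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
dvth = np.array([2.1, 5.8, 16.0, 44.0, 121.0])

# Fit dVth = A * t**n  <=>  log(dVth) = log(A) + n * log(t)
n, logA = np.polyfit(np.log(t), np.log(dvth), 1)
A = np.exp(logA)
print(f"fitted power law: dVth ~ {A:.3g} * t^{n:.2f}")

# Extrapolate aging at a 10-year lifetime (~3.15e8 s) under the fitted toy model.
print(f"projected dVth after 10 years: {A * (3.15e8) ** n:.1f} mV")
```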