The high computational cost of solving large engineering optimization problems motivates the design of parallel optimization algorithms. Population-based optimization algorithms provide parallel capabilities that can be exploited by implementing them directly in hardware. This paper presents a hardware implementation of particle swarm optimization algorithms using efficient floating-point arithmetic that performs the computations with high precision. All the architectures are parameterizable by bit-width, allowing the designer to choose a suitable format according to the requirements of the optimization problem. Synthesis and simulation results demonstrate that the proposed architecture achieves satisfactory results, obtaining better performance in terms of elapsed time than conventional software implementations.
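As a point of reference for the hardware architecture, the particle swarm update equations can be sketched in software. The sketch below is a minimal, illustrative Python implementation of standard PSO; the parameter names, coefficient values, and the sphere objective are our own choices, not taken from the paper:

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimization (software reference model).

    f: objective to minimize; dim: problem dimensionality.
    A hardware implementation maps these same velocity/position update
    equations onto floating-point arithmetic units.
    """
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + cognitive (personal) + social (global) terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the new position to the search bounds.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function in 3 dimensions.
random.seed(0)
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

In a hardware realization, the per-particle inner loop above is what gets replicated and pipelined; the bit-width parameterization mentioned in the abstract corresponds to choosing the floating-point format used for `pos`, `vel`, and the objective evaluation.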
A data mining framework has been proposed to estimate intracranial pressure (ICP) non-invasively in our previous work. In that approach, the feature vector extracted from arterial blood pressure (ABP) and flow velocity (FV) is translated to the estimated errors by the mapping function for each entry in the database. In this paper, three different mapping-function solutions, linear least squares (LLS), truncated singular value decomposition (TSVD), and standard Tikhonov regularization (STR), are systematically tested to compare the possible effects of different solutions on non-invasive ICP estimation. The comparison demonstrates that the selection of the mapping-function solution does influence the estimation. Among the three tested solutions, TSVD and STR show better ICP estimation performance, with smaller ICP errors, than LLS.
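The three mapping-function solutions compared above can be illustrated on a generic ill-conditioned least-squares system. The sketch below is not the paper's ICP pipeline; the toy Vandermonde matrix, noise level, and regularization parameters are hypothetical, chosen only to show how TSVD and Tikhonov regularization differ from plain LLS:

```python
import numpy as np

def lls(A, b):
    # Ordinary linear least squares: minimizes ||A x - b||_2.
    return np.linalg.lstsq(A, b, rcond=None)[0]

def tsvd(A, b, k):
    # Truncated SVD: keep only the k largest singular values,
    # discarding the small ones that amplify noise.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

def tikhonov(A, b, lam):
    # Standard Tikhonov regularization: minimizes
    # ||A x - b||_2^2 + lam^2 ||x||_2^2, smoothly damping the
    # contribution of small singular values.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s ** 2 + lam ** 2)
    return Vt.T @ (filt * (U.T @ b))

# Ill-conditioned toy system with a noisy right-hand side.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 20), 6, increasing=True)
x_true = np.ones(6)
b = A @ x_true + 1e-3 * rng.standard_normal(20)

x_lls = lls(A, b)
x_tsvd = tsvd(A, b, k=4)
x_tik = tikhonov(A, b, lam=1e-2)
```

On noisy, ill-conditioned systems like this one, LLS fits the noise through the small singular values, while TSVD and Tikhonov trade a little bias for much lower variance, which mirrors the abstract's finding that TSVD and STR yield smaller estimation errors than LLS.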
Changes in ICP waveform morphology are associated with different patient states, such as hypertension, hydrocephalus, and traumatic brain injury. The morphological clustering and analysis of ICP pulse (MOCAIP) approach was recently developed to extract ICP morphological features; it uses hierarchical clustering to extract the dominant pulse. In this paper, we evaluate the feasibility of using principal component analysis (PCA) and independent component analysis (ICA) to extract the dominant pulse. The comparative study among the clustering-, PCA-, and ICA-based approaches shows that the PCA approach may be an alternative to the clustering approach for extracting the dominant pulse in a faster fashion when the dataset is large.
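As an illustration of the PCA-based alternative, the sketch below extracts a dominant pulse from beat-aligned waveforms via the top principal components. This is not the MOCAIP implementation; the synthetic three-peak template, the noise model, and the choice of k are all our own assumptions:

```python
import numpy as np

def dominant_pulse_pca(pulses, k=3):
    """Extract a dominant pulse from beat-aligned pulses via PCA.

    pulses: (n_beats, n_samples) array, one aligned beat per row.
    Each beat is projected onto the top-k principal components; the
    beat closest to the centroid in that low-dimensional space is
    taken as the most typical one and returned in denoised form.
    """
    mean = pulses.mean(axis=0)
    centered = pulses - mean
    # Rows of Vt are the principal directions of the beat ensemble.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    pcs = Vt[:k]
    scores = centered @ pcs.T            # (n_beats, k) coordinates
    centroid = scores.mean(axis=0)       # zero for centered data
    idx = int(np.argmin(np.linalg.norm(scores - centroid, axis=1)))
    # Reconstruct the most typical beat from the top-k components only.
    return mean + scores[idx] @ pcs

# Hypothetical data: 50 noisy, amplitude-varying copies of a
# three-peak pulse template (a stand-in for aligned ICP beats).
t = np.linspace(0.0, 1.0, 200)
template = (np.exp(-((t - 0.20) / 0.05) ** 2)
            + 0.8 * np.exp(-((t - 0.45) / 0.06) ** 2)
            + 0.6 * np.exp(-((t - 0.70) / 0.07) ** 2))
rng = np.random.default_rng(1)
amps = 1.0 + 0.2 * rng.standard_normal(50)
beats = amps[:, None] * template + 0.05 * rng.standard_normal((50, t.size))
dom = dominant_pulse_pca(beats)
```

A single SVD replaces the pairwise-distance computation that hierarchical clustering needs, which is one plausible reason the PCA route scales better to large beat collections.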
There is a growing body of experimental evidence suggesting that Ca2+ signaling in ventricular myocytes is characterized by a high gradient near the cell membrane and a more uniform Ca2+ distribution in the cell interior [1]--[7]. An important reason for this phenomenon might be that in these cells the t-tubular system forms a network of extracellular space extending deep into the cell interior. This allows the electrical signal, which propagates rapidly along the cell membrane, to reach the vicinity of the sarcoplasmic reticulum (SR), where the intracellular Ca2+ required for myofilament activation is stored [1], [8]--[11]. Early studies of cardiac muscle showed that the t-tubules are found at intervals of about 2 μm along the longitudinal cell axis, in close proximity to the Z-disks of the sarcomeres [12]. Subsequent studies have demonstrated that the t-tubular system also has longitudinal extensions [9]--[11], [13].
ISBN (print): 9780769531403
In this paper we present the architecture for the Personal Autonomic Desktop Manager, a self-managing application designed to act on behalf of the user in several respects: protection, healing, optimization, and configuration. The overall goal of this research is to improve the correlation of the autonomic self-* properties and, in doing so, to enhance the overall self-management capacity of the desktop (autonomicity). We introduce the Circulatory Computing (CC) model, a self-managing system initiative based on the biological metaphor of the cardiovascular system, and use its concepts in the design and implementation of the architecture.
Tradeoffs between time complexity and solution optimality are important when selecting algorithms for an NP-hard problem in different applications. The distinction between the theoretical upper bound and the actual solution optimality on realistic instances of an NP-hard problem is also a factor in selecting algorithms in practice. We consider the problem of partitioning a sequence of n distinct numbers into a minimum number of monotone (increasing or decreasing) subsequences. This problem is NP-hard, and the number of monotone subsequences can reach ⌈√(2n + 1/4) − 1/2⌉ in the worst case. We introduce a new algorithm, a modified version of the Yehuda-Fogel algorithm, that computes a solution of no more than ⌈√(2n + 1/4) − 1/2⌉ monotone subsequences in O(n^1.5) time. We then perform a comparative experimental study of three algorithms: a known approximation algorithm with approximation ratio 1.71 and time complexity O(n^3), a known greedy algorithm with time complexity O(n^1.5 log n), and our new modified Yehuda-Fogel algorithm. Our results show that the solutions computed by the greedy algorithm and the modified Yehuda-Fogel algorithm are close to those computed by the approximation algorithm, even though the theoretical worst-case error bounds of these two algorithms are not proved to be within a constant factor of the optimal solution. Our study indicates that for practical use the greedy algorithm and the modified Yehuda-Fogel algorithm can be good choices if running time is a major concern.
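To make the problem statement concrete, the sketch below shows a naive greedy partition of a sequence into monotone subsequences. It is an O(n·m) illustration (m = number of open subsequences) written for clarity; it is not the O(n^1.5 log n) greedy algorithm or the modified Yehuda-Fogel algorithm evaluated in the paper, and it carries no optimality guarantee:

```python
def greedy_monotone_partition(seq):
    """Partition a sequence of distinct numbers into monotone
    (increasing or decreasing) subsequences, greedily.

    Each element is appended to the first open subsequence that can
    accept it; otherwise it opens a new subsequence whose direction
    is decided when its second element arrives.
    """
    subs = []  # list of (direction, elements); direction is None,
               # 'inc', or 'dec'
    for x in seq:
        placed = False
        for i, (direction, s) in enumerate(subs):
            last = s[-1]
            if direction is None:
                # Second element fixes the subsequence's direction.
                s.append(x)
                subs[i] = ('inc' if x > last else 'dec', s)
                placed = True
                break
            if direction == 'inc' and x > last:
                s.append(x)
                placed = True
                break
            if direction == 'dec' and x < last:
                s.append(x)
                placed = True
                break
        if not placed:
            subs.append((None, [x]))
    return [s for _, s in subs]

# Example: [4, 1, 5, 2, 6, 3] splits into three decreasing
# subsequences under this greedy rule.
parts = greedy_monotone_partition([4, 1, 5, 2, 6, 3])
```

The example sequence is an interleaving of an increasing and a decreasing pattern, the kind of instance on which a first-fit rule can be forced away from the optimum; the algorithms compared in the paper invest more work per element precisely to stay near the ⌈√(2n + 1/4) − 1/2⌉ worst-case bound.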
Secure provenance techniques are essential in generating trustworthy provenance records, where one is interested in protecting their integrity, confidentiality, and availability. In this work, we suggest an architectu...
ISBN (print): 9781605580302
Software technologies, such as model-based testing approaches, have specific characteristics and limitations that can affect their use in software projects. Making knowledge about such technologies available is important to support decisions regarding their use in software projects. In particular, the choice of a model-based testing approach can influence testing success or failure. Therefore, this paper describes knowledge acquired from a systematic review of model-based testing approaches and proposes an infrastructure to support their selection for software projects. Copyright 2008 ACM.
Experimental studies have been used as a mechanism to acquire knowledge through a scientific approach based on the measurement of phenomena in different areas. However, it is hard to run such studies when they require models (simulation), produce large amounts of information, and explore science at scale. In such cases, a computerized infrastructure is necessary, and it constitutes a complex system to be built. In this paper we discuss an experimentation environment that is being built to support large-scale experimentation and scientific knowledge management in software engineering.
We consider a scenario in which users share an access point and are mainly interested in VoIP applications. Each user is allowed to adapt to varying network conditions by choosing the transmission rate at which VoIP t...