ISBN (print): 9781509007752
Digital signal processing applications rely heavily on multipliers, which largely determine overall system performance. Existing multiplier architectures are complex and slow. Mitchell's algorithm (MA) is a simple approach that computes the product using logarithmic operations, thus achieving higher speed. Operand decomposition (OD) reduces switching activity and hence achieves better accuracy in computing the fractional part of the logarithm. Divided approximation (DA) and the table of correction values (TCV) are error-correction approaches for MA that follow the logarithmic curve more closely. Signed logarithmic multiplication using operand decomposition is proposed. The existing MA and DA, and the proposed signed MA-OD, OD-DA, and TCV designs, are coded in Verilog HDL, simulated in ModelSim, and synthesized with Xilinx XST. The simulation results are compared with respect to mean absolute error. The comparison shows that OD-DA and TCV significantly improve the accuracy of MA, decreasing its mean absolute error from about 5.36% to 1.7% and 1.33%, respectively.
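The core of Mitchell's approximation can be sketched in a few lines: each operand is split into its leading-one position and a fractional mantissa, the logarithms are approximated by those fractions, added, and the antilogarithm is taken. This is a minimal illustration of the basic MA only; the operand-decomposition and error-correction stages described above are not shown, and the function name is ours.

```python
def mitchell_multiply(a: int, b: int) -> int:
    """Approximate a*b via Mitchell's logarithm approximation:
    log2(n) ~ k + x, where k is the leading-one position and
    x = n / 2**k - 1 is the fractional mantissa in [0, 1)."""
    if a == 0 or b == 0:
        return 0
    k1, k2 = a.bit_length() - 1, b.bit_length() - 1   # leading-one positions
    x1 = a / (1 << k1) - 1.0                           # fractional mantissas
    x2 = b / (1 << k2) - 1.0
    s = x1 + x2            # approximate log2 of the mantissa product
    if s < 1.0:
        return round((1 << (k1 + k2)) * (1.0 + s))
    return round((1 << (k1 + k2 + 1)) * s)

# Exact when an operand is a power of two; under-estimates otherwise.
print(mitchell_multiply(8, 16))     # 128 (exact)
print(mitchell_multiply(100, 100))  # 9216 vs. the exact 10000
```

The underestimate on 100 x 100 (about 7.8% low) is typical of plain MA and is exactly the gap that the OD, DA, and TCV corrections above are designed to close.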
ISBN (print): 9781509040766
Physical consequences to power systems of false data injection cyber-attacks are considered. Prior work has shown that the worst-case consequences of such an attack can be determined using a bi-level optimization problem, wherein an attack is chosen to maximize the physical power flow on a target line subsequent to re-dispatch. This problem can be solved as a mixed-integer linear program, but it is difficult to scale to large systems due to numerical challenges. Three new computationally efficient algorithms to solve this problem are presented. These algorithms provide lower and upper bounds on the system vulnerability, measured as the maximum power flow subsequent to an attack. Using these techniques, vulnerability assessments are conducted for the IEEE 118-bus system and the Polish 2383-bus system.
With the continuous development of virtual reality technology, data gloves have become an important means of virtual interaction. To address their limitations, this paper proposes a wearable controller with gesture recognition based on inertial sensors. The controller hardware comprises inertial sensors (an accelerometer and a gyroscope), a Wi-Fi communication module, and a microprocessor. The gesture recognition algorithm extracts acceleration and angular-velocity features; a decision tree classifier separates two kinds of similar gestures, and changes in attitude angle are used to further distinguish gesture categories. Experimental results show that the average gesture recognition accuracy of the wearable controller is 95.98%, realizing the gesture recognition function well.
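The feature-then-tree pipeline can be sketched as follows. This is a generic illustration, not the paper's implementation: the feature names, thresholds, and gesture classes are all placeholders.

```python
import math

def extract_features(accel, gyro):
    """Per-window features from accelerometer and gyroscope samples,
    where each sample is an (x, y, z) tuple."""
    amag = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel]
    gmag = [math.sqrt(x * x + y * y + z * z) for x, y, z in gyro]
    return {
        "accel_mean": sum(amag) / len(amag),
        "accel_peak": max(amag),
        "gyro_mean": sum(gmag) / len(gmag),
        "gyro_peak": max(gmag),
    }

def classify(feat, accel_thresh=1.5, gyro_thresh=2.0):
    """Toy two-level decision tree: motion energy first, then rotation
    rate (thresholds and class names are illustrative placeholders)."""
    if feat["accel_peak"] < accel_thresh:
        return "static"
    return "rotate" if feat["gyro_peak"] > gyro_thresh else "swipe"

# e.g. a window with strong rotation classifies as "rotate"
print(classify(extract_features([(2.0, 0, 0)], [(3.0, 0, 0)])))
```

A real classifier would be trained on labeled windows; the point here is only the shape of the pipeline: windowed magnitude features feeding a shallow tree of threshold tests.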
ISBN (print): 9781509039340
We present techniques for decreasing the error probability of randomized algorithms and for converting randomized algorithms to deterministic (nonuniform) algorithms. Unlike most existing techniques that involve repetition of the randomized algorithm and hence a slowdown, our techniques produce algorithms with a similar run-time to the original randomized algorithms. The amplification technique is related to a certain stochastic multi-armed bandit problem. The derandomization technique - which is the main contribution of this work - points to an intriguing connection between derandomization and sketching/sparsification. We demonstrate the techniques by showing algorithms for approximating free games (constraint satisfaction problems on dense bipartite graphs).
ISBN (print): 9781467389570
Serial section electron microscopy (SSEM) image stacks generated using high-throughput microscopy techniques are an integral tool for investigating brain connectivity and cell morphology. FIB or 3View scanning electron microscopes easily generate gigabytes of data. In order to produce an analyzable 3D dataset from the imaged volumes, efficient and reliable image segmentation is crucial. Classical manual approaches to segmentation are time-consuming and labour-intensive. Semi-automatic seeded watershed segmentation algorithms, such as those implemented in the ilastik image processing software, are a very powerful alternative, substantially speeding up segmentation times. We have used ilastik effectively for small EM stacks - on a laptop, no less; however, ilastik was unable to carve the large EM stacks we needed to segment because its memory requirements grew too large, even for the biggest workstations we had available. For this reason, we refactored the carving module of ilastik to scale it to large EM stacks on large workstations, and tested its efficiency. We modified the carving module, building on existing blockwise processing functionality to process data in manageable chunks that fit within RAM (main memory). We review this refactoring work, highlighting the software architecture, design choices, modifications, and issues encountered.
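The blockwise idea can be sketched as a chunk iterator: the volume is covered by fixed-size blocks, each read with a halo of surrounding context (which segmentation near block borders needs), while only the halo-free interior is written back. This is a generic illustration of the pattern, not ilastik's actual API.

```python
from itertools import product

def iter_blocks(shape, block, halo):
    """Yield (outer, inner) slice tuples for blockwise processing.
    `outer` selects a halo-padded chunk of the full volume;
    `inner` selects, within that chunk, the halo-free core whose
    result is written back to the output volume."""
    starts = [range(0, s, b) for s, b in zip(shape, block)]
    for origin in product(*starts):
        outer, inner = [], []
        for o, s, b in zip(origin, shape, block):
            lo, hi = max(0, o - halo), min(s, o + b + halo)
            outer.append(slice(lo, hi))                      # read region
            inner.append(slice(o - lo, o - lo + min(b, s - o)))  # write core
        yield tuple(outer), tuple(inner)

# a 1D example: a length-10 axis in blocks of 4 with a 1-voxel halo
for outer, inner in iter_blocks((10,), (4,), 1):
    print(outer, inner)
```

Each chunk's peak memory is bounded by the block size plus halo rather than the full stack, which is exactly what lets a carving-style algorithm run on volumes larger than RAM.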
ISBN (print): 9781509007479
Independent component analysis (ICA) is a statistical method for transforming an observable multi-dimensional random vector into components that are as statistically independent from each other as possible. Binary ICA (BICA) is a special case of ICA in which both the observations and the independent components are over a binary alphabet. The BICA problem has received a significant amount of attention in the past decade, mostly in the form of algorithmic approaches and heuristic solutions. However, BICA still suffers from a substantial lack of theoretical bounds and efficiency guarantees. In this work we address these concerns by introducing novel lower bounds and theoretical properties for the BICA problem, under both linear and non-linear transformations. In addition, we present simple algorithms that apply our methodology and achieve favorable results, both in terms of accuracy and in their practically optimal computational complexity.
The portability of emerging computing systems demands further reduction in the power consumption of their components. Approximate computing can reduce power consumption by using a simplified or inaccurate circuit. In this paper, the energy efficiency of a finite impulse response (FIR) filter is improved through approximate computing. We propose an approximate synthesis technique for an energy-efficient FIR filter with an acceptable level of accuracy. We employ the common subexpression elimination (CSE) algorithm to implement the FIR filter and replace conventional adders/subtractors with approximate ones. While yielding acceptable accuracy, the proposed flow attains a maximum energy saving of 50.7% compared with conventional FIR filter designs.
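To make the idea of an approximate adder concrete, here is one widely used style of approximation, shown purely for illustration (the abstract does not specify which approximate circuit the paper uses): the carry chain over the k least-significant bits is replaced by a bitwise OR, which removes the longest delay path and its switching energy at the cost of occasional small errors.

```python
def approx_add(a: int, b: int, k: int) -> int:
    """Approximate adder: exact addition on the upper bits, bitwise OR
    on the k least-significant bits (no carry chain in the lower part).
    With k = 0 this degenerates to an exact adder."""
    mask = (1 << k) - 1
    lower = (a | b) & mask                 # OR stands in for add + carry
    upper = ((a >> k) + (b >> k)) << k     # exact add on the upper bits
    return upper | lower

print(approx_add(4, 8, 2))   # 12: exact, since the low bits never carry
print(approx_add(5, 3, 2))   # 7 instead of the exact 8
```

The error appears only when both operands have overlapping low bits, so for many signal distributions the mean error stays small while the adder becomes shorter and cheaper, which is the trade-off the approximate FIR flow above exploits.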
The remarkable characteristics of the future smart grid are good interactivity and a complete self-healing ability, which reflect the technical level and degree of intelligence of the whole grid. The intelligent distribution network is not only the core of the future smart grid but also the final link between the power grid and its users, enabling real-time, direct two-way interaction with them. The self-healing ability of the intelligent distribution network is an important part of fault restoration in the distribution network. Fault restoration is a multi-objective, multi-period, high-dimensional configuration optimization problem. We therefore introduce a multi-agent system into the research on intelligent distribution network self-healing as the fault recovery strategy of the power distribution network, and build a distribution network self-healing architecture based on a multi-agent system. The feasibility of applying multi-agent systems to the future smart distribution network self-healing system is demonstrated.
ISBN (print): 9781509032921
Epilepsy is a chronic neurological disorder marked by recurrent seizures caused by abnormal rhythmic discharge of the brain's electrical activity. These fluctuations can be analyzed using the EEG signal, which provides valuable information about the physiological state of the brain. In this paper we propose an efficient algorithm based on statistical analysis using features such as average value, standard deviation, variance, and kurtosis, along with electrical features such as the least significant value, the most significant value, and band power. These features were applied to multichannel EEG signals and provide promising results with low complexity, simplicity, and good accuracy. Thus, an automatic seizure detection mechanism based on multichannel EEG signal analysis using advanced signal processing techniques with statistical and electrical features helps reduce physician workload.
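The per-channel feature vector named above can be sketched directly. This is an illustrative reading of the abstract, not the paper's code: we assume "least/most significant value" means the channel's minimum and maximum amplitude, and approximate band power by mean signal power over the window.

```python
import math

def eeg_features(x):
    """Statistical and electrical features for one EEG channel window
    (feature interpretations are assumptions, see the note above)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n          # population variance
    std = math.sqrt(var)
    # kurtosis as the normalized fourth central moment
    kurt = (sum((v - mean) ** 4 for v in x) / n) / (var ** 2) if var else 0.0
    band_power = sum(v * v for v in x) / n             # mean power proxy
    return {"mean": mean, "std": std, "variance": var, "kurtosis": kurt,
            "band_power": band_power, "min": min(x), "max": max(x)}

print(eeg_features([1.0, 2.0, 3.0, 4.0]))
```

A seizure detector would compute this dictionary per channel and per sliding window and feed the vectors to a threshold rule or classifier; heavy-tailed windows (high kurtosis) and elevated band power are the kind of cues such features capture.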
ISBN (print): 9781509028610
The artificial bee colony (ABC) algorithm is a swarm-based meta-heuristic inspired by the foraging behavior of honey bees. Due to its simplicity, the ABC algorithm is used in this paper for minimax design of linear-phase FIR fullband digital differentiators. Results in terms of peak error obtained from designed digital differentiator examples indicate that the approach can reach smaller peak errors than those obtained by the least-squares error minimization method and the optimal Parks-McClellan algorithm.
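A minimal ABC loop might look like the following sketch. It is simplified (the onlooker phase's fitness-proportional selection is folded into a single loop), the sphere function stands in for the paper's minimax peak-error objective over differentiator coefficients, and all names are illustrative.

```python
import random

def abc_minimize(f, dim, bounds, n_sources=10, limit=20, iters=200, seed=1):
    """Minimal artificial bee colony minimizer: greedy single-coordinate
    perturbations toward random peers, plus a scout phase that abandons
    food sources that have failed to improve `limit` times in a row."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    fit = [f(x) for x in pop]
    trials = [0] * n_sources
    best = min(zip(fit, pop))
    for _ in range(iters):
        for i in range(n_sources):
            # perturb one coordinate of source i relative to a random peer
            j = rng.randrange(dim)
            k = rng.choice([p for p in range(n_sources) if p != i])
            cand = pop[i][:]
            cand[j] += rng.uniform(-1.0, 1.0) * (pop[i][j] - pop[k][j])
            cand[j] = min(hi, max(lo, cand[j]))
            fc = f(cand)
            if fc < fit[i]:                    # greedy acceptance
                pop[i], fit[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
        for i in range(n_sources):
            if trials[i] > limit:              # scout: restart stale source
                pop[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                fit[i], trials[i] = f(pop[i]), 0
        best = min(best, min(zip(fit, pop)))
    return best

# usage: minimize the sphere function as a stand-in objective
best_val, best_x = abc_minimize(lambda v: sum(t * t for t in v),
                                dim=3, bounds=(-5.0, 5.0))
```

For the differentiator design, `f` would evaluate the peak (minimax) deviation of the candidate filter's frequency response from the ideal differentiator response, with the coefficient vector as the search variable.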