ISBN (print): 9781509020287
An Automatic Vehicle License Plate Detection System (AVLPDS) extracts license plate information from an image of a vehicle. Besides its safety aspects, such a system is used in many applications, viz. electronic payment systems, freeway and arterial monitoring systems for traffic surveillance, etc. The purpose of this paper is to present an FPGA algorithmic model of the most efficient of three algorithms: edge-based, connected-component based, and histogram based. Each approach is analyzed on the basis of precision and recall rates to determine its success. The comparison shows that the histogram-based approach has the advantage of being simple and thus faster. Therefore, in this paper, we use a histogram-based edge processing approach to detect the license plate and present the FPGA implementation of the AVLPDS for it. The whole system is implemented using MATLAB Simulink and Xilinx System Generator (XSG). Using XSG for image processing effectively reduces the complexity of the structural design and enables hardware co-simulation. The accuracy of the algorithm is checked on different sets of input images, and significant performance improvement is found, yielding an optimal FPGA-based hardware implementation of the AVLPDS.
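The core of the histogram-based approach can be sketched as a row-projection of edge density: plate characters produce many vertical strokes, so the row histogram of the horizontal gradient peaks at the plate band. This is a minimal illustration only; the function name and the 0.5 threshold are assumptions, not the paper's FPGA pipeline.

```python
import numpy as np

def locate_plate_band(gray, frac=0.5):
    """Return the (row_start, row_end) band with the densest vertical edges.

    gray: 2-D grayscale array. Character strokes yield strong horizontal
    gradients, so the row-wise edge-density histogram peaks at the plate.
    """
    g = gray.astype(np.float64)
    # Horizontal gradient -> strong response at vertical character strokes.
    edges = np.abs(np.diff(g, axis=1))
    hist = edges.sum(axis=1)              # row-wise edge-density histogram
    rows = np.where(hist >= frac * hist.max())[0]
    return int(rows.min()), int(rows.max())

# A striped band (mimicking character strokes) is localized correctly:
img = np.zeros((40, 60))
img[10:20, ::2] = 255                     # alternating columns -> dense edges
band = locate_plate_band(img)
```

The same projection applied to columns of the detected band would then bound the plate horizontally.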
ISBN (print): 9781467388153
An unprecedented growth in data generation is taking place. Data about larger dynamic systems is being accumulated, capturing finer granularity events, and thus processing requirements are increasingly approaching real-time. To keep up, data-analytics pipelines need to be viable at massive scale, and switch away from static, offline scenarios to support fully online analysis of dynamic systems. This paper uses a challenge problem, graph colouring, to explore massive-scale analytics for dynamic graph processing. We present an event-based infrastructure, and a novel, online, distributed graph colouring algorithm. Our implementation for colouring static graphs, used as a performance baseline, is up to an order of magnitude faster than previous results and handles massive graphs with over 257 billion edges. Our framework supports dynamic graph colouring and, at large scale, performs better than GraphLab's static analysis. Our experience indicates that online solutions are feasible, and can be more efficient than those based on snapshotting.
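The essential invariant of online colouring, that each edge insertion is repaired locally so the colouring stays proper, can be shown with a toy single-machine sketch. This is not the paper's distributed algorithm; the class and its greedy repair rule are illustrative assumptions.

```python
from collections import defaultdict

class OnlineColorer:
    """Toy online greedy colouring: fix conflicts as edges arrive.

    After every insertion, adjacent vertices hold distinct colours;
    conflicts are resolved locally at one endpoint.
    """
    def __init__(self):
        self.adj = defaultdict(set)
        self.color = defaultdict(int)     # every vertex starts with colour 0

    def add_edge(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)
        if self.color[u] == self.color[v]:
            self._recolor(v)              # repair the new conflict locally

    def _recolor(self, v):
        used = {self.color[n] for n in self.adj[v]}
        c = 0
        while c in used:                  # smallest colour absent in N(v)
            c += 1
        self.color[v] = c
```

In a distributed setting the same repair becomes a message to the recoloured vertex's neighbours; the sketch keeps only the sequential core.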
ISBN (print): 9781509040490
A defining feature of modern infocommunication systems is the rapid exchange of information, which makes ensuring the quality and reliability of the received information a pressing problem. To eliminate or minimize the destabilizing impact of noise and interference, such systems make wide use of methods and algorithms for preliminary information processing, in particular digital filtering of signals and images. A procedure for constructing a nonlinear SVD filter that adapts to the local properties of an observed signal is presented. Comparative examples of interference filtering in an image-processing problem are given, and the efficiency of the proposed method is demonstrated. Directions for further research are outlined.
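The basic mechanism of SVD filtering, suppressing singular components that lie below an assumed noise floor, can be sketched as follows. The global hard threshold used here is a common bulk-edge heuristic, not the paper's locally adaptive nonlinear rule.

```python
import numpy as np

def svd_denoise(x, noise_sigma):
    """Suppress noise by zeroing singular values below a noise threshold.

    A crude stand-in for an adaptive nonlinear SVD filter: keep only the
    singular components rising above the expected noise level. The
    threshold sigma * (sqrt(m) + sqrt(n)) is a heuristic assumption.
    """
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    tau = noise_sigma * (np.sqrt(x.shape[0]) + np.sqrt(x.shape[1]))
    s = np.where(s > tau, s, 0.0)         # hard-threshold the spectrum
    return (u * s) @ vt
```

For a low-rank signal corrupted by white noise, the filtered output is closer to the clean signal than the noisy observation, which is the property the paper's comparative examples measure.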
ISBN (print): 9788993215120
In recent years, computer-aided diagnosis (CAD) systems have been developed to reduce false positives in visual screening and to improve the accuracy of lesion detection. Lung cancer is the leading cause of cancer death in the world. Among lung lesions, GGO (ground-glass opacity), which appears as a pale shadow in precancerous lesions and carcinoma in situ, raises concern about going undetected in screening. In this paper, we propose an automatic extraction method for GGO candidate regions from chest CT images. Our proposed image-processing algorithm consists of four main steps: 1) segmentation of the volume of interest from the chest CT image and removal of blood vessel and bronchus regions based on a 3D line filter; 2) initial detection of GGO regions based on density and gradient, which selects the initial GGO candidate regions; 3) identification of GGO candidate regions based on DCNN (deep convolutional neural network) algorithms; and 4) calculation of statistical features to reduce false-positive (FP) shadows by a rule-based method, followed by identification of the final GGO candidate regions by an SVM (support vector machine). Applied to 31 cases from the LIDC (Lung Image Database Consortium) database, the proposed method achieves a final identification performance of TP: 93.02% and FP: 128.52/case.
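Step 2's density-and-gradient selection can be sketched as a voxel-wise test: GGO shows intermediate CT density and weak gradients (a faint, diffuse shadow). The HU window and gradient cutoff below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def ggo_candidates(volume_hu, lo=-750.0, hi=-350.0, grad_max=120.0):
    """Flag voxels whose density and gradient are GGO-like.

    volume_hu: 3-D array in Hounsfield units. A voxel qualifies when its
    density sits in an assumed GGO window and its gradient magnitude is
    low (GGO is faint and diffuse, unlike vessel or nodule boundaries).
    """
    gz, gy, gx = np.gradient(volume_hu.astype(np.float64))
    grad = np.sqrt(gx**2 + gy**2 + gz**2)
    return (volume_hu > lo) & (volume_hu < hi) & (grad < grad_max)

# Synthetic check: a faint blob inside air-density background.
vol = np.full((12, 12, 12), -900.0)
vol[4:8, 4:8, 4:8] = -500.0
mask = ggo_candidates(vol)
```

In the full pipeline this mask would feed the DCNN stage; sharp blob boundaries are rejected by the gradient test, mimicking the suppression of vessel edges.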
To replicate classical celestial navigation algorithms with modern technologies, a hemispherical camera array is used to capture panoramic images of the sun for the purpose of obtaining azimuth and zenith angles. A novel geolocation algorithm is presented that uses these angles to determine absolute location without the aid of satellites. It derives both the geolocation estimates and the error intervals based on measurement noise levels. The Hough Transform, charge-coupled device blooming, and interval analysis are employed to improve accuracy even with lower-quality sensors. The proofs of our theoretical results, based on an interval analysis, handle arbitrary celestial references. Calibration of, and geolocation from, our experimental prototype demonstrate the feasibility and potential of our system. We obtain geolocation to within a rectangular area of 0.533 km x 1.6 km using an array of inexpensive cameras with 640 x 480 pixel resolution. Our simulation results show that improving sensor precision with technology already available can reduce the area of uncertainty to less than 200 m^2.
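The geometric idea, that a measured sun zenith and azimuth pin down the observer relative to the known subsolar point, can be sketched with spherical trigonometry and a brute-force search. The function names, the grid search, and its 1-degree step are assumptions for illustration; the paper's method instead derives interval bounds analytically.

```python
import math

def sun_angles(lat, lon, sub_lat, sub_lon):
    """Zenith and azimuth (degrees) of the sun seen from (lat, lon),
    given the subsolar point: zenith is the great-circle distance to it,
    azimuth is the initial bearing toward it."""
    phi1, phi2 = math.radians(lat), math.radians(sub_lat)
    dlam = math.radians(sub_lon - lon)
    cz = (math.sin(phi1) * math.sin(phi2)
          + math.cos(phi1) * math.cos(phi2) * math.cos(dlam))
    z = math.degrees(math.acos(max(-1.0, min(1.0, cz))))
    y = math.sin(dlam) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
    az = math.degrees(math.atan2(y, x)) % 360.0
    return z, az

def geolocate(z_obs, az_obs, sub_lat, sub_lon, step=1.0):
    """Brute-force the lat/lon grid for the best match to one sun sight."""
    best, best_err = None, float("inf")
    lat = -89.0
    while lat <= 89.0:
        lon = -180.0
        while lon < 180.0:
            z, az = sun_angles(lat, lon, sub_lat, sub_lon)
            daz = abs(az - az_obs)
            err = abs(z - z_obs) + min(daz, 360.0 - daz)
            if err < best_err:
                best, best_err = (lat, lon), err
            lon += step
        lat += step
    return best
```

One zenith measurement alone gives a circle of position around the subsolar point; adding the azimuth selects a single point on that circle, which is why the pair suffices for absolute geolocation.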
We introduce a new machine learning approach for image segmentation that uses a neural network to model the conditional energy of a segmentation given an image. Our approach, combinatorial energy learning for image se...
In this study, we have designed a GPGPU (General-Purpose Graphics Processing Unit)-based algorithm for determining the minimum distance from the tip of a CUSA (Cavitron Ultrasonic Surgical Aspirator) scalpel to the cl...
ISBN (print): 9781509013142
The huge growth of the smartphone market has led to improvements in ARM architectures, so that today many devices based on them have sufficient capacity to handle certain image processing applications. These are used in VSNs (Visual Sensor Networks) and in other scenarios where the energy used in the process is a parameter to be considered. In these scenarios, and with the HMP (Heterogeneous Multi-Processing) architectures on which current platforms are based, it is necessary to determine both the core and the working frequency in order to achieve the application's image processing objectives with the least possible energy consumption. In this type of system, the workload is oriented towards interactive applications, so the priorities differ from those of VSN-type systems. Conventional regulation algorithms do not make appropriate use of the processor's frequency range in relation to the load. This paper analyses the influence of the maximum frequency value on system performance and power consumption.
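Why neither the lowest nor the highest frequency minimises energy can be seen with a toy DVFS model (an assumption for illustration, not the paper's measured model): runtime scales as work/f, dynamic power roughly as f cubed, and static power accrues for as long as the task runs.

```python
def best_frequency(freqs, work, p_static, c):
    """Pick the frequency minimising energy for a fixed workload.

    Toy model: t(f) = work / f, power P(f) = p_static + c * f**3,
    so E(f) = (p_static + c * f**3) * work / f. Static power penalises
    running slowly; cubic dynamic power penalises running fast, so the
    optimum sits strictly between the frequency extremes.
    """
    def energy(f):
        return (p_static + c * f**3) * work / f
    return min(freqs, key=energy)

# With unit constants, E(f) = 1/f + f**2: E(0.5)=2.25, E(1.0)=2.0,
# E(2.0)=4.5 -- the middle frequency wins.
choice = best_frequency([0.5, 1.0, 1.5, 2.0], work=1.0, p_static=1.0, c=1.0)
```

A governor built on such a model would also have to weigh interactivity deadlines, which is the trade-off the paper examines for HMP platforms.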
Cryo-electron microscopy (cryo-EM) is a three-dimensional (3D) averaging technique that makes use of two-dimensional (2D) images of biological macromolecules preserved in a thin layer of vitreous ice. Recent advances in the field have facilitated the evolution of cryo-EM towards atomic resolution, and the technique provides 3D maps with a detailed description of biological macromolecules. Data acquisition at the transmission electron microscope (TEM) is the first crucial step of the single-particle analysis workflow in cryo-EM. In order to exploit the potential of this structural technique at atomic or near-atomic resolution, the initial collection must allow recording of large datasets and hence requires operating the TEM in automated mode. The quality of the acquired dataset relies, however, on the expertise of researchers, and unsupervised operation might result in low data quality. This work presents the first expert system integrated in a novel scheme to automate cryo-EM data acquisition in a TEM. This development takes advantage of fuzzy logic systems to encode the working mode of an expert in a linguistic manner and to learn from acquired data through an adaptive network. A new method based on different image-processing algorithms and on adaptive neuro-fuzzy inference systems (ANFIS) identifies, in an unsupervised manner, the single particles present in cryo-EM images during automated acquisition on a TEM. This single-particle identification system is integrated in a new intelligent control scheme to automate cryo-EM data acquisition. A classic fuzzy inference system (FIS) was programmed to make appropriate decisions during the session. The designed system can be trained for a specific sample and allows unsupervised but efficient data collection, imitating the working mode of an experienced microscopist. (C) 2016 Elsevier Ltd. All rights reserved.
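The flavour of such a fuzzy decision rule can be sketched as a zero-order Sugeno inference: linguistic memberships fire rules whose weighted outputs give a crisp score. The membership ranges, the two inputs, and the rule consequents below are illustrative assumptions, not the paper's tuned ANFIS parameters.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def acquire_score(contrast, ice_thickness_nm):
    """Zero-order Sugeno FIS: rate a cryo-EM area for acquisition in [0, 1].

    Two linguistic inputs (image contrast, ice thickness) feed four rules;
    the crisp score is the firing-strength-weighted average of the rule
    outputs (1.0 = acquire, 0.0 = skip).
    """
    good_contrast = tri(contrast, 0.3, 0.7, 1.1)
    low_contrast  = tri(contrast, -0.1, 0.0, 0.4)
    thin_ice      = tri(ice_thickness_nm, -10.0, 20.0, 60.0)
    thick_ice     = tri(ice_thickness_nm, 40.0, 100.0, 200.0)
    # Rule firing strengths (min as AND) paired with crisp consequents.
    rules = [
        (min(good_contrast, thin_ice), 1.0),   # acquire
        (min(good_contrast, thick_ice), 0.4),
        (min(low_contrast, thin_ice), 0.3),
        (min(low_contrast, thick_ice), 0.0),   # skip
    ]
    w = sum(s for s, _ in rules)
    return sum(s * out for s, out in rules) / w if w else 0.0
```

In ANFIS the membership parameters would be learned from acquired data rather than fixed by hand, which is how the paper's system adapts to a specific sample.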
ISBN (print): 9781509052523
Computing power through parallelism is now commonly provided by multi-core systems and heterogeneous architectures for High Performance Computing (HPC) and scientific computing. Although many algorithms have been proposed and implemented using sequential computing, parallel alternatives provide more suitable, higher-performance solutions to the same problems. In this paper, three parallelization strategies are proposed and implemented for a dynamic-programming-based cloud smoothing application, using both shared-memory and non-shared-memory approaches. The experiments are performed on the NVIDIA GeForce GT750m and Tesla K20m, two GPU accelerators of the Kepler architecture. A detailed performance analysis is presented covering partition granularity at block and thread levels, memory access efficiency, and computational complexity. The evaluations show close agreement of results together with high efficiency in the parallel implementations, and these strategies can be adopted in similar data analysis and processing applications.
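The partitioning concern the paper analyses, splitting the data so each block can be processed independently while boundary values stay consistent, can be sketched in plain Python with a halo exchange around a simple 3-point smoother (a stand-in illustration; the paper's kernels are CUDA and its smoother is dynamic-programming based).

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def smooth_seq(x):
    """Sequential 3-point moving average with clamped (edge-replicated) ends."""
    p = np.pad(x, 1, mode="edge")
    return (p[:-2] + p[1:-1] + p[2:]) / 3.0

def smooth_par(x, parts=4):
    """Partition with one-element halos so chunks smooth independently.

    Each worker receives its chunk plus one neighbour on each side, so the
    chunk-interior results match the sequential smoother exactly.
    """
    n = len(x)
    bounds = [(i * n // parts, (i + 1) * n // parts) for i in range(parts)]

    def work(b):
        lo, hi = b
        seg = x[max(lo - 1, 0):min(hi + 1, n)]    # chunk + halo
        sm = smooth_seq(seg)
        a = 1 if lo > 0 else 0                    # drop the halo outputs
        return sm[a:a + (hi - lo)]

    with ThreadPoolExecutor(max_workers=parts) as ex:
        return np.concatenate(list(ex.map(work, bounds)))
```

The halo width generalises to the smoother's dependency radius; picking it, like the block/thread granularity in the paper, trades redundant boundary work against synchronisation.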