Authors:
Gasti, Wahida; Lefort, Thomas; Louys, Mireille
Terma A/S Elektronik; ESA ESTEC, TOS-ETD, P.O. Box 299, 2200 AG Noordwijk, Netherlands; LSIIT, Université Louis Pasteur de Strasbourg / Observatoire de Strasbourg, 11 Rue de l'Université, 67000 Strasbourg, France
Progress in digital imaging sensors such as high resolution CCDs allows space instruments to perform daily observations producing up to tens of gigabytes of data. In contrast with this technology boost, the increase o...
The detection of targets in remotely sensed images can be conducted spatially, spectrally, or both. The difficulty of detecting targets with spatial image analysis arises from the fact that the ground sampling distance is generally larger than the size of the targets of interest, in which case targets are embedded in a single pixel and cannot be detected spatially. Under this circumstance, target detection must be carried out at the subpixel level, and spectral analysis offers a valuable alternative. This paper compares two constrained approaches for subpixel detection of targets in remote sensing images. One is a target abundance-constrained approach, referred to as the nonnegatively constrained least squares (NCLS) method. It is a constrained least squares linear spectral mixture analysis method which imposes a nonnegativity constraint on the abundance fractions of the targets of interest. A common drawback of methods based on linear spectral mixture analysis is the requirement for prior knowledge of the endmembers present in an image scene. To mitigate this drawback, the NCLS method is extended to an unsupervised approach, referred to as the unsupervised nonnegatively constrained least squares (UNCLS) method, which can be implemented with only partial or no prior knowledge of the targets present in the scene. The second approach is a target signature-constrained method, called the constrained energy minimization (CEM) method. It constrains the desired target signature with a specific gain while minimizing effects caused by other, unknown signatures. Data from the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensor are used to compare the performance of these methods.
The composite signal flow model of computation targets systems with significant control and data-processing parts. It builds on the data flow and synchronous data flow models and extends them to include three signal t...
Elements from data fusion, optimisation and particle filtering are brought together to form the Multi-Sensor Fusion Management (MSFM) algorithm. The algorithm provides a framework for combining the information from multiple sensors and producing good solutions to the problem of how best to deploy or use these and/or other sensors to optimise some criterion in the future. A problem from Anti-Submarine Warfare (ASW) is taken as an example of the potential use of the algorithm. The algorithm is shown to make efficient use of a limited supply of passive sonobuoys in order to locate a submarine to the required accuracy. The results show that, in the simulation, the traditional strategies for sonobuoy deployment required approximately four times as many sonobuoys as the MSFM algorithm to achieve the required localisation.
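The particle-filtering building block of such a fusion framework can be sketched minimally. The bootstrap filter below is a generic illustration, not the MSFM algorithm itself: the one-dimensional track, the noise levels, and the direct position measurements are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, weights, obs, obs_std=1.0, proc_std=0.5):
    """One bootstrap particle-filter update: predict, weight, resample."""
    particles = particles + rng.normal(0, proc_std, size=particles.shape)  # motion model
    weights = weights * np.exp(-0.5 * ((obs - particles) / obs_std) ** 2)  # likelihood
    weights = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)       # resample
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# A stationary target at 10.0 on a 50-unit track, located from 20 noisy fixes.
true_pos = 10.0
particles = rng.uniform(0.0, 50.0, 500)   # initial uncertainty over the track
weights = np.full(500, 1.0 / 500)
for _ in range(20):
    obs = true_pos + rng.normal(0, 1.0)   # one simulated measurement
    particles, weights = particle_filter_step(particles, weights, obs)
estimate = particles.mean()               # posterior mean, close to true_pos
```

In a sensor-management setting, the particle cloud's spread is what a deployment strategy would try to shrink with each additional sonobuoy.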
ISBN:
(Print) 0780365364
This article presents a new model for the exploitation of the different levels of parallelism in general-purpose-processor-based workstations in the framework of multimedia applications. It is called the GEMS model, standing for Gain, Effort, Management and Size. It is used to study the intra-processor parallelisms, including instruction-level and data-level parallelism; the inter-processor parallelisms, ranging from shared-memory multiprocessors to distributed-memory clusters; and the system-level parallelisms, such as input/output operations and the exploitation of external resources. The aim of this work is to help a programmer writing a multimedia application choose which level of parallelism will help to speed up the application, and how to partition the application into small tasks to obtain the highest gain for a given parallelism. Indeed, for an efficient implementation it is very important to have a good understanding of the architecture in order to design new algorithms or to optimize an existing application according to the features available in PC workstations.
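The "Gain" that any level of parallelism can deliver is ultimately bounded by how much of the application actually parallelises. Amdahl's law gives that classical bound; the sketch below is an illustration of the bound, not of the GEMS model itself, and the 90% figure is an invented example:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Ideal speedup when only part of the work parallelises (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_workers)

# e.g. a video filter where 90% of the run time is data-parallel pixel work:
# on 4 processors the best possible speedup is about 3.08x, not 4x.
speedup = amdahl_speedup(0.9, 4)
```

This is why a model weighing gain against effort is useful: the serial 10% caps the return on every extra processor, so the effort of parallelising further may not pay off.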
ISBN:
(Print) 0780365143
Automatic recognition of moving targets has been a topic of much recent interest. In particular, researchers have been considering the exploitation of Doppler-related features resulting from moving tracks, wheels, etc., to improve performance. In developing and testing such algorithms, data with three or more dimensions are often used, resulting in very large data sets. In this paper, we describe a system for compressing radar data sets with the dimensions of range, Doppler, and azimuth. Our approach employs smooth localized exponentials to model the scattering returns. A best-basis approach is employed to find an optimal wavelet transformation, and the resulting coefficients are entropy coded. We present results showing the effectiveness of our approach and compare it to a more conventional Fourier-based technique.
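The transform-coding pipeline the abstract outlines (transform, discard small coefficients, entropy-code the rest) can be illustrated with a plain Haar wavelet in one dimension. This is a deliberate simplification: the paper uses smooth localized exponentials and a best-basis search over three-dimensional data, neither of which is shown here.

```python
import numpy as np

def haar_forward(x):
    """Multi-level orthonormal Haar transform of a length-2^k signal."""
    coeffs = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2)
        det = (x[0::2] - x[1::2]) / np.sqrt(2)
        coeffs.append(det)
        x = avg
    coeffs.append(x)                      # final approximation coefficient
    return coeffs

def haar_inverse(coeffs):
    """Exact inverse of haar_forward."""
    x = coeffs[-1]
    for det in reversed(coeffs[:-1]):
        out = np.empty(2 * len(x))
        out[0::2] = (x + det) / np.sqrt(2)
        out[1::2] = (x - det) / np.sqrt(2)
        x = out
    return x

signal = np.cos(2 * np.pi * np.arange(256) / 64.0)   # smooth toy "range profile"
coeffs = haar_forward(signal)
flat = np.concatenate(coeffs)
thr = np.quantile(np.abs(flat), 0.9)                 # keep largest ~10% of coefficients
coeffs = [np.where(np.abs(c) >= thr, c, 0.0) for c in coeffs]
recon = haar_inverse(coeffs)
rms = np.sqrt(np.mean((recon - signal) ** 2))        # small residual after 10:1 keep ratio
```

In a real coder the surviving coefficients would then be quantized and entropy coded; the thresholding step above is where the compression gain comes from.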
In this paper, a spectral interpolation coder (SIC) and decoder are investigated for simultaneous source coding and impulse noise cancellation. For simplicity of the analysis, we restrict ourselves to the framework of...
Multimodal interfaces are a part of computer technology and play an important role in many fields. An efficient, flexible, and convenient multimodal interface strongly supports CBIR systems. In this...
In patients with ventricular fibrillation, some parameters of the spectral analysis of the electrocardiogram (ECG) have appeared to be promising tools for predicting the success of electrical defibrillation. In the past, research work on data acquisition and analysis of the ventricular fibrillation signal was performed by means of three different systems. Two of the hardware systems are commercially available, and the third is a custom-designed system. The systems offer various advantages and disadvantages. The electronic module based on portable personal computer (PC) hardware and software was suitable only for clinical research. The battery-powered, microprocessor-based data acquisition (DAQ) system has long autonomy, small dimensions, and low weight, but it lacks mathematical capability and gives no on-line results. The custom-designed DAQ and digital signal processor (DSP) module can be utilized outdoors by the emergency resuscitation team. Basically, all systems have an analog input where the input signal is conditioned (filtered, amplified and limited) and converted to digital data in about 20 microseconds. Internal system random access memory (RAM) is used as temporary data storage. On the DSP-based system, the results of the spectral analysis for 1024 points are obtained in 100 milliseconds. The final result is immediately displayed to facilitate decision-making in the use of the defibrillator.
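The 1024-point spectral-analysis step can be sketched as an FFT power spectrum. The sampling rate, the surrogate fibrillation-like signal, and the window choice below are assumptions made for illustration, not details taken from the systems described:

```python
import numpy as np

fs = 250.0                  # assumed ECG sampling rate in Hz
n = 1024                    # the 1024-point analysis length from the abstract
t = np.arange(n) / fs

# Crude VF-like surrogate: a dominant oscillation near 5 Hz plus noise.
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.normal(size=n)

window = np.hanning(n)                        # taper to reduce spectral leakage
spectrum = np.abs(np.fft.rfft(x * window)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # peak frequency, skipping the DC bin
```

The dominant (or median) frequency of exactly this kind of spectrum is among the parameters that have been studied as predictors of defibrillation success.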
This article discusses the digital processing methodology utilized to analyze Raman spectral data, with the ultimate aim of developing a rapid and automatic system for atherosclerosis diagnosis. Different types of digital and wavelet-transform filters have been studied in order to reduce the CCD detector noise. After calibration, the Raman spectrum is processed by an automatic program that classifies the target tissue as pathologic or non-pathologic using pattern recognition techniques. To validate the diagnosis inferred by the automated system, a collection of 70 spectra from human coronary arteries has been tested and compared with the histological method. The processing time of the whole analysis is as small as 10 milliseconds when the program is executed on a processing station based on the ADSP 61061 SHARC Digital Signal Processor.
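The pipeline the abstract describes, noise filtering followed by pattern-recognition classification, can be sketched as follows. The moving-average filter, the nearest-centroid rule, and the synthetic band shapes are illustrative stand-ins, not the authors' actual filters or classifier:

```python
import numpy as np

def smooth(spectrum, width=5):
    """Moving-average filter as a simple stand-in for CCD noise reduction."""
    kernel = np.ones(width) / width
    return np.convolve(spectrum, kernel, mode="same")

def classify(spectrum, references):
    """Nearest-centroid pattern matching against labelled reference spectra."""
    dists = {label: np.linalg.norm(spectrum - ref) for label, ref in references.items()}
    return min(dists, key=dists.get)

# Hypothetical reference spectra: one Raman band each, at different shifts.
axis = np.linspace(0.0, 1.0, 200)
references = {
    "non-pathologic": np.exp(-((axis - 0.3) ** 2) / 0.005),
    "pathologic":     np.exp(-((axis - 0.7) ** 2) / 0.005),
}

rng = np.random.default_rng(3)
measured = references["pathologic"] + 0.2 * rng.normal(size=200)  # noisy acquisition
label = classify(smooth(measured), references)                    # "pathologic"
```

Both steps are simple vector operations of fixed length, which is consistent with the millisecond-scale turnaround the abstract reports on a DSP.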