ISBN:
(Print) 0819422118; 9780819422118
We present a weighting scheme for local weighted regression designed to achieve two goals: (1) to reduce noise within image regions of smoothly varying intensities; and (2) to maintain sharp boundaries between image regions. Such a procedure can function as a preprocessing step in an image segmentation problem or simply as an image enhancement technique.
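A minimal sketch of the idea (the weighting below is a hypothetical choice for illustration, not the paper's actual scheme): a locally weighted average whose weights decay with intensity difference from the window center smooths slowly varying regions while contributing little across sharp boundaries.

```python
import numpy as np

def edge_preserving_smooth(img, radius=2, sigma_i=20.0):
    """Locally weighted average; weights shrink with intensity difference
    from the window center (hypothetical weighting for illustration)."""
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    padded = np.pad(img.astype(float), radius, mode="reflect")
    for y in range(h):
        for x in range(w):
            win = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            center = padded[y + radius, x + radius]
            # Pixels across a sharp boundary differ strongly in intensity,
            # so their weights are near zero and the edge is preserved.
            wts = np.exp(-((win - center) ** 2) / (2.0 * sigma_i ** 2))
            out[y, x] = np.sum(wts * win) / np.sum(wts)
    return out
```

On a piecewise-constant image with a step edge, this leaves both plateaus essentially unchanged while averaging out small fluctuations within each plateau.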
A normalization algorithm is proposed that improves the reconstruction of signals. After decomposing a signal into even, linear bandpass-filtered signals and a low-pass residual, it can be reconstructed reasonably well given a good choice of filters. However, to obtain good results the filter parameters must be chosen so that they cover the frequency domain sufficiently well, which is often difficult for a small set of filters. We derive and demonstrate that in many situations it can be profitable to normalize the reconstructed image with respect to two global statistical parameters of the original image.
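Assuming the two global statistical parameters are the mean and standard deviation (the abstract does not name them), such a normalization might be sketched as:

```python
import numpy as np

def normalize_reconstruction(recon, original):
    """Affinely rescale the reconstruction so its global mean and standard
    deviation match those of the original image (mean/std assumed here as
    the 'two global statistical parameters'; the paper's choice may differ)."""
    o_mean, o_std = original.mean(), original.std()
    r_mean, r_std = recon.mean(), recon.std()
    # Center, rescale to the original's spread, then shift to its mean.
    return (recon - r_mean) * (o_std / r_std) + o_mean
```

This compensates for the global gain and offset errors that an incomplete filter bank introduces, without altering the reconstruction's spatial structure.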
Stochastic clutter can often be modeled as a piecewise stationary random field. The individual stationary subregions of homogeneity in the field can then be characterized by marginal density functions. This level of characterization is often sufficient for determining the clutter type on a local basis. We present a technique for the simultaneous characterization of the subregions of a random field based on semiparametric density estimation over the entire random field. This technique is based on a borrowed-strength methodology that allows the use of observations from potentially dissimilar subregions to improve local density estimation and hence random process characterization. The approach is illustrated through an application to a set of digitized mammogram images which requires processing five million observations. The results indicate that there is sufficient similarity between images, in addition to the more intuitively obvious within-image similarities, to justify such a procedure. The results are analyzed for the utility of such a procedure to produce superior models, in terms of 'stochastic clutter characterization', for target detection applications in which there are variable background processes.
We propose a region extraction method based on a new energy function and a new stochastic sampling method. The new energy function is based on the mixture density description derived by clustering an input image using the ISODATA algorithm, and is well suited to natural images. We developed a new stochastic sampling method by modifying the conventional Gibbs sampler. The conventional Gibbs sampler converges to the global optimum of the energy function, but it cannot be applied to region extraction because of its inability to preserve the topological properties of the initial region during the state transition process. To overcome this drawback, our sampling process is driven by 'dynamic site selection', which preserves the topology of the initial region during state transitions. We prove the global convergence property of the proposed sampling method by extending existing stochastic sampling theories. We demonstrate the performance of our method through simulation studies on both synthetic and natural images.
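For reference, one sweep of the conventional Gibbs sampler that the paper modifies can be sketched for a simple two-label pixel energy (the data term, class means, and smoothness weight below are illustrative assumptions, not the paper's mixture-density energy, and no dynamic site selection is included):

```python
import numpy as np

def gibbs_sweep(labels, img, beta=1.0, rng=None):
    """One raster-order sweep of a conventional Gibbs sampler for a
    two-label energy: an intensity data term plus a smoothness term
    counting disagreeing 4-neighbors (illustrative energy only)."""
    rng = rng or np.random.default_rng()
    h, w = labels.shape
    mus = (50.0, 200.0)  # hypothetical class mean intensities
    for y in range(h):
        for x in range(w):
            energies = []
            for lab in (0, 1):
                data = abs(img[y, x] - mus[lab])
                smooth = sum(labels[ny, nx] != lab
                             for ny, nx in ((y - 1, x), (y + 1, x),
                                            (y, x - 1), (y, x + 1))
                             if 0 <= ny < h and 0 <= nx < w)
                energies.append(data + beta * smooth)
            # Sample the site label from its local conditional distribution,
            # proportional to exp(-energy).
            p1 = 1.0 / (1.0 + np.exp(energies[1] - energies[0]))
            labels[y, x] = int(rng.random() < p1)
    return labels
```

Every site is visited on every sweep; it is exactly this unconstrained site-by-site flipping that can change the topology of a region, which motivates the paper's dynamic site selection.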
We consider a model stationary problem of wave propagation in a layered half-space with regular and random inhomogeneities. The choice of regular perturbation corresponds to a linear waveguide near the right boundary of the half-space. Random inhomogeneities are simulated in the framework of the white-noise model. We analyze the influence of the inhomogeneities on the probability distribution of the reflection coefficient phase.
An adaptive approach to the restoration of images corrupted by blurring and by additive, impulsive, and multiplicative noise is proposed. It is based on a combination of nonlinear filters, iterative filtering procedures, and the principles of local adaptation. Numerical simulations and test images illustrating the efficiency of the approach are presented.
The generic technique called 'Evidence-Based Image Analysis' is proposed for model-based object detection. The real images to be analyzed are treated as sources of evidence generated by low-level image processing procedures. These evidences support or refute hypotheses connected with different objects and their features, and Bayes' theorem is used to test the hypotheses against the evidence. The unknown parameters of the probabilistic model serve as the internal tuning parameters of the algorithm. This approach provides a uniform and efficient way to fuse any available image information: intensity and contour, 2D and 3D, multispectral, multisensor, and so on. Our technique takes into account three principal components: an object/background model, a registration model, and a corruption model. This paper concentrates mainly on estimating the registration parameters, especially on the problem of geometrically invariant object detection. It is shown that Hough-like accumulation methods in fact implement maximum a posteriori estimation of the registration model parameters under the assumption of statistically independent evidences. Reduction and separation of models are shown to be legitimate ways of speeding up invariant object detection. The use of complex hierarchical object models is considered as another route to fast invariant detection and recognition.
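The Hough-like accumulation mentioned above can be sketched minimally: each edge point votes for every line passing through it, and the accumulator peak plays the role of the MAP estimate when the point 'evidences' are treated as independent (the bin counts and parameter ranges below are arbitrary illustration choices):

```python
import numpy as np

def hough_accumulate(points, n_theta=180, n_rho=64, rho_max=100.0):
    """Hough-style voting: each point (x, y) increments the accumulator
    cell of every line x*cos(theta) + y*sin(theta) = rho through it.
    The accumulator peak approximates the MAP line estimate under the
    independence assumption discussed in the abstract."""
    acc = np.zeros((n_theta, n_rho), dtype=int)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    for x, y in points:
        # For each candidate angle, compute the rho of the line through
        # this point and cast one vote into the corresponding bin.
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.round((rhos + rho_max) / (2.0 * rho_max)
                        * (n_rho - 1)).astype(int)
        valid = (bins >= 0) & (bins < n_rho)
        acc[np.arange(n_theta)[valid], bins[valid]] += 1
    return acc, thetas
```

Collinear points all vote into the same (theta, rho) cell, so the cell count equals the number of supporting evidences for that line hypothesis.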
Independent processing of multispectral positron emission tomography (MSPET) data in individual energy frames has the potential to improve system sensitivity and the accuracy of energy-dependent scatter correction. However, statistical fluctuations due to the use of multiple energy windows and low system detection efficiency severely undermine this potential. These limitations have been overcome without resolution loss by smoothing the data in the energy space to suppress statistical fluctuations and by normalizing detector efficiency in the spatial domain to minimize systematic errors. The effectiveness of these corrections was evaluated by comparing images acquired in different energy frames with and without energy-space smoothing. Smoothing improved the sharpness and contrast of the images and decreased their noise. The FWHM and FWTM evaluated from line-source images confirmed an earlier postulate that smoothing in the energy space has no effect on image resolution, since such a process does not move counts across lines of response (LORs). It was concluded that smoothing in the energy space, in conjunction with normalization in the projection space, is a prerequisite for subsequent energy-dependent data processing such as scatter correction.
Using thresholding techniques, it is possible to separate contiguous non-homogeneous patches with different power levels. When the power levels of the patches are similar, if not equal, the global histogram of the patches is unimodal and the thresholding approach becomes very difficult, if not impossible. In this paper, we propose a statistical procedure to separate contiguous non-homogeneous patches with similar power levels but different data statistics. The procedure separates the different regions by distinguishing between their data probability distributions, and is based on the Ozturk algorithm, which uses the sample order statistics to approximate univariate distributions.
Rank order filters have a wide variety of applications in image processing as an effective tool for removing noise from images without severely distorting abrupt changes in them. The k-th rank filter with an m × m window sets each pixel to the k-th smallest of the m² values in its m × m neighborhood. The median filter is the most commonly used special case of rank order filters. Rank order filtering requires intensive computation. In this paper, we consider the implementation of rank order filters on bit-serial mesh-connected computers such as the Lockheed Martin CISP computer. We design rank filtering algorithms using the threshold decomposition and radix splitting techniques, and present experimental results from implementing those algorithms on the CISP computer.
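A reference sketch of rank filtering via threshold decomposition (a naive serial version; the paper's bit-serial CISP algorithms are not reproduced here): the image is decomposed into binary slices, each slice is rank-filtered by a simple count, and the binary results are stacked. The key property is that the k-th smallest window value is at least t exactly when at least m² − k window values are at least t.

```python
import numpy as np

def rank_filter_td(img, k, m=3):
    """k-th rank filter (0-indexed: k=0 -> min, k=m*m//2 -> median,
    k=m*m-1 -> max) on a nonnegative integer image, computed by
    threshold decomposition."""
    r = m // 2
    h, w = img.shape
    out = np.zeros((h, w), dtype=int)
    for t in range(1, int(img.max()) + 1):
        # Binary slice at threshold t; edge padding keeps the window size.
        slice_pad = np.pad((img >= t).astype(int), r, mode="edge")
        for y in range(h):
            for x in range(w):
                count = slice_pad[y:y + m, x:x + m].sum()
                # The k-th smallest window value is >= t iff at least
                # m*m - k window values are >= t.
                out[y, x] += int(count >= m * m - k)
    return out
```

Because each binary slice needs only a neighborhood count and a comparison, this formulation maps naturally onto bit-serial hardware, which is the motivation cited in the abstract.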