ISBN:
(Print) 0819402974
The evolutionary theory of color perception is described. It is based on three main propositions: 1) the physical nature of light and the environmental distributions of its information-carrying parameters, 2) constraints imposed by the available biological material and physiological processes, and 3) the evolutionary tendency toward optimal usefulness for the survival of the species. The theory leads directly to the main properties of color perception: a) the Newtonian color circle, metamers, and additive and subtractive color mixtures; b) adaptive (relativistic) transformations and color constancy (invariance); c) an operational procedure for color measurements. Among the major predictions are Lorentz-type formulas for color transformations and an explanation of two-color projections. Predictions based on color transformations will be illustrated with a simple demonstration kit.
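As a rough illustrative sketch (not the paper's formalism), additive color mixture can be modeled as weighted vector addition in a hypothetical three-channel color space; any two lights whose channel responses coincide are then metamers, since they mix identically:

```python
def mix_additive(c1, c2, w1=1.0, w2=1.0):
    """Additive mixture as weighted vector addition in a 3-channel
    color space (an illustrative model, not the paper's theory)."""
    return tuple(w1 * a + w2 * b for a, b in zip(c1, c2))

red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
print(mix_additive(red, green))  # → (1.0, 1.0, 0.0): additive yellow
```

Subtractive mixture would instead multiply channel transmittances; both behaviors follow from treating a color as a response vector.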
In this paper, we discuss advanced architectures for distributed sensor networks. This includes the development of efficient algorithms for data-combination, noise removal, and information abstraction. Specifically, t...
ISBN:
(Print) 0819402915
ABSTRACT
In this paper we develop two entropy-coded subband image coding schemes. The difference between these schemes is the procedure used for encoding the lowest-frequency subband: predictive coding is used in one system and transform coding in the other. The other subbands are encoded using zero-memory quantization. After a careful study of subband statistics, the quantization parameters, the corresponding Huffman codes, and the bit allocation among subbands are all optimized. It is shown that both schemes perform considerably better than the scheme developed by Woods and O'Neil [2]; roughly speaking, the new schemes achieve the same quality as that of [2] at half the encoding rate. To make a complete comparison against the results in [2], we have also studied the performance of the two schemes developed here, as well as that of [2], in the presence of channel noise. After developing a codeword packetization scheme, we demonstrate that the scheme in [2] exhibits significantly higher robustness against transmission noise.
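As a hedged sketch of the zero-memory quantization and entropy coding described above (illustrative only; the paper's optimized quantizer parameters, Huffman tables, and bit allocation are not reproduced here), a uniform scalar quantizer paired with a first-order entropy estimate might look like:

```python
import math
from collections import Counter

def quantize(subband, step):
    """Zero-memory (scalar) uniform quantization of subband samples."""
    return [round(x / step) for x in subband]

def entropy_bits(symbols):
    """First-order entropy in bits/sample: a lower bound on the rate
    an entropy (e.g. Huffman) code for these symbols can approach."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Coarser steps concentrate the symbol histogram and lower the rate,
# at the cost of larger quantization error.
band = [0.1, -0.3, 2.2, 0.0, -1.9, 0.2, 0.1, -0.2]
for step in (0.5, 1.0, 2.0):
    print(step, round(entropy_bits(quantize(band, step)), 2))
```

Optimizing the step sizes per subband against a total-rate constraint is the bit-allocation problem the paper addresses.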
ISBN:
(Print) 0819402753
A method for comparing reconstruction algorithms is presented, based on the ability to perform certain detection tasks on the resulting images. The reconstruction algorithms compared are the algebraic reconstruction technique (ART) and the maximum entropy reconstruction method (MaxEnt). Task performance is assessed through a Monte Carlo simulation of the complete imaging process, including the generation of a set of object scenes, followed by data-taking, reconstruction, and performance of the specified task by a machine observer. For these detection tasks the figure of merit used for comparison is the detectability index, d'. When each algorithm is run with approximately optimized parameters, these studies find comparable values of d'.
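The detectability index d' is conventionally the separation between the observer's test-statistic distributions for target-present and target-absent images, normalized by their pooled standard deviation. A minimal sketch (function name and score values are illustrative, not from the paper):

```python
import math

def detectability_index(signal_scores, noise_scores):
    """d' from two sets of machine-observer test statistics:
    images with the target present vs. target absent."""
    n1, n0 = len(signal_scores), len(noise_scores)
    m1 = sum(signal_scores) / n1
    m0 = sum(noise_scores) / n0
    v1 = sum((x - m1) ** 2 for x in signal_scores) / (n1 - 1)
    v0 = sum((x - m0) ** 2 for x in noise_scores) / (n0 - 1)
    # d' = difference of means over the pooled standard deviation
    return (m1 - m0) / math.sqrt(0.5 * (v1 + v0))

# Well-separated score distributions give a large d'
print(detectability_index([5.0, 6.0, 7.0], [1.0, 2.0, 3.0]))  # → 4.0
```

In the Monte Carlo setup described above, the two score sets would come from running the machine observer on reconstructions of scenes with and without the target.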
ISBN:
(Print) 081940294X
We describe the design of an image-recognition system and its performance on multi-sensor imagery. The system satisfies a list of natural requirements, which includes locality of inferences (for efficient VLSI implementation), incorporation of prior knowledge, multi-level hierarchies, and iterative improvement. Two of the most important new features are a uniform parallel architecture for low-, mid-, and high-level vision, and achievement of recognition through the short-time, as opposed to long-time, behavior of a dynamical system. Robustness depends on collective effects rather than on high precision of the processing elements. The resulting network displays a balance of high speed and small size. We also indicate how this architecture is related to the Dempster-Shafer calculus for combining evidence from multiple sources, and present novel methods of learning in such networks, including one that addresses the integration of model-based and data-driven approaches.
ISBN:
(Print) 0819403458
We describe the design of a target recognition system. The distinctive feature of this system is the integration of model-based and data-driven approaches to target recognition. This necessitates achieving recognition through the short-time, as opposed to long-time, behavior of a dynamical system. The system also satisfies a list of natural requirements, which includes locality of inferences (for efficient VLSI implementation), incorporation of prior knowledge, multi-level hierarchies, and iterative improvement. The architecture is uniformly parallel for low- and mid- as well as high-level vision. Robustness depends on collective effects rather than on high precision of the processing elements.
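The abstract does not specify the network, but "recognition through short-time behavior of a dynamical system" can be illustrated with a hypothetical Hopfield-style discrete update that is read out after only a few steps rather than run to a fixed point (the weights and patterns below are invented for illustration):

```python
def recognize_short_time(state, weights, steps=3):
    """Read recognition off the short-time behavior of a simple
    discrete dynamical system (a hypothetical Hopfield-style
    threshold update), without waiting for full convergence."""
    for _ in range(steps):
        state = [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
                 for row in weights]
    return state

# Weights storing the pattern [1, -1, 1] (outer-product rule,
# zero diagonal); a corrupted input settles onto it within a step
W = [[0, -1, 1],
     [-1, 0, -1],
     [1, -1, 0]]
print(recognize_short_time([1, 1, 1], W))  # → [1, -1, 1]
```

The point of the short-time readout is speed: collective dynamics push the state near a stored pattern quickly, and waiting for exact convergence buys little.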
ISBN:
(Print) 0819402915
Digital Pulsed Laser Velocimetry (DPLV) is a novel full-field, two-dimensional, noninvasive, quantitative flow-visualization technique. The technique described here includes the use of direct digitization of the images for flow analysis using a high-resolution imaging system. The image data are stored for further analysis by a series of new image processing and analysis software developed for flow experiments.

The image processing and analysis software developed includes a compression program that reduces the storage requirements of the image data to 10%. An image finding, smoothing, and defining program has also been developed; it groups pixels that can logically be defined as one image, smooths that image, and calculates important parameters for the image. Analysis time has been greatly reduced, and the software now runs on a PC/AT compatible.

In this technique, ten consecutive frames of data, separated by a time increment of 150 ms, are recorded via a high-resolution camera (1024 x 1024). Each of these ten frames contains the images of the particles at one instant of time. A third computer program matches the images from each of the frames into tracks of the particles through time, using a statistical technique to determine the most likely path of the fluid seeds.

The most important capability of pulsed laser velocimetry with these image processing techniques is its ability to capture simultaneous and quantitative, rather than qualitative, information.
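The abstract leaves the statistical matching technique unspecified; as a simplified stand-in, frame-to-frame particle matching is often sketched as nearest-neighbor assignment of centroids under a maximum-displacement constraint (all names and coordinates below are illustrative):

```python
import math

def match_frames(frame_a, frame_b, max_disp):
    """Greedy nearest-neighbor matching of particle centroids between
    two consecutive frames; a hypothetical stand-in for the paper's
    statistical path-matching program."""
    matches = []
    used = set()
    for i, p in enumerate(frame_a):
        best, best_d = None, max_disp
        for j, q in enumerate(frame_b):
            if j in used:
                continue
            d = math.dist(p, q)  # centroid displacement in pixels
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)       # each particle matched at most once
            matches.append((i, best))
    return matches

# Two frames 150 ms apart; particles drift roughly +0.5 px in x
f0 = [(10.0, 10.0), (40.0, 12.0)]
f1 = [(10.5, 10.1), (40.6, 11.9)]
print(match_frames(f0, f1, max_disp=2.0))  # → [(0, 0), (1, 1)]
```

Chaining such matches across the ten frames yields the particle tracks, from which displacement over the known 150 ms interval gives velocity.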
ISBN:
(Print) 0819403067
Most databases for spherically distributed data are not structured in a manner consistent with their geometry. As a result, such databases possess undesirable artifacts, including the introduction of "tears" in the data when they are mapped onto a flat file system. Furthermore, it is difficult to make queries about the topological relationships among the data components without performing real arithmetic. Therefore, a new representation for spherical data is introduced, called the sphere quadtree, which is based on the recursive subdivision of spherical triangles obtained by projecting the faces of an icosahedron onto a sphere. Sphere quadtrees allow the representation of data at multiple levels and arbitrary resolution. For actual data, such a hierarchical data structure provides the ability to correlate geographic data by providing a consistent reference among data sets of different resolutions, or data that are not geographically registered. Furthermore, efficient search strategies can easily be implemented for the selection of data to be rendered or analyzed by a specific technique. In addition, sphere quadtrees offer significant potential for improving the accuracy and efficiency of spherical surface rendering algorithms, as well as for spatial data management and geographic information systems.
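The recursive subdivision underlying the sphere quadtree can be sketched as follows: each spherical triangle is split into four children by joining the (renormalized) midpoints of its edges, so level k of the tree holds 4^k cells per root face. The sketch below uses an octahedron face for brevity rather than the icosahedron face the paper specifies:

```python
import math

def normalize(p):
    """Project a point onto the unit sphere."""
    n = math.sqrt(sum(c * c for c in p))
    return tuple(c / n for c in p)

def midpoint(a, b):
    """Spherical midpoint of two unit vectors."""
    return normalize(tuple((x + y) / 2 for x, y in zip(a, b)))

def subdivide(tri):
    """Split one spherical triangle into four children:
    a central triangle plus three corner triangles."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(ab, bc, ca),                       # central child
            (a, ab, ca), (b, bc, ab), (c, ca, bc)]

def cells_at_level(tri, level):
    cells = [tri]
    for _ in range(level):
        cells = [child for t in cells for child in subdivide(t)]
    return cells

# One octahedron face refined two levels: 4**2 = 16 cells
face = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
print(len(cells_at_level(face, 2)))  # → 16
```

Labeling each child 0-3 gives every cell a base-4 address, so topological queries (containment, adjacency within a face) reduce to integer operations on addresses rather than real arithmetic.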
Technology to implement very high-speed digital signal processing has experienced rapid growth during the last decade. Techniques that use this technology to reduce the data channel width an imaging reconnaissance system requires for timely transmission of data from sensor subsystems to remote exploitation subsystems are being developed at an increasing rate. The incorporation of these techniques into imaging reconnaissance systems promises payoffs in performance and interoperability with other collection systems, within the restrictions imposed by existing and near-term projected data channels. This paper presents a summary overview of techniques available to designers using current technology. An analysis performed on a hypothetical system is used to illustrate the sensitivities of various algorithms to the characteristics of, and errors in, individual system components, together with an analysis of the impact of the techniques considered on system performance. The techniques treated are restricted to lossy algorithms that preserve the essential imaging nature of the system to the end user; i.e., the system output is in image form, not alphanumeric. Performance is defined as the ability of the system to collect and reconstruct scene image information. DPCM, ADPCM, and DCT compression techniques are considered in the detailed treatment.
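Of the techniques named above, DPCM is the simplest to sketch: each sample is predicted from the previous reconstructed sample and only the quantized prediction error is transmitted. A minimal first-order sketch (step size and signal are illustrative; real systems add entropy coding and 2-D predictors):

```python
def dpcm_encode(samples, quant_step):
    """First-order DPCM: quantize the difference between each sample
    and the prediction (the previous reconstructed sample)."""
    pred = 0
    codes = []
    for s in samples:
        q = round((s - pred) / quant_step)  # quantized prediction error
        codes.append(q)
        pred = pred + q * quant_step        # track the decoder's state
    return codes

def dpcm_decode(codes, quant_step):
    pred = 0
    out = []
    for q in codes:
        pred = pred + q * quant_step
        out.append(pred)
    return out

signal = [10, 12, 15, 15, 14, 20]
codes = dpcm_encode(signal, 2)
print(codes)                    # small prediction-error codes
print(dpcm_decode(codes, 2))    # reconstruction within one quant step
```

ADPCM adapts `quant_step` to the local signal statistics; DCT coding instead transforms blocks and quantizes the transform coefficients.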
ISBN:
(Print) 0819403059
The proceedings incorporate 22 papers grouped into four sessions, dealing with: image workstations; office/document processing; image analysis systems; and desktop system integration for the graphic arts. Topics considered include: algorithms, video-to-printing image resolution, page description languages, optical imaging memory, color images, character generation, contour representation, color spaces, dataflow architecture, handwritten numeral verification, character recognition, fuzzy sets, gradient methods, image segmentation, document images, and geographic information systems.