ISBN: 0769519644 (print)
The proceedings contain 31 papers. The topics discussed include: entity level data integration by statistical methods; approximate string joins; the virtual data grid: a new model and architecture for data-intensive collaboration; disclosure risk measures for microdata; indexing and incremental updating condensed data cube; on solving the view selection problem in distributed data warehouse architectures; space constrained selection problems for data warehouses and pervasive computing; using bitmap index for interactive exploration of large datasets; stream window join: tracking moving objects in sensor-network databases; supporting sliding window queries for continuous data streams; a visual framework invites human into the clustering process; a strategy selection framework for adaptive prefetching in data visualization; and a quad-tree based multiresolution approach for two-dimensional summary data.
ISBN: 0780381203 (print)
Visualization tools play a key role in the exploration of outer space. Since it is difficult and expensive to send humans to other planets, immersive visualization of such hostile environments is as close as we will get for some time. Visualization is also used in a variety of supporting roles for deep space missions, from simulation and rehearsal of planned operations to analysis of spacecraft state to analysis of science data returned from a variety of instruments. The panelists will discuss their experiences in collecting data in deep space, transmitting it to Earth, processing and visualizing it here, and using the visualization to drive the continued mission. This closes the loop, making missions more responsive to their environment, particularly in-situ operations on planetary surfaces and within planetary atmospheres.
ISBN: 0819448303 (print)
Dynamic SPECT (dSPECT) is a novel technique in nuclear medicine imaging. Finding coherent structures within the dataset is the most important part of analysing dSPECT data. Usually the observer focuses on a certain structure or organ, which is to be identified and outlined. We use a user-guided method in which a starting point is interactively selected and also used to identify the object or structure. To find the starting point for segmentation, we search for the voxel with the maximum intensity in the dataset along the eye beam. Once the data has been segmented by region growing, we render both the segmentation result and the original data in one view. The segmentation result is displayed as a wire mesh that fades over the volume-rendered original data. We use this hybrid rendering method to enable the user to validate the correctness of the segmentation process, so the two objects can be compared in a single rendition.
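A minimal sketch of the seed selection and region-growing steps described above, assuming a NumPy volume, a precomputed list of voxels along the eye beam, a simple intensity tolerance, and 6-connectivity; this is an illustration under those assumptions, not the authors' implementation.

```python
import numpy as np
from collections import deque

def pick_seed(volume, ray_voxels):
    """Return the voxel on the eye-beam ray with maximum intensity (assumed seed rule)."""
    return max(ray_voxels, key=lambda v: volume[v])

def region_grow(volume, seed, tolerance=0.2):
    """Grow a region of voxels whose intensity stays within `tolerance` of the
    seed intensity, using 6-connected neighbours (an assumed criterion)."""
    seed_val = volume[seed]
    visited = np.zeros(volume.shape, dtype=bool)
    visited[seed] = True
    region, queue = [], deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        v = queue.popleft()
        region.append(v)
        for dz, dy, dx in offsets:
            n = (v[0] + dz, v[1] + dy, v[2] + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) and not visited[n]:
                if abs(volume[n] - seed_val) <= tolerance * seed_val:
                    visited[n] = True
                    queue.append(n)
    return region

# Hypothetical usage: volume = np.load("dspect_frame.npy"); ray = [(z, y, x), ...]
```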
ISBN: 0780377737 (print)
In the age of information and communication technology (ICT), the requirements of multimedia image transmission are essential. Image compression (source coding) has provided significant improvements in communication bandwidth, transmission cost, and data rate. The objective of this paper is to develop new techniques for strengthening the compression of still images. The features of the proposed algorithm include: (i) a wavelet transform engine, which provides better image compression rate and visual quality compared to DCT-based systems (JPEG); (ii) a Zero-path structure that exploits the cross-scale self-similarity of wavelet coefficients and thus provides Zero-path compression; (iii) the absence of side information (significance map) for coding and decoding of the Zero-tree structure, which reduces the data bits spent on significance-map symbol coding; and (iv) Adaptive Arithmetic Entropy Coding, which provides further compression by generating variable-length code-words. Experimental results show that the proposed algorithm outperforms both the JPEG and EZW algorithms in terms of Peak Signal-to-Noise Ratio (PSNR), compression bit rate (bpp), and human visual perception.
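As a rough illustration of the wavelet-and-thresholding idea only (not the paper's Zero-path coder), the sketch below applies a one-level Haar transform, zeroes small coefficients, and reports PSNR; the threshold value and the random test image are assumptions.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform (rows then columns); image sides must be even."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0          # row averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0          # row details
    rows = np.hstack([a, d])
    a2 = (rows[0::2, :] + rows[1::2, :]) / 2.0       # column averages
    d2 = (rows[0::2, :] - rows[1::2, :]) / 2.0       # column details
    return np.vstack([a2, d2])

def ihaar2d(coef):
    """Inverse of haar2d (undo columns, then rows)."""
    h, w = coef.shape
    a2, d2 = coef[:h // 2, :], coef[h // 2:, :]
    rows = np.empty((h, w))
    rows[0::2, :] = a2 + d2
    rows[1::2, :] = a2 - d2
    a, d = rows[:, :w // 2], rows[:, w // 2:]
    img = np.empty((h, w))
    img[:, 0::2] = a + d
    img[:, 1::2] = a - d
    return img

def psnr(ref, rec, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB."""
    mse = np.mean((ref.astype(float) - rec.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

img = np.random.randint(0, 256, (256, 256)).astype(float)   # placeholder test image
coef = haar2d(img)
coef[np.abs(coef) < 8.0] = 0.0                               # assumed hard threshold
print("PSNR:", psnr(img, ihaar2d(coef)))
```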
Modern computer applications, from business decision support to scientific dataanalysis, utilize datavisualization tools to support exploratory activities. visualexploration tools typically do not scale well when applied to huge data sets, partially because being interactive necessitates real-time responses. However, we observe that interactive visualexplorations exhibit several properties that can be exploited for data access optimization, including locality of exploration, contiguous queries, and significant delays between user operations. We thus apply semantic caching of active query sets on the client side to exploit some of the above characteristics. We also introduce several prefetching strategies, each exploiting characteristics of our visualexploration environment. We have incorporated caching and prefetching strategies into XmdvTool, a public-domain tool for visualexploration of multivariate data sets. Experimental studies using synthetic as well as real user traces are conducted. Our results demonstrate that these proposed optimization techniques achieve significant performance improvements in our exploratory analysis system.
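A hedged sketch of the client-side idea described above (not XmdvTool's actual API): an LRU cache of range-query results plus a simple neighbour-prefetching step run during user think time. The key layout, capacity, and prefetch rule are assumptions.

```python
from collections import OrderedDict

class SemanticCache:
    """LRU cache of query results keyed by (dimension, lo, hi) ranges,
    a simplified stand-in for caching 'active query sets' on the client."""
    def __init__(self, capacity=128, fetch=None):
        self.capacity = capacity
        self.fetch = fetch                       # callback that actually hits the server
        self.store = OrderedDict()

    def get(self, key):
        if key in self.store:
            self.store.move_to_end(key)          # cache hit: mark as recently used
            return self.store[key]
        result = self.fetch(key)                 # cache miss: issue the real query
        self._put(key, result)
        return result

    def _put(self, key, result):
        self.store[key] = result
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict least recently used

    def prefetch_neighbors(self, key, step):
        """Exploit locality of exploration: warm the cache with ranges shifted
        left and right of the last query (an assumed prefetching strategy)."""
        dim, lo, hi = key
        for nkey in ((dim, lo - step, hi - step), (dim, lo + step, hi + step)):
            if nkey not in self.store:
                self._put(nkey, self.fetch(nkey))

# Hypothetical usage; `fetch` would run the real range query against the server.
cache = SemanticCache(capacity=64, fetch=lambda key: f"rows for {key}")
rows = cache.get(("income", 20000, 40000))
cache.prefetch_neighbors(("income", 20000, 40000), step=5000)
```

In practice the prefetch call would run in the background during the delays between user operations noted in the abstract.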
ISBN: 0819451258 (print)
A novel approach is proposed for creating a standardized and comprehensive database for gait analysis. The field of gait analysis is gaining increasing attention for applications such as visual surveillance, human-computer interfaces, and gait recognition and rehabilitation. Numerous algorithms have been developed for analyzing and processing gait data; however, a standard database for their systematic evaluation does not exist. Instead, existing gait databases consist of subsets of kinematic, kinetic, and electromyographic activity recordings by different investigators, at separate laboratories, and under varying conditions. Thus, the existing databases are neither homogeneous nor sufficiently populated to statistically validate the algorithms. In this paper, a methodology for creating a database is presented, which can be used as a common ground to test the performance of algorithms that rely upon external marker data, ground reaction loading data, and/or video images. The database consists of: (i) synchronized motion-capture data (3D marker data) obtained using external markers, (ii) computed joint angles, and (iii) ground reaction loading acquired with plantar pressure insoles. This database could easily be expanded to include synchronized video, which will facilitate further development of video-based algorithms for motion tracking. This eventually could lead to the realization of markerless gait tracking. Such a system would have extensive applications in gait recognition, as well as gait rehabilitation. The entire database (marker, angle, and force data) will be placed in the public domain and made available for download over the World Wide Web.
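For illustration only, a joint angle can be derived from three external markers as the angle between the proximal and distal segment vectors; the marker coordinates below are hypothetical and this sketch is not the database's processing pipeline.

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the proximal and distal segments,
    computed from 3D marker positions."""
    u = np.asarray(proximal, float) - np.asarray(joint, float)
    v = np.asarray(distal, float) - np.asarray(joint, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical hip, knee, and ankle marker coordinates (metres)
print(joint_angle([0.0, 0.9, 0.0], [0.05, 0.5, 0.0], [0.0, 0.1, 0.0]))
```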
ISBN: 9781624100970 (print)
This paper summarizes the arc jet test results of the Mars exploration Rover (MER) Silicone Impregnated Reusable Ceramic Ablator (SIRCA) Transverse Impulse Rocket System (TIRS) Cover test series in the Panel Test Facility (PTF) at NASA Ames Research Center (ARC). NASA ARC performed aerothermal environment analyses, Thermal Protection System (TPS) sizing and thermal response analyses, and arc jet testing to evaluate the MER SIRCA TIRS Cover design and interface to the aeroshell structure. The primary objective of this arc jet test series was to evaluate specific design details of the SIRCA TIRS Cover interface to the MER aeroshell under simulated atmospheric entry heating conditions. Four test articles were tested in an arc jet environment with three different seal configurations. The test condition was designed to match the predicted peak flight heat load at the gap region between the SIRCA and the backshell TPS material, SLA-561S, and resulted in an over-test (with respect to heat flux and heat load) for the apex region of the SIRCA TIRS Cover. The resulting pressure differential was as much as twenty times that predicted for the flight case, depending on the location. There was no post-test visual evidence of over-heating or damage to any interfaces to the backshell structure. Repeatable thermocouple data were obtained and compared to SIRCA thermal response analyses. The one-dimensional thermal response prediction compared well with the thermocouple data for the location at the backshell TPS interface. For the apex region of the SIRCA TIRS Cover, a one-dimensional thermal response analysis resulted in an over-prediction, as there were strong multi-dimensional conduction effects due to the TIRS Cover geometry. In general, the test results provide strong experimental evidence that supports the adequacy of the baseline seal design.
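As a loosely related illustration (not the SIRCA material-response model used in the paper, which includes ablation and pyrolysis), a one-dimensional in-depth thermal response can be approximated with an explicit finite-difference solution of the heat equation; the material properties, grid, and boundary conditions below are assumptions.

```python
import numpy as np

def conduct_1d(temps, alpha, dx, dt, steps, surface_temp):
    """March an explicit finite-difference solution of 1D heat conduction
    (dT/dt = alpha * d2T/dx2) with a fixed heated-surface temperature.
    Stability requires alpha*dt/dx**2 <= 0.5."""
    T = np.array(temps, float)
    r = alpha * dt / dx ** 2
    for _ in range(steps):
        T[0] = surface_temp                         # heated front face (assumed BC)
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]                               # adiabatic back face (assumed BC)
    return T

# Assumed diffusivity, grid spacing, and heating duration; returns the in-depth profile.
print(conduct_1d(np.full(50, 300.0), alpha=4e-7, dx=5e-4, dt=0.1,
                 steps=600, surface_temp=1500.0))
```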
Information visualization exploits the phenomenal abilities of human perception to identify structures by presenting abstract data visually, allowing an intuitive exploration of data to get insight, to draw conclusions and to interact directly with the data. The specification, analysis and evaluation of complex models and simulated model data can benefit from information visualization techniques by obtaining visual support for different tasks. This paper presents an approach that combines modelling and visualization functionality to support the modelling process. Based on this general approach, we have developed and implemented a framework that allows us to combine a variety of models with statistical and analytical operators as well as with visualization methods. We present several examples in the context of climate modelling.
The Internet pervades many aspects of our lives and is becoming indispensable to critical functions in areas such as commerce, government, production and general information dissemination. To maintain the stability and efficiency of the Internet, every effort must be made to protect it against various forms of attacks, malicious users, and errors. A key component in the Internet security effort is the routine examination of Internet routing data, which unfortunately can be too large and complicated to browse directly. We have developed an interactive visualization process which proves to be very effective for the analysis of Internet routing data. In this application paper, we show how each step in the visualization process helps direct the analysis and glean insights from the data. These insights include the discovery of patterns, detection of faults and abnormal events, understanding of event correlations, formation of causation hypotheses, and classification of anomalies. We also discuss lessons learned in our visual analysis study.
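A hedged sketch of one simple form of the anomaly detection mentioned above: flag time bins whose routing-update counts exceed a trailing baseline by several standard deviations. The bin counts and the 3-sigma rule are assumptions, not the paper's visual analysis method.

```python
import numpy as np

def flag_anomalous_bins(update_counts, window=24, k=3.0):
    """Flag bins whose routing-update count exceeds the trailing mean
    by more than k standard deviations (assumed threshold rule)."""
    counts = np.asarray(update_counts, float)
    flags = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        mu, sigma = base.mean(), base.std()
        if sigma > 0 and counts[i] > mu + k * sigma:
            flags.append(i)
    return flags

# Hypothetical hourly counts of BGP update messages for one peer
counts = [120, 135, 110, 128] * 12 + [900]   # a sudden burst in the last bin
print(flag_anomalous_bins(counts))
```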