ISBN (print): 0819444057
Advanced medical imaging technologies have enabled biologists and other researchers in biomedicine, biochemistry, and bioinformatics to gain better insight into complex, large-scale data sets. These data sets, which occupy large amounts of space, can no longer be archived on local hard drives. The San Diego Supercomputer Center (SDSC) maintains a large data repository, called the High Performance Storage System (HPSS), where large-scale biomedical data sets can be stored. These data sets must be transmitted over an open or closed network (Internet or intranet) within a reasonable amount of time to make them accessible in an interactive fashion to researchers all over the world. Our approach extracts and transforms these data sets using Haar wavelets and transmits them over a low- to medium-bandwidth network. The compressed data sets are then reconstructed into a 3-D volume on the client side and rendered using texture mapping in Java3D. The data sets are handled using the Scalable Visualization Toolkits developed within the NPACI (National Partnership for Advanced Computational Infrastructure) framework. Sub-volumes of the data sets are extracted to provide a detailed view of a particular region of interest (ROI). The application has also been ported to a C++ platform to obtain higher rendering speed and better performance, but that version lacks platform independence.
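The abstract does not detail the wavelet stage, but the core idea, a Haar decomposition of the volume followed by thresholding of small detail coefficients before transmission, can be sketched briefly. The following Python/NumPy snippet is a minimal illustration under assumed function names and an assumed 5% threshold, not the paper's implementation:

```python
import numpy as np

def haar_1d(a):
    """One Haar level along axis 0: pairwise averages and details."""
    return (a[0::2] + a[1::2]) / 2.0, (a[0::2] - a[1::2]) / 2.0

def compress_volume(volume, threshold=0.05):
    """Apply one Haar level along each axis of a 3-D volume, then zero
    small detail coefficients (lossy compression before transmission)."""
    coeffs = volume.astype(np.float64)
    for axis in range(3):
        moved = np.moveaxis(coeffs, axis, 0)
        avg, det = haar_1d(moved)
        coeffs = np.moveaxis(np.concatenate([avg, det], axis=0), 0, axis)
    coeffs[np.abs(coeffs) < threshold * np.abs(coeffs).max()] = 0.0
    return coeffs    # sparse coefficients, ready for encoding and transfer

vol = np.random.rand(64, 64, 64)            # stand-in for a biomedical volume
c = compress_volume(vol)
print(f"nonzero coefficients: {np.count_nonzero(c)} / {c.size}")
```

On the client side, undoing each Haar level reconstructs the 3-D volume for texture-mapped rendering.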
ISBN (print): 0819444057
Time-varying simulations are common in many scientific domains for studying the evolution of phenomena or features. The data produced in these simulations is massive: instead of a single dataset of 512³ or 1024³ (for regularly gridded simulations), there can now be hundreds to thousands of timesteps. For datasets with evolving features, feature analysis and visualization tools are crucial to help interpret all the information. For example, it is usually important to know how many regions are evolving, what their lifetimes are, whether they merge with others, how the volume/mass changes, etc. Therefore, feature-based approaches, such as feature tracking and feature quantification, are needed to follow identified regions over time. In our previous work, we developed a methodology for analyzing time-varying datasets that tracks 3D amorphous features as they evolve in time. However, that implementation is for single-processor, non-adaptive grids, and for massive multiresolution datasets the approach needs to be distributed and enhanced. In this paper, we describe extensions to our feature extraction and tracking methodology for distributed AMR simulations. Two different paradigms are described, a "fully distributed" and a "partial-merge" strategy. The benefits and implementations of both are discussed.
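The paper's distributed AMR machinery is not reproduced here, but the underlying single-grid idea (extract connected regions per timestep, then match them across timesteps by spatial overlap) can be sketched as follows; the isovalue, synthetic field, and function names are hypothetical:

```python
import numpy as np
from scipy import ndimage

def extract_features(volume, iso):
    """Threshold the scalar field and label connected regions (features)."""
    labels, count = ndimage.label(volume > iso)
    return labels, count

def track_features(labels_t0, labels_t1):
    """Match features across two timesteps by voxel overlap.
    Returns {feature id at t0: best-overlapping id at t1, or None}."""
    matches = {}
    for fid in range(1, labels_t0.max() + 1):
        overlap = labels_t1[labels_t0 == fid]
        overlap = overlap[overlap > 0]          # ignore background voxels
        matches[fid] = int(np.bincount(overlap).argmax()) if overlap.size else None
    return matches

# Two synthetic timesteps: a smooth random field that drifts two cells
t0 = ndimage.gaussian_filter(np.random.rand(32, 32, 32), sigma=3)
t1 = np.roll(t0, 2, axis=0)
l0, _ = extract_features(t0, iso=t0.mean())
l1, _ = extract_features(t1, iso=t1.mean())
print(track_features(l0, l1))   # continuations; None marks dissipated features
```

Lifetimes, merges, and volume changes follow from accumulating these per-step matches over the whole time series.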
ISBN (print): 1581134541
Breakdown analysis involves decomposing data into sub-groups to allow for comparison and identification of problem areas. Good analysis requires the ability to group data based on attributes or values. Breakdown visualization provides a mechanism to support this analysis through user-guided decomposition and exploration of tabular data with a polyarchy structure. This is useful in domains such as sports statistics and corporate financial reports. Breakdown visualization utilizes a spreadsheet format for comparison of adjacent visualizations.
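As a rough illustration of breakdown analysis on tabular data, the sketch below decomposes a hypothetical sports-statistics table along one attribute path and aggregates a measure at each level. A true polyarchy supports several intersecting hierarchies; a single groupby path like this only approximates one route through it:

```python
import pandas as pd

# Hypothetical sports-statistics table
games = pd.DataFrame({
    "team":   ["A", "A", "B", "B", "B"],
    "player": ["p1", "p2", "p3", "p4", "p5"],
    "points": [12, 7, 20, 3, 9],
})

def breakdown(df, attributes, measure):
    """Decompose the table along a user-chosen attribute path,
    aggregating the measure at each level for side-by-side comparison."""
    for depth in range(1, len(attributes) + 1):
        path = attributes[:depth]
        print(f"-- breakdown by {path} --")
        print(df.groupby(path)[measure].sum().to_string(), "\n")

breakdown(games, ["team", "player"], "points")
```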
ISBN (print): 0769516564
This paper introduces an immersive virtual reality application that allows users to browse and explore the contents of database systems. We have implemented a visualization metaphor based upon the intrinsic characteristics of particles, coined 'infoticles', which are used as representations of data objects. Users are able to interact with the dynamic, three-dimensional visualization by manipulating forces and surfaces. These tools, representing user interests and data filters respectively, influence the collection of infoticles according to the rules of Newtonian mechanics. Informational values are expressed through the presence of both dynamic and static characteristics such as motion, directionality, and form. We demonstrate these principles through a prototype that uses our university's financial budget data.
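The paper does not publish its force model, but as a plausible sketch, the snippet below advances a set of "infoticles" under a single user-placed attractor whose pull scales with each particle's data value, integrated with simple damped Newtonian dynamics. All parameter values and names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(100, 3))   # 100 infoticles in 3-D space
vel = np.zeros((100, 3))
attr = rng.uniform(0.0, 1.0, size=100)        # one data value per particle

def step(pos, vel, attr, attractor, interest, dt=0.02, damping=0.98):
    """One Newtonian timestep: a user-placed attractor pulls each particle
    with a force scaled by how strongly its data value matches the user's
    interest; damping lets the system settle into readable clusters."""
    offset = attractor - pos
    dist = np.linalg.norm(offset, axis=1, keepdims=True) + 1e-6
    force = (offset / dist) * (attr * interest)[:, None]
    vel = (vel + force * dt) * damping        # unit mass, so a = F
    return pos + vel * dt, vel

attractor = np.zeros(3)                       # e.g. placed by the user in VR
for _ in range(200):
    pos, vel = step(pos, vel, attr, attractor, interest=1.0)
```

Data filters would act analogously as surfaces that deflect or absorb particles failing a predicate.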
Currently, the cDNA and genomic sequence projects are proceeding at such a rapid rate that more and more gene data become available. New methods are needed to efficiently and effectively analyze and visualize this dat...
ISBN (print): 081945642X
One of the challenges of visualization software design is providing real-time tools capable of concurrently displaying data that varies temporally and in scale from kilometers to micrometers, such as the data prevalent in planetary exploration and deep-sea marine research. The Viz software developed by NASA Ames, together with the X-Core extensions, addresses this problem by providing a flexible framework for rapidly developing visualization software capable of accessing and displaying large dynamic data sets. This paper describes the Viz/X-Core design and illustrates the operation of both systems over a number of deployments ranging from marine research to Martian exploration. Highlights include a 2002 integration with live ship operations and the Mars Exploration Rovers Spirit and Opportunity.
An automatic image segmentation method is used to improve processing and visualization of data obtained by electron microscopy. Exploiting affinity criteria between pixels, e.g., proximity and gray-level similarity, in conjunction with an eigenvector analysis, the image is subdivided into areas which correspond to objects or meaningful regions. Extending a proposal by Shi and Malik (1997, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 731-737), the approach was adapted to the field of electron microscopy, especially to three-dimensional applications as needed by electron tomography. Theory, implementation, parameter setting, and results obtained with a variety of data are presented and discussed. The method turns out to be a powerful tool for visualization, with the potential for further improvement by developing and tuning new affinity criteria. (C) 2002 Elsevier Science (USA). All rights reserved.
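The eigenvector analysis follows Shi and Malik's normalized-cut formulation. As a compact 2-D sketch (the paper extends the idea to 3-D tomograms), the snippet below builds an affinity matrix from spatial proximity and gray-level similarity and bipartitions the image with the Fiedler vector of the normalized Laplacian; the parameter values are illustrative assumptions:

```python
import numpy as np

def affinity_matrix(img, sigma_i=0.1, sigma_x=2.0, radius=3):
    """Affinity between pixel pairs that are spatially close (proximity)
    and similar in gray level, in the spirit of normalized cuts."""
    h, w = img.shape
    W = np.zeros((h * w, h * w))
    for y in range(h):
        for x in range(w):
            i = y * w + x
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        d2 = float(dy * dy + dx * dx)
                        g2 = float(img[y, x] - img[yy, xx]) ** 2
                        W[i, yy * w + xx] = np.exp(-g2 / sigma_i**2 - d2 / sigma_x**2)
    return W

def segment(img):
    """Bipartition the image with the Fiedler vector of the normalized
    Laplacian (the relaxed normalized-cut solution)."""
    W = affinity_matrix(img)
    d = W.sum(axis=1)
    inv_sqrt = 1.0 / np.sqrt(d)
    L_sym = np.eye(len(d)) - inv_sqrt[:, None] * W * inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L_sym)
    fiedler = inv_sqrt * vecs[:, 1]           # second-smallest eigenvector
    return (fiedler > 0).reshape(img.shape)   # its sign gives the two regions

# Toy image: a bright square on a dark background
img = np.zeros((24, 24)); img[6:18, 6:18] = 1.0
mask = segment(img)
```

For 3-D tomograms the neighborhood simply gains a z offset, at the cost of a much larger (and in practice sparse) affinity matrix.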
ISBN (print): 0819444057
This work attempts to establish an analogy and continuity between visualization and holography. The majority of methods for processing and analyzing data work in a virtual space with its virtual images and shapes. Scientific visualization is the extraction of significant information from the data space and its presentation as visual shapes, suited to "visual thinking" (R. Arnheim). Holography makes it possible to perceive these visual shapes as virtual objects in the form of real 3D copies, i.e., holograms. We propose to add a light field surrounding the virtual object to the description of its computer model. The object is defined as a set of points emitting coherent light. A scheme for directly calculating and printing the digital hologram of the virtual object is also considered. At the reconstruction stage, the result of applying the scheme to a virtual shape is indistinguishable from that obtained for a real object. This brings us to the concept of Real Virtuality.
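The direct-calculation scheme can be sketched numerically: treat the virtual object as point emitters of coherent light, sum their spherical waves on the hologram plane, and record the interference with a reference wave. The wavelength, sample pitch, and geometry below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

wavelength = 633e-9                  # HeNe red laser line, in metres
k = 2 * np.pi / wavelength           # wavenumber

# Hologram plane: N x N samples with pitch p, located at z = 0
N, p = 512, 8e-6
xs = (np.arange(N) - N / 2) * p
X, Y = np.meshgrid(xs, xs)

# Virtual object: point emitters (x, y, z, amplitude) behind the plane
points = [(0.0, 0.0, 0.05, 1.0), (3e-4, -2e-4, 0.06, 0.7)]

field = np.zeros((N, N), dtype=complex)
for ox, oy, oz, amp in points:
    r = np.sqrt((X - ox) ** 2 + (Y - oy) ** 2 + oz ** 2)
    field += amp * np.exp(1j * k * r) / r        # spherical wave per emitter

reference = np.exp(1j * k * X * np.sin(np.radians(1.0)))  # tilted plane wave
hologram = np.abs(field + reference) ** 2    # recorded intensity fringes
```

Printing this fringe pattern and re-illuminating it with the reference wave reconstructs the object wave, which is why the virtual and real cases become indistinguishable at reconstruction.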
ISBN (print): 0769517609
Trace visualization is a way to understand the behavior of an application on computer systems. In this paper we describe the design and implementation of a Gantt chart visualization tool for visualizing multi-dimensional trace files, particularly traces from Message Passing Interface (MPI) programs and Apache servers. An MPI tracing library is developed for MPI programs, and an Apache server plug-in is developed to generate web traces. Since the amount of trace data may be large, utilities are provided to convert and merge multiple event trace files into one scalable, self-defining interval file for visualization. The interval format facilitates the development of multiple time-space diagrams for visualization tools. For MPI traces, each record also includes an instruction address for source code association, which provides a way to display the source code and pinpoint the line that generates the event.
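The self-defining interval format itself is not documented in the abstract, but the basic conversion step it implies, pairing enter/exit events per process into intervals that a time-space (Gantt) diagram can draw directly, can be sketched as follows; the record fields and names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float   # seconds since trace start
    process: int       # e.g. MPI rank or Apache worker id
    kind: str          # "enter" or "exit"
    name: str          # routine or request being traced

def events_to_intervals(events):
    """Pair matching enter/exit events per process into intervals,
    the record type a Gantt-style time-space diagram draws directly."""
    open_calls = {}    # (process, name) -> stack of enter timestamps
    intervals = []
    for ev in sorted(events, key=lambda e: e.timestamp):
        key = (ev.process, ev.name)
        if ev.kind == "enter":
            open_calls.setdefault(key, []).append(ev.timestamp)
        elif open_calls.get(key):
            start = open_calls[key].pop()
            intervals.append((ev.process, ev.name, start, ev.timestamp))
    return intervals

trace = [Event(0.0, 0, "enter", "MPI_Send"), Event(0.4, 0, "exit", "MPI_Send"),
         Event(0.1, 1, "enter", "MPI_Recv"), Event(0.5, 1, "exit", "MPI_Recv")]
print(events_to_intervals(trace))
```

Storing intervals rather than raw events halves the record count and lets a viewer fetch only the time window being displayed.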
ISBN (print): 1581135254
Visual analysis of time-varying scientific data can be divided into four categories with an increasing degree of user interaction: 1) production of static images representing scientific data at selected times; 2) production of video sequences in which graphical representation, time line, and viewpoints are predefined; 3) interactive streaming of logged data sets, allowing the user to alter graphical representation, filtering, time lines, and viewpoints; 4) real-time interaction with the simulation or experiment that produces the data, allowing the user to alter parameters, graphical representation, filtering, time lines, and viewpoints.