ISBN (print): 1920682112
This paper describes the core theories and enabling technologies developed for molecular imaging at the BMIT Group and the CMSP Center over the last 10 years, in the areas of dynamic image data acquisition, compression, storage, management, modeling, simulation, analysis, processing, registration, and visualization.
Dealing with visualizations containing large data sets is a challenging issue: in the field of information visualization, almost every visual technique reveals its drawbacks when displaying a large number of items. To deal with this problem we introduce a formal environment, modeling in a virtual space the image features we are interested in (e.g., absolute and relative density, clusters, etc.), and we define metrics able to characterize image decay. Such metrics drive our automatic techniques (i.e., non-uniform sampling), rescuing the image features and making them visible to the user. In this work we focus on 2D scatter plots, devising a novel non-uniform data sampling strategy able to preserve relative densities in an effective way.
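The abstract does not spell out the sampling algorithm. As a hedged illustration of the general idea (subsampling dense regions more aggressively than sparse ones while keeping their relative density visible), a minimal NumPy sketch, with an assumed fixed grid and square-root weighting that are not from the paper, might look like this:

```python
import numpy as np

def density_preserving_sample(points, grid=32, budget=2000, seed=0):
    """Non-uniform sampling of a 2D scatter plot.

    Dense grid cells are subsampled more aggressively than sparse ones,
    but each cell keeps a share of the budget proportional to the square
    root of its count, so relative density differences stay visible.
    """
    rng = np.random.default_rng(seed)
    x, y = points[:, 0], points[:, 1]
    # Assign every point to a cell of a grid x grid lattice.
    xi = np.clip(((x - x.min()) / (np.ptp(x) + 1e-12) * grid).astype(int), 0, grid - 1)
    yi = np.clip(((y - y.min()) / (np.ptp(y) + 1e-12) * grid).astype(int), 0, grid - 1)
    cell = xi * grid + yi

    kept = []
    for c in np.unique(cell):
        idx = np.where(cell == c)[0]
        # Square-root weighting compresses, but does not erase, density ratios.
        kept.append((idx, np.sqrt(len(idx))))

    total_share = sum(share for _, share in kept)
    sample = []
    for idx, share in kept:
        k = min(len(idx), max(1, int(round(budget * share / total_share))))
        sample.append(rng.choice(idx, size=k, replace=False))
    return np.concatenate(sample)

# Example: a dense cluster plus a sparse background, reduced to ~2000 points.
pts = np.vstack([np.random.randn(90000, 2) * 0.2,
                 np.random.uniform(-3, 3, (10000, 2))])
print(density_preserving_sample(pts).shape)
```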
Functional magnetic resonance imaging (fMRI) is a complex imaging modality that provides high-resolution, non-invasive maps of neural activity in brain tissue. Neuroscientists use fMRI to probe brain function using complex cognitive and linguistic experiments. An important aspect of these experiments is the visualization of neural activations over a period of time, as manifested by voxel intensities of two- (or three-) dimensional images across the temporal analysis dimension. Scopira is a modular algorithm development framework that consists of a user-friendly visual layout environment with a comprehensive set of scientific algorithms for biomedical data analysis. However, it presently lacks a facility for volumetric display, which is especially important for mapping neural activations from functional (low-resolution) to anatomical (high-resolution) images. In this paper, we present a volumetric display and analysis system for fMRI data using Scopira and the OpenGL library. Results are presented to demonstrate the new software.
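Scopira itself is a C++ framework and the paper's renderer uses OpenGL, neither of which is reproduced here. As a language-neutral sketch of just the mapping step the abstract describes (overlaying a low-resolution activation map on a high-resolution anatomical volume), assuming aligned volumes, integer zoom factors, and a made-up threshold, one might write:

```python
import numpy as np

def overlay_activation(anatomical, functional, threshold=2.3):
    """Upsample a low-resolution fMRI statistic map to anatomical resolution
    and mark voxels whose value exceeds the activation threshold.

    anatomical : (X, Y, Z) high-resolution structural volume
    functional : (x, y, z) low-resolution statistic map (e.g. t or z values)
    """
    zoom = [a // f for a, f in zip(anatomical.shape, functional.shape)]
    # Nearest-neighbour upsampling by integer factors (assumes aligned volumes).
    upsampled = functional.repeat(zoom[0], 0).repeat(zoom[1], 1).repeat(zoom[2], 2)
    mask = upsampled > threshold          # active voxels at anatomical resolution
    overlay = anatomical.astype(float).copy()
    overlay[mask] = overlay.max()         # paint activations at maximum intensity
    return overlay, mask

# Toy volumes: 128^3 anatomy, 64^3 functional statistics.
anat = np.random.rand(128, 128, 128)
func = np.random.randn(64, 64, 64)
vol, active = overlay_activation(anat, func)
print(active.sum(), "voxels above threshold")
```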
ISBN (print): 0769522130
Considering the scalability problems of using formal concept analysis to locate features in source code, we present a set of straightforward alternative algorithms that achieve the same objectives. A preliminary experiment indicates that the alternative algorithms scale somewhat better to large amounts of data.
Telecommunications networks are moving towards wireless applications and service- and business-driven network management. Operators provide new services over different access technologies in a multi-technology environment. This shift in abstraction level towards services, and away from technology and network element management, requires new tools and methods to support operational tasks, among which service monitoring and traffic analysis are especially important. In this paper, the concept of a neural-network-based tool that helps operators assess end-user quality of experience (QoE) is introduced.
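The abstract does not specify the network architecture or the traffic features it uses. Purely as an illustration of the general idea (a small feed-forward network classifying per-session KPIs into QoE classes), a scikit-learn sketch with invented feature names and synthetic labels could look like this:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical per-session KPIs: throughput (kbit/s), RTT (ms), jitter (ms), loss (%).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.normal(800, 300, 1000),   # throughput
    rng.normal(120, 60, 1000),    # round-trip time
    rng.normal(15, 10, 1000),     # jitter
    rng.exponential(1.0, 1000),   # packet loss
])
# Toy labels: sessions with low throughput or high loss count as "poor QoE".
y = ((X[:, 0] < 500) | (X[:, 3] > 2.0)).astype(int)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0),
)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```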
Options are amongst the most heavily transacted financial instruments in the world. This paper examines how the visual exploratory methods espoused by Cleveland (1993) can be used to analyze the residuals from conventional option pricing models (Black and Scholes, 1972; Black, 1976). Until recently, these models were believed to be unbiased (Rubinstein, 1985; Lajbcygier, 1999). With the aid of visual exploratory tools we see that options on the All Ordinaries share price index trading on the Sydney Futures Exchange have persistent, systematic, and significant bias. This is the first time that various statistically oriented visual exploratory tools have been used to analyze option residuals. We find that the analysis motivates the use of alternative option pricing methods.
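For readers unfamiliar with the residuals being examined, the sketch below shows how a pricing residual is formed, using the Black (1976) formula for a call on a futures price with illustrative inputs; the paper's actual data and estimation procedure are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def black76_call(F, K, T, r, sigma):
    """Black (1976) price of a European call on a futures price F."""
    d1 = (np.log(F / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return np.exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))

# Illustrative inputs (not the paper's data): futures level, strikes, maturity.
F, T, r, sigma = 3200.0, 60 / 365, 0.05, 0.18
K = np.arange(2900, 3500, 100.0)
market = black76_call(F, K, T, r, sigma) + np.random.normal(0, 2.0, K.size)  # fake quotes

# Residual = observed market price minus model price, examined across moneyness.
residual = market - black76_call(F, K, T, r, sigma)
for m, e in zip(F / K, residual):
    print(f"moneyness {m:.3f}  residual {e:+.2f}")
```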
Expertise management systems are being widely adopted in organizations to manage tacit knowledge embedded in employees' heads. These systems have successfully applied many information technologies developed in fields such as information retrieval and document management to support expertise information collection, processing, and distribution. In this paper, we investigate the potential of applying visualization techniques to support exploration of an expertise space. We implemented two widely applied dimensionality-reduction visualization techniques, the self-organizing map and multidimensional scaling, to generate expert map and expertise field map visualizations from an expertise data set. Our proposed approach is generic for automatic mapping of the expertise space of an organization, research field, scientific domain, etc. Our initial analysis of the visualization results indicated that the expert map and expertise field map captured useful underlying structures of the expertise space and had the potential to support more efficient and effective expertise information searching and browsing.
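The paper's expertise data set is not available, so the sketch below only illustrates the multidimensional-scaling half of the approach on a toy expert-by-keyword matrix with invented names; scikit-learn's MDS stands in for whatever implementation the authors used.

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

# Toy expert-by-keyword matrix (rows: experts, columns: expertise keywords).
experts = ["A", "B", "C", "D", "E"]
profiles = np.array([
    [5, 0, 1, 0],   # mostly topic 1
    [4, 1, 0, 0],
    [0, 5, 4, 0],   # topics 2 and 3
    [0, 4, 5, 1],
    [1, 0, 0, 5],   # topic 4
], dtype=float)

# Cosine distances between expertise profiles, embedded in 2D as an "expert map".
D = pairwise_distances(profiles, metric="cosine")
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)

for name, (x, y) in zip(experts, coords):
    print(f"{name}: ({x:+.2f}, {y:+.2f})")
```

Experts with similar keyword profiles land close together in the 2D map, which is the property the expert map visualization relies on.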
This paper describes the integration of stability assessment tools in the EMS. The main functions are the transient security and voltage security tools. They are invoked under several modes of operation to address the current state of the system as well as the immediate short-term planning horizons. Data exchange, preparation, and output visualization are described in the context of distributed architectures and the latest user interface technology. Operators will be able to analyze transactions under various conditions and contingencies that could potentially lead to insecure behavior. The dynamic tools are triggered as part of the on-line network and study network sequences. The architecture and system configuration are key to ensuring fast, robust, and accurate dynamic solutions. Several CPUs are used to reduce the execution time of day-ahead and week-ahead analysis. The user-friendly data exchange setup prevents conflicts between several instances of DSA users and models.
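The paper describes a production architecture rather than code, and the actual DSA engines are proprietary. As a generic sketch of the one computational pattern the abstract mentions, fanning contingency cases out over several CPUs, a Python illustration with a placeholder solver might be:

```python
from concurrent.futures import ProcessPoolExecutor

def solve_contingency(case):
    """Placeholder for a transient/voltage security run on one contingency.

    A real implementation would call the DSA engine; here we simply pretend
    that cases with an even identifier are secure.
    """
    secure = case["id"] % 2 == 0
    return case["id"], secure

if __name__ == "__main__":
    contingencies = [{"id": i, "outage": f"line-{i}"} for i in range(20)]
    # Spread the contingency list over the available CPUs.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(solve_contingency, contingencies))
    insecure = [cid for cid, ok in results if not ok]
    print("insecure contingencies:", insecure)
```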
ISBN (print): 9780780387867
Data farming leverages high-performance computing to run simple models many times. This process allows for the exploration of massive parameter spaces relatively quickly. This paper explores a methodology for using data farming as a decision support tool. Data farming can be highly effective in this role because it allows one to present to a decision-maker not only the most likely outcome but also the range of possible outcomes, especially outliers that might have far-reaching impact. The terrorist attacks of September 2001 are a good example of an outlier with very high impact. A case study is presented using a simple terrorist attack simulation and a decision-maker utility model.
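As a small illustration of the data-farming workflow (not the paper's terrorist-attack simulation or utility model), the sketch below sweeps a toy stochastic model over a parameter grid with many replicates and then inspects the tail of the outcome distribution, which is where the decision-relevant outliers live.

```python
import itertools
import numpy as np

def toy_model(a, b, rng):
    """Stand-in for a simple simulation: outcome depends on parameters plus noise."""
    return a * rng.random() + b * rng.normal()

rng = np.random.default_rng(42)
grid = itertools.product(np.linspace(0, 1, 11), np.linspace(0, 2, 11))
replicates = 200

records = []
for a, b in grid:
    outcomes = np.array([toy_model(a, b, rng) for _ in range(replicates)])
    records.append((a, b, outcomes.mean(), np.quantile(outcomes, 0.99)))

# Decision support: report not just the expected outcome but the extreme tail.
worst = max(records, key=lambda r: r[3])
print(f"largest 99th-percentile outcome {worst[3]:.2f} at a={worst[0]:.1f}, b={worst[1]:.1f}")
```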