ISBN (print): 0769510043
Imaging has become an essential component in many fields of medical and laboratory research and clinical practice. Biologists study cells and generate 3D confocal microscopy datasets, virologists generate 3D reconstructions of viruses from micrographs, radiologists identify and quantify tumors from MRI and CT scans, and neuroscientists detect regional metabolic brain activity from PET and functional MRI scans. Analysis of these diverse image types requires sophisticated computerized quantification and visualization tools. Until recently, three-dimensional visualization of images and quantitative analysis could only be performed using expensive UNIX workstations and customized software. Today, much of the visualization and analysis can be performed on an inexpensive desktop computer equipped with the appropriate graphics hardware and software. This paper introduces an extensible, platform-independent, general-purpose image processing and visualization program specifically designed to meet the needs of an Internet-linked medical research community. The application, named MIPAV (Medical Image Processing, Analysis, and Visualization), enables clinical and quantitative analysis of medical images over the Internet. Using MIPAV's standard user interface and analysis tools, researchers and clinicians at remote sites can easily share research data and analyses, thereby enhancing their ability to study, diagnose, monitor, and treat medical disorders.
We present an interactive visualization tool that enables exploration and comparative analyses of election data in multi-partisan systems. We motivate and explain our design in the context of the Ecuadorian political ...
There are a variety of graphs in which multidimensional feature values are assigned to the nodes. Visualization of such datasets is not an easy task, since they are complex and often huge. Immersive Analytics is a powerfu...
ISBN (print): 0780307852
3DVIEWNIX is a data-, machine-, and application-independent software system for the visualization and analysis of multimodality biomedical images. It is based on UNIX, C, X-Window, and our own multidimensional generalization of the ACR-NEMA standards for image representation. Its design is independent of image dimensionality, making it just as convenient to process 2D and 3D data as higher-dimensional data. Its design is not tied to any specific approach, machine, or application. It supports a large variety of visualization and analysis methods that run on hardware ranging from high-end graphics workstations to PCs, for a variety of applications. It is an open, user-expandable software system intended to promote cooperative research.
Quality assurance is an important aspect of therapeutic drug monitoring (TDM). Capillary electrophoresis (CE) assays for the determination of (i) ethosuximide via direct injection of serum or plasma, (ii) lamotrigine after protein precipitation by acetonitrile and analysis of an aliquot of the acidified supernatant, and (iii) carbamazepine and carbamazepine-10,11-epoxide after solute extraction followed by analysis of the reconstituted extract are characterized via analysis of a large number of commercial quality control sera containing up to 14 analytes (9 of them anticonvulsants) at sub-therapeutic, therapeutic, and toxicologic concentration levels. CE data obtained in single determinations are shown to compare well with the spike values and with the mean of data determined in other laboratories using immunoassays and/or high-performance liquid chromatography, as reported by the external quality control scheme. Carbamazepine and ethosuximide drug levels are also shown to agree well with those determined in our departmental drug assay laboratory using automated immunoassays. The presented data reveal the effectiveness of assay assessment via analysis of quality control sera and confirm the robustness of the assays for TDM in a routine setting. (C) 2001 Elsevier Science B.V. All rights reserved.
ISBN (print): 9781450308984
Wireless Sensor Networks (WSNs) are often deployed to sample desired environmental attributes and deliver the acquired samples to the sink for processing, analysis, or simulation, as the application requires. Many applications stipulate high granularity and data accuracy, which results in high data volumes. Sensor nodes are battery powered, and sending the requested large amounts of data rapidly depletes their energy. Fortunately, environmental attributes (e.g., temperature, pressure) often exhibit spatial and temporal correlations. Moreover, a large class of applications, such as scientific measurement and forensics, tolerate high latencies for sensor data collection. Accordingly, we develop a fully distributed adaptive technique for spatial and temporal in-network data compression with accuracy guarantees. We exploit the spatio-temporal correlation of sensor readings while benefiting from any tolerance for data delivery latency to further minimize the amount of data transported to the sink. Using real data, we demonstrate that our proposed scheme can provide significant communication/energy savings without sacrificing the accuracy of the collected data. In our simulations, we achieved data compression of up to 95%, requiring only around 5% of the original data to be transported to the sink.
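The abstract does not give the compression algorithm itself. A minimal sketch of the temporal half of the idea, assuming a per-node dead-band filter with a hypothetical `eps` accuracy bound (a reading is transmitted only when it deviates from the last transmitted value by more than `eps`, so the sink can reconstruct every reading within that bound), might look like:

```python
def compress_temporal(readings, eps):
    """Dead-band temporal compression sketch (assumed, not the paper's scheme).

    readings: list of (timestamp, value) pairs from one sensor node.
    eps: maximum tolerated reconstruction error at the sink.
    Returns the subset of readings that would actually be transmitted;
    the sink reconstructs any omitted reading as the last kept value,
    which is guaranteed to be within eps of the true value.
    """
    kept = []
    last_sent = None
    for t, v in readings:
        if last_sent is None or abs(v - last_sent) > eps:
            kept.append((t, v))
            last_sent = v
    return kept


# Example: slowly drifting temperature, 0.5-degree accuracy guarantee.
samples = [(0, 20.0), (1, 20.1), (2, 20.2), (3, 21.0)]
sent = compress_temporal(samples, eps=0.5)
```

Here only 2 of 4 samples are sent; the spatial half of the technique would additionally suppress readings that neighboring nodes can predict.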
The proceedings contain 39 papers. The topics discussed include: incorporating goal analysis in database design: a case study from biological data management; prioritized active integrity constraints for database maintenance; can integrity tolerate inconsistency?; the Spicy project: a new approach to data matching; context integration for mobile data tailoring; top-down parameter-free clustering of high-dimensional categorical data; effective incremental clustering for duplicate detection in large databases; classifying aggregated data: a symbolic data analysis approach; containment of conjunctive queries under access limitations; distributed aggregation strategies for preference queries; progress on tree-based approaches to improving histogram accuracy; towards a semantic information extraction approach from unstructured documents; and agent-based web services supporting semantic negotiation.
ISBN (print): 9781479927845
The available genetic data is increasing rapidly with new high-throughput, low-cost technologies. While this data has enormous potential to impact scientific and medical advances, such data volumes cannot be processed without the use of parallelism. Most existing work on the analysis of this data has focused on the accuracy of the analyses rather than on performance; that is, either the algorithms are serial, or only simple, non-scalable parallelization techniques have been used. In this paper, we address the problem of identifying variants in large-scale genome sequencing data. After examining several possible approaches, we identify one that does not require any communication. However, achieving load balance is non-trivial because of the data-dependent nature of the processing. We develop three scheduling schemes: a dynamic scheme, which reduces scheduling overheads by using two different chunk sizes; a static scheme, which uses a preprocessing step to estimate workloads; and a combined scheme. In evaluating our schemes, we find that the use of a preprocessing step (histogram computation) to estimate workloads is very effective, and thus our combined scheme gives the best results. With a 32x increase in the number of cores, approximately a 24x performance improvement is seen, establishing that scalable processing of genomic data is possible. We also perform a comparison against an implementation based on Hadoop and show that, with our combined scheme, our implementation outperforms the one using Hadoop.
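The abstract does not spell out how the histogram-derived workload estimates are turned into a static schedule. One standard way to do it, sketched here as an assumption rather than the paper's exact method, is a longest-processing-time (LPT) greedy assignment: sort genomic regions by estimated cost (e.g., read depth from the histogram) and repeatedly give the heaviest unassigned region to the currently least-loaded core.

```python
def lpt_schedule(region_costs, n_cores):
    """Static load balancing sketch via LPT greedy assignment (assumed scheme).

    region_costs: estimated per-region workloads, e.g. read counts from a
                  histogram preprocessing pass over the aligned data.
    n_cores: number of worker cores.
    Returns (loads, assignment): total estimated load per core, and the
    list of region indices assigned to each core.
    """
    loads = [0] * n_cores
    assignment = [[] for _ in range(n_cores)]
    # Heaviest regions first: placing large items early keeps loads balanced.
    for i, cost in sorted(enumerate(region_costs), key=lambda p: -p[1]):
        c = min(range(n_cores), key=lambda k: loads[k])  # least-loaded core
        loads[c] += cost
        assignment[c].append(i)
    return loads, assignment


# Example: six regions with skewed estimated costs, two cores.
loads, assignment = lpt_schedule([5, 3, 3, 2, 2, 1], n_cores=2)
```

Because each core then processes its regions independently, this matches the communication-free approach the paper describes; the quality of the balance depends entirely on how well the histogram predicts actual per-region work.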
Mining is a complicated operation. Many fields are involved in mining, ranging from exploration and engineering operations to environmental aspects. With the advancement of information technologies and mining industry operating techniques, the success of a mining operation is largely dependent on a group's ability to collect, manage, integrate, and utilize available data sets and information, and to draw the right conclusions. With the establishment of the world's first mine-site virtual reality (VR) studio at Goldcorp's Red Lake Mine, comprehensive data integration and immersive 3D data visualization have been completed at the mine. The advantages of this type of data processing have been well demonstrated, and the initial benefits are significant. This paper presents the mine data integration and comprehensive data visualization at the Red Lake Mine using a powerful, state-of-the-art technology: virtual reality. It reveals that the VR studio is not only a tool for visualizing mine information to help geologists and engineers view and understand mine engineering data, but also a powerful tool for mine data integration, modeling, and analysis of disparate but available information. These capabilities allow professionals from different departments or disciplines to discuss, coordinate, and optimize their plans and designs. VR allows mine operations to instantly see the financial impact of their decisions on mine safety, the environment, the direction of exploration drillholes, ore grade and excavation rates, mine plans, ground control methods, stress and seismicity, the physical placement of stopes, ramps, and other structures, as well as infrastructure and the scheduling of engineering and human resources. The strategies for data integration and the data visualization VR operating framework are discussed in this paper. Examples are also given to show the power resulting from data integration and the power of VR in engineering applications.
ISBN (print): 9781479927845
Many GPU applications perform data transfers to and from GPU memory at regular intervals, for example because the data does not fit into GPU memory, or because of inter-node communication at the end of each time step. Overlapping GPU computation with CPU-GPU communication can reduce the cost of moving data. Several different techniques exist for transferring data to and from GPU memory and for overlapping those transfers with GPU computation, but it is currently not known when to apply which method. Implementing and benchmarking each method is often a large programming effort and not feasible. To solve these issues, and to provide insight into the performance of GPU applications, we propose an analytical performance model that includes PCIe transfers and the overlap of computation and communication. Our evaluation shows that the performance models are capable of correctly classifying the relative performance of the different implementations.
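The abstract gives no formulas, but the general shape of such a model can be illustrated. The toy sketch below (my assumption, not the paper's model) compares a non-overlapped execution, where host-to-device transfer, kernel, and device-to-host transfer run back to back, against a chunked pipeline over multiple streams, where the slowest stage dominates and the other stages contribute only their pipeline fill/drain cost.

```python
def total_time(t_h2d, t_kernel, t_d2h, n_chunks, overlap):
    """Toy transfer/compute overlap model (illustrative assumption only).

    t_h2d, t_kernel, t_d2h: total times for host-to-device transfer,
        kernel execution, and device-to-host transfer, in the same units.
    n_chunks: number of equal chunks (one stream per chunk in flight).
    overlap: whether transfers and computation are overlapped.
    """
    if not overlap:
        # Synchronous version: the three phases simply add up.
        return t_h2d + t_kernel + t_d2h
    # Pipelined version: per-chunk stage times; the bottleneck stage runs
    # n_chunks times, the other stages only fill and drain the pipeline once.
    per = [t_h2d / n_chunks, t_kernel / n_chunks, t_d2h / n_chunks]
    bottleneck = max(per)
    return bottleneck * n_chunks + (sum(per) - bottleneck)


# Example: 4 time units of upload, 8 of compute, 2 of download, 4 chunks.
sync_t = total_time(4.0, 8.0, 2.0, n_chunks=4, overlap=False)
async_t = total_time(4.0, 8.0, 2.0, n_chunks=4, overlap=True)
```

Even this crude model captures the qualitative conclusion the paper evaluates: overlap helps most when transfer and compute times are comparable, and a model of this kind can rank implementations without writing each one.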