Background: Chromatin immunoprecipitation sequencing (ChIP-seq) is a technology that combines chromatin immunoprecipitation (ChIP) with next-generation sequencing (NGS) to analyze protein interactions with DNA. At present, most ChIP-seq analysis tools are driven from the command line and lack user-friendly interfaces. Although some web services with graphical interfaces have been developed for ChIP-seq analysis, these sites cannot provide a comprehensive analysis of ChIP-seq data from raw reads through downstream analysis. Results: In this study, we develop a web service for the whole process of ChIP-seq analysis (CSA), covering mapping, quality control, peak calling, and downstream analysis. In addition, CSA provides a customization function that lets users define their own workflows, and it visualizes the results of mapping, peak calling, motif finding, and pathway analysis. For different types of ChIP-seq datasets, CSA provides the corresponding tools to perform the analysis. Moreover, CSA can detect differences in ChIP signals between ChIP samples and controls to identify absolute binding sites. Conclusions: Two case studies demonstrate the effectiveness of CSA, which can complete the whole procedure of ChIP-seq analysis. CSA provides a web interface for users and implements visualization of every analysis step. The CSA website is available at http://***.
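The abstract names the pipeline stages CSA automates (mapping, quality control, peak calling). As a rough illustration of that flow, here is a minimal sketch using one conventional command-line tool chain; the tool choices (Bowtie2, FastQC, MACS2), file names, and index path are assumptions, not CSA's actual implementation.

```python
# Rough sketch of the stages the abstract lists (mapping, quality
# control, peak calling). Tool choices (Bowtie2/FastQC/MACS2), file
# names, and the index path are assumptions -- NOT CSA's own code.
import subprocess

def run(cmd):
    """Run one pipeline stage, failing loudly if the tool errors."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

def chipseq_pipeline(sample_fq, control_fq, genome_index, out_prefix):
    # Quality control on the raw reads.
    run(["fastqc", sample_fq, control_fq])
    # Mapping: align ChIP and control reads to the reference genome.
    run(["bowtie2", "-x", genome_index, "-U", sample_fq,
         "-S", f"{out_prefix}.sample.sam"])
    run(["bowtie2", "-x", genome_index, "-U", control_fq,
         "-S", f"{out_prefix}.control.sam"])
    # Peak calling: contrast ChIP signal with the control to find
    # enriched (putative binding) regions.
    run(["macs2", "callpeak",
         "-t", f"{out_prefix}.sample.sam",
         "-c", f"{out_prefix}.control.sam",
         "-n", out_prefix])

chipseq_pipeline("chip.fastq", "input.fastq", "hg38_index", "demo")
```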
ISBN: (Print) 9783030415907; 9783030415891
This paper presents a model for the progressive visualization and exploration of the structure of large datasets, that is, an abstraction over the components and relations that provide the means for constructing a visual representation of a dataset's structure, with continuous system feedback and user interactions for computational steering, in spite of size. In this context, the structure of a dataset is regarded as the distance or neighborhood relationships among its data points, while size is defined in terms of the number of data points. To prove the validity of the model, a proof-of-concept was developed as a Visual Analytics library for Apache Zeppelin and Apache Spark. Moreover, nine user studies were carried out to assess the usability of the library. The results show that the library is useful for visualizing and understanding emerging cluster patterns, for identifying relevant features, and for estimating the number of clusters k.
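The model's core idea, continuous feedback on a dataset's cluster structure while computation is still running, can be illustrated with a small progressive-clustering loop. The sketch below uses scikit-learn's MiniBatchKMeans as a stand-in; the paper's own proof-of-concept is a Visual Analytics library for Apache Zeppelin and Apache Spark, not this code.

```python
# Progressive pattern from the abstract: consume a large dataset in
# chunks and emit an updated structural summary (here, k cluster
# centers) after every chunk, so a front end can redraw continuously
# and the user can steer (e.g., restart with a different k). Uses
# scikit-learn as a stand-in for the paper's Zeppelin/Spark library.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def progressive_clusters(chunks, k=3):
    model = MiniBatchKMeans(n_clusters=k, random_state=0)
    seen = 0
    for chunk in chunks:
        model.partial_fit(chunk)  # incremental update: O(chunk) work
        seen += len(chunk)
        # Each yield is a refresh point for the visualization.
        yield seen, model.cluster_centers_.copy()

# Synthetic stream: 10,000 2-D points drawn around three centers.
rng = np.random.default_rng(0)
data = rng.normal(size=(10_000, 2)) + rng.integers(0, 3, (10_000, 1)) * 4
stream = (data[i:i + 1_000] for i in range(0, len(data), 1_000))
for seen, centers in progressive_clusters(stream, k=3):
    print(f"after {seen} points, centers:\n{centers.round(2)}")
```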
Over the last few years, dramatic increases and advances in mass storage, for both secondary and tertiary storage, have made it possible to handle large amounts of data (for example, satellite data and complex scientific experiments). However, to make full use of these advances, metadata for data analysis and interpretation, as well as the complexity of managing and accessing large datasets through intelligent and efficient methods, are still considered the main challenges facing the information-science community when dealing with large databases. Scientific data must be analyzed and interpreted via metadata, which plays a descriptive role for the underlying data. Metadata can be partly defined a priori, according to the domain of discourse under consideration (for example, atmospheric chemistry) and the conceptualization of the information system to be built. It may also be extracted from time-series measurement and observation data using learning methods. In this paper, a knowledge-based management system (KBMS) is presented for the extraction and management of metadata in order to bridge the gap between data and information. The KBMS is a component of an intelligent information system based on a federated architecture, which also includes a database management system for time-series-oriented data and a visualization system.
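As a toy illustration of metadata's descriptive role for time-series data, the sketch below derives simple summary metadata from a measurement series. The field names and the example series are invented; the actual KBMS also extracts metadata with learning methods, which this sketch does not attempt.

```python
# Toy illustration of metadata extraction: derive purely descriptive
# summary metadata from a time-series measurement. Field names and the
# example series are invented, not taken from the paper's KBMS.
from dataclasses import dataclass

@dataclass
class SeriesMetadata:
    variable: str
    n_samples: int
    t_start: float
    t_end: float
    sampling_interval: float  # assumes a regular sampling grid
    value_min: float
    value_max: float
    value_mean: float

def extract_metadata(variable, timestamps, values):
    """Compute descriptive metadata from the measurements themselves."""
    return SeriesMetadata(
        variable=variable,
        n_samples=len(values),
        t_start=timestamps[0],
        t_end=timestamps[-1],
        sampling_interval=(timestamps[-1] - timestamps[0]) / (len(timestamps) - 1),
        value_min=min(values),
        value_max=max(values),
        value_mean=sum(values) / len(values),
    )

# E.g., an ozone concentration sampled once a minute (hypothetical data).
print(extract_metadata("ozone_ppb", [0.0, 60.0, 120.0, 180.0],
                       [31.2, 30.8, 33.5, 32.1]))
```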
Author: WILLCOCKS, P.H. (ICI Materials)
Wilton Research Centre, P.O. Box 90, Wilton, Middlesbrough, Cleveland TS90 8JE, UK
The use of thermal analysis as part of quality systems in industry can be effective only if the techniques are made to conform to high standards of quality assurance. Achieving the required standards is not always straightforward, and there are a large number of potential and unresolved problems. Fundamental aspects that have to be considered include limitations due to instrument design, computerized control and analysis, validation of data, and the overall requirements of, or conformance to, international quality programmes.
ISBN: (Print) 9781450337786
For C programs, flow-sensitivity is important to enable pointer analysis to achieve highly usable precision. Despite significant recent advances in scaling flow-sensitive pointer analysis sparsely for sequential C programs, relatively little progress has been made for multithreaded C programs. In this paper, we present FSAM, a new Flow-Sensitive pointer analysis that achieves its scalability for large Multithreaded C programs by performing sparse analysis on top of a series of thread-interference analysis phases. We evaluate FSAM on 10 multithreaded C programs (the largest exceeding 100K lines of code) from Phoenix-2.0, Parsec-3.0, and open-source applications. For two programs, raytrace and x264, traditional data-flow-based flow-sensitive pointer analysis does not scale (failing to finish within two hours), whereas our analysis takes just under 5 minutes on raytrace and 9 minutes on x264. For the rest, our analysis is 12x faster and uses 28x less memory.
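To see why flow-sensitivity buys precision, consider a minimal points-to analysis over a three-statement toy program: a flow-sensitive pass applies strong updates in program order, while a flow-insensitive pass merges all statements and over-approximates. This sketch is only a didactic illustration, not FSAM's sparse, thread-interference-aware algorithm.

```python
# Toy illustration of why flow-sensitivity matters for pointer
# analysis (the property FSAM scales to multithreaded code).
# IR: ("addr", p, x) means p = &x; ("copy", p, q) means p = q.

def flow_sensitive(stmts):
    pts = {}
    for op, lhs, rhs in stmts:          # walk in program order
        if op == "addr":
            pts[lhs] = {rhs}            # strong update: kill the old set
        elif op == "copy":
            pts[lhs] = set(pts.get(rhs, ()))
    return pts

def flow_insensitive(stmts):
    pts = {}
    changed = True
    while changed:                      # iterate to a fixed point
        changed = False
        for op, lhs, rhs in stmts:      # statement order is ignored
            new = {rhs} if op == "addr" else pts.get(rhs, set())
            if not new <= pts.get(lhs, set()):
                pts.setdefault(lhs, set()).update(new)
                changed = True
    return pts

prog = [("addr", "p", "a"),   # p = &a
        ("copy", "q", "p"),   # q = p
        ("addr", "p", "b")]   # p = &b  (q should still point only to a)
print("flow-sensitive  :", flow_sensitive(prog))    # q -> {a}
print("flow-insensitive:", flow_insensitive(prog))  # q -> {a, b}
```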
Mining is a complicated operation; the fields involved range from exploration and engineering operations to environmental aspects. With the advancement of information technologies and mining-industry operating techniques, the success of a mining operation depends largely on a group's ability to collect, manage, integrate, and utilize available datasets and information, and to draw the right conclusions. With the establishment of the world's first mine-site virtual reality (VR) studio at Goldcorp's Red Lake Mine, comprehensive data integration and immersive 3D data visualization have been completed at the mine. The advantages of this type of data processing have been well demonstrated, and the initial benefits are significant. This paper presents the mine data integration and comprehensive data visualization at the Red Lake Mine using a powerful, state-of-the-art technology: virtual reality. It shows that the VR studio is not only a tool for visualizing mine information to help geologists and engineers view and understand mine engineering data, but also a powerful tool for integrating, modeling, and analyzing disparate but available information. These capabilities allow professionals from different departments or disciplines to discuss, coordinate, and optimize their plans and designs. VR allows mine operations to see instantly the financial impact of their decisions on mine safety, the environment, the direction of exploration drillholes, ore grade and excavation rates, mine plans, ground control methods, stress and seismicity, the physical placement of stopes, ramps, and other structures, as well as infrastructure and the scheduling of engineering and human resources. The strategies for data integration and the VR operating framework for data visualization are discussed, and examples are given to show the power resulting from data integration and the power of VR in engineering applications.
ISBN: (Print) 0769510043
Imaging has become an essential component in many fields of medical and laboratory research and clinical practice. Biologists study cells and generate 3D confocal microscopy datasets, virologists generate 3D reconstructions of viruses from micrographs, radiologists identify and quantify tumors from MRI and CT scans, and neuroscientists detect regional metabolic brain activity from PET and functional MRI scans. Analysis of these diverse image types requires sophisticated computerized quantification and visualization tools. Until recently, three-dimensional visualization of images and quantitative analysis could only be performed using expensive UNIX workstations and customized software. Today, much of the visualization and analysis can be performed on an inexpensive desktop computer equipped with the appropriate graphics hardware and software. This paper introduces an extensible, platform-independent, general-purpose image processing and visualization program specifically designed to meet the needs of an Internet-linked medical research community. The application, named MIPAV (Medical Image Processing, Analysis, and Visualization), enables clinical and quantitative analysis of medical images over the Internet. Using MIPAV's standard user interface and analysis tools, researchers and clinicians at remote sites can easily share research data and analyses, thereby enhancing their ability to study, diagnose, monitor, and treat medical disorders.
We present an interactive visualization tool that enables exploration and comparative analyses of election data in multi-partisan systems. We motivate and explain our design in the context of the Ecuadorian political ...
ISBN: (Print) 0780307852
3DVIEWNIX is a data-, machine-, and application-independent software system for the visualization and analysis of multimodality biomedical images. It is based on UNIX, C, X-Window, and our own multidimensional generalization of the ACR-NEMA standards for image representation. Its design is independent of image dimensionality, making it just as convenient to process 2D and 3D data as higher-dimensional data. Its design is not tied to any specific approach, machine, or application. It supports a large variety of visualization and analysis methods that run on machines ranging from high-end graphics workstations to PCs, for a variety of applications. It is an open, user-expandable software system intended to promote cooperative research.
ISBN: (Print) 9781538646434
This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were Kinect v2 units, each containing both an RGB camera and an IR camera that measures depth based on the time-of-flight principle. The approach taken was based on a novel procedure combining Aruco visual markers, region-of-interest methods, and iterative closest point (ICP). The sensors are calibrated pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point-cloud data. For a volume measuring 10 m x 14 m x 5 m, a typical point-cloud accuracy of 5-10 cm was achieved using six sensor nodes.
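The pairwise-calibration step, refining the rigid transform between two sensors with overlapping views, can be sketched with a standard ICP call. The snippet below assumes Open3D and a coarse initial transform (which the paper obtains from Aruco markers); it illustrates the ICP refinement only, not the authors' full procedure.

```python
# Sketch of the pairwise refinement the abstract describes: two
# time-of-flight sensors with overlapping views are aligned by ICP,
# starting from a coarse initial transform (from visual markers in the
# paper; given directly here). Open3D is an assumed library choice.
import numpy as np
import open3d as o3d

def calibrate_pair(cloud_a, cloud_b, init_transform,
                   max_corr_dist=0.10):  # 10 cm correspondence radius
    """Refine the rigid transform mapping sensor A's frame to B's."""
    result = o3d.pipelines.registration.registration_icp(
        cloud_a, cloud_b, max_corr_dist, init_transform,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.inlier_rmse

# Toy data: cloud_b is cloud_a shifted by 5 cm along x.
pts = np.random.rand(500, 3) * 2.0
cloud_a = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(pts))
cloud_b = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(pts + [0.05, 0, 0]))
T, rmse = calibrate_pair(cloud_a, cloud_b, np.eye(4))
print("estimated transform:\n", T.round(3), "\ninlier RMSE:", rmse)
```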