Most positions of the human genome are invariant (99%), and only some positions (1%) are commonly variant; these variant sites are associated with complex genetic diseases. The haplotype reconstruction problem divides aligned single nucleotide polymorphism (SNP) fragments into two classes and infers a pair of haplotypes from them. An important computational model of this problem is minimum error correction (MEC), but it is effective only when the error rate of the fragments is low. MEC/GI, an extension of MEC, employs the compatible genotype information in addition to the SNP fragments and therefore yields more accurate inference. Because the haplotyping problem is NP-hard, several computational and heuristic methods have addressed it in search of feasible answers. In this paper, we develop a new branch-and-bound algorithm with running time O([(n−h)/k]·2^h·nm), where m is the maximum length of the SNP fragments whose SNP sites are heterozygous, n is the number of fragments, and h is the depth of our exploration in the binary tree. Since h (h ≪ n) is small in real biological applications, our proposed algorithm is practical and efficient.
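To make the MEC objective concrete, the following minimal Python sketch scores a bipartition of SNP fragments by the number of corrections needed for each class to agree with a single haplotype, and finds the best bipartition by exhaustive search. The fragment encoding, function names, and the brute-force search are illustrative assumptions; the paper's branch-and-bound algorithm prunes this search tree rather than enumerating it, and MEC/GI would additionally exploit genotype information.

```python
# Illustrative sketch (not the paper's algorithm): the MEC cost of a given
# bipartition of SNP fragments. Fragments are strings over {'0', '1', '-'},
# where '-' marks an SNP site not covered by the fragment.

from itertools import product

def mec_cost(fragments, partition):
    """Minimum number of base corrections needed so that each class of the
    partition is consistent with one haplotype (majority vote per SNP site)."""
    cost = 0
    for cls in (0, 1):
        members = [f for f, p in zip(fragments, partition) if p == cls]
        if not members:
            continue
        for site in range(len(members[0])):
            zeros = sum(1 for f in members if f[site] == '0')
            ones = sum(1 for f in members if f[site] == '1')
            cost += min(zeros, ones)   # correcting the minority alleles is cheapest
    return cost

def best_partition_bruteforce(fragments):
    """Exhaustive search over all 2^n bipartitions; a branch-and-bound method
    prunes this binary tree instead of enumerating it fully."""
    n = len(fragments)
    return min(product((0, 1), repeat=n), key=lambda p: mec_cost(fragments, p))

if __name__ == "__main__":
    frags = ["0101-", "01011", "10100", "-0100"]
    part = best_partition_bruteforce(frags)
    print(part, mec_cost(frags, part))
```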
We present our benchmark results for a newly developed multi-phase chemodynamic code by using the raceSPH library on *** hardware accelerators. A maximum speedup of 24x is reached for the SPH part in the serial run wh...
ISBN (print): 9781612848006
Based on recent advances in convex design for Large-Scale Control Systems (LSCSs) and in robust and efficient LSCS self-tuning/adaptation, this paper proposes a methodology that aims at providing an integrated LSCS design, applicable to large-scale systems of arbitrary scale, heterogeneity and complexity, and capable of: 1) providing stable, efficient and arbitrarily-close-to-optimal LSCS performance; 2) incorporating a variety of constraints, including limited-control constraints as well as constraints that are nonlinear functions of the system controls and states; 3) being intrinsically self-tunable, able to rapidly and efficiently optimize LSCS performance when short-, medium- or long-term variations affect the large-scale system; 4) achieving the above while remaining scalable and modular. The purpose of the present paper is to present the main features of the proposed control design methodology.
ISBN (print): 9789532900217
The main idea of this paper is an efficient power management mechanism for transmitting video to multiple receivers. The proposed mechanism consists of a module that efficiently manages transmission power when transmitting video over wireless networks: it uses the TFRC protocol reports and then adjusts the transmission power with a binary-like approach. To extend the mechanism to multiple receivers, several methods are proposed for calculating an appropriate transmission power level based on all TFRC reports and adjusting the server's transmission power accordingly.
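As a rough illustration of how a binary-like power adjustment driven by receiver feedback might look, the sketch below halves the search interval for the transmit power on every round of reports and, for multiple receivers, lets the worst reported loss rate drive the decision. The class name, thresholds, and the worst-receiver aggregation policy are assumptions made for illustration, not the methods proposed in the paper, and loss-event rates stand in for full TFRC reports.

```python
# Hypothetical sketch of a binary-search-like transmit-power controller driven
# by per-receiver feedback. Names, units and thresholds are illustrative only.

class PowerController:
    def __init__(self, p_min=1.0, p_max=100.0, target_loss=0.02):
        self.lo, self.hi = p_min, p_max          # current search interval (mW)
        self.target_loss = target_loss           # acceptable loss-event rate
        self.power = (p_min + p_max) / 2.0       # start in the middle

    def update(self, loss_rates):
        """Adjust power from the reports of all receivers.

        Conservative multi-receiver policy: the worst (highest) loss rate
        drives the decision, so no receiver is left below the target quality."""
        worst = max(loss_rates)
        if worst > self.target_loss:
            self.lo = self.power                 # losses too high: raise power
        else:
            self.hi = self.power                 # losses acceptable: try lowering power
        self.power = (self.lo + self.hi) / 2.0
        return self.power

controller = PowerController()
for reports in ([0.05, 0.01], [0.015, 0.02], [0.01, 0.005]):
    print(controller.update(reports))
```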
In this paper, we describe a power management mechanism for wireless video transmission using the TFRC protocol that takes into account feedback about the received video quality and tries to intelligently adapt transm...
The "small world" phenomenon, i.e., the fact that the global social network is strongly connected in the sense that every two persons are inter-related through a small chain of friends, has attracted researc...
The E-Simulator project is under way at the Hyogo Earthquake Engineering Research Center (E-Defense), which belongs to the National Research Institute for Earth Science and Disaster Prevention (NIED), Japan. E-Defense operates the world's largest shaking table. The E-Simulator uses the parallel FE-analysis software package ADVENTURECluster (ADVC) as its platform, and we carried out an elastoplastic seismic response analysis of a high-rise building frame with over 70 million DOFs. In this study, we report the results of a high-precision FE analysis for the simulation of the dynamic collapse behavior of a 4-story steel building frame. The whole frame is discretized into hexahedral elements with linear interpolation functions. To improve the accuracy of the collapse simulation, a new piecewise linear combined isotropic and kinematic hardening rule is implemented for the steel material, and its parameters are identified from uniaxial material test results. The stud bolts are modeled precisely using multipoint constraints and nonlinear springs. The wire meshes in the concrete slab are modeled using hexahedral elements. The damping due to plastic energy dissipation of the exterior walls is modeled by shear springs between the floors. The accuracy of the model is verified by comparison with a physical test of a steel-concrete composite beam subjected to static deformation. It is shown that the elastoplastic dynamic responses of the 4-story frame can be estimated with good accuracy using high-precision FE analysis without resorting to macro-models such as plastic hinges and the composite beam effect.
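For readers unfamiliar with combined hardening, the following 1D sketch shows a radial-return stress update with linear isotropic and kinematic hardening. It is only a conceptual reduction: the paper implements a piecewise linear combined hardening rule in 3D within ADVC, with parameters identified from uniaxial tests, whereas the moduli and loading history below are assumed values chosen for illustration.

```python
# Minimal 1D sketch of a combined isotropic + kinematic hardening update with
# radial return. Linear hardening moduli are assumed; this only illustrates
# the idea behind the piecewise linear rule described in the abstract.

def plastic_step(d_eps, sigma, eps_p, alpha, back,
                 E=200e3, sy0=300.0, H_iso=2e3, H_kin=5e3):
    """Advance one strain increment d_eps (stresses in MPa)."""
    sigma_tr = sigma + E * d_eps                # elastic predictor
    xi = sigma_tr - back                        # stress relative to backstress
    f = abs(xi) - (sy0 + H_iso * alpha)         # yield function
    if f <= 0.0:
        return sigma_tr, eps_p, alpha, back     # purely elastic step
    dgamma = f / (E + H_iso + H_kin)            # plastic multiplier
    sign = 1.0 if xi > 0 else -1.0
    sigma = sigma_tr - E * dgamma * sign        # return to the yield surface
    eps_p += dgamma * sign                      # plastic strain
    alpha += dgamma                             # isotropic hardening variable
    back += H_kin * dgamma * sign               # kinematic backstress
    return sigma, eps_p, alpha, back

# drive one load-unload-reload cycle to see the hysteresis the hardening produces
state = (0.0, 0.0, 0.0, 0.0)
for d_eps in [0.001] * 5 + [-0.001] * 10 + [0.001] * 5:
    state = plastic_step(d_eps, *state)
    print(round(state[0], 1))
```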
Classification algorithms are frequently used on data with a natural hierarchical structure. For instance, classifiers are often trained and tested on trial-wise measurements, separately for each subject within a group. One important question is how classification outcomes observed in individual subjects can be generalized to the population from which the group was sampled. To address this question, this paper introduces novel statistical models that are guided by three desiderata. First, all models explicitly respect the hierarchical nature of the data, that is, they are mixed-effects models that simultaneously account for within-subjects (fixed-effects) and across-subjects (random-effects) variance components. Second, maximum-likelihood estimation is replaced by full Bayesian inference in order to enable natural regularization of the estimation problem and to afford conclusions in terms of posterior probability statements. Third, inference on classification accuracy is complemented by inference on the balanced accuracy, which avoids inflated accuracy estimates for imbalanced data sets. We introduce hierarchical models that satisfy these criteria and demonstrate their advantages over conventional methods using MCMC implementations for model inversion and model selection on both synthetic and empirical data. We envisage that our approach will improve the sensitivity and validity of statistical inference in future hierarchical classification studies.
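The snippet below illustrates, for a single subject, why balanced accuracy avoids the inflated estimates that plain accuracy yields on imbalanced data, together with a simple Beta-Binomial posterior over accuracy. It is a deliberately simplified fixed-effects illustration with a flat Beta(1, 1) prior; the paper's contribution is the hierarchical mixed-effects extension of such models across subjects.

```python
# Simplified illustration (not the paper's hierarchical model): plain accuracy
# vs. balanced accuracy on imbalanced data, plus a Beta posterior on accuracy.

import numpy as np

# a classifier that always predicts the majority class on 90 positives / 10 negatives
y_true = np.array([1] * 90 + [0] * 10)
y_pred = np.ones_like(y_true)

accuracy = np.mean(y_pred == y_true)            # 0.90, looks deceptively good
sens = np.mean(y_pred[y_true == 1] == 1)        # sensitivity = 1.0
spec = np.mean(y_pred[y_true == 0] == 0)        # specificity = 0.0
balanced_accuracy = (sens + spec) / 2.0         # 0.50, i.e. chance level

# Bayesian inference on accuracy: with a Beta(1, 1) prior and k correct
# predictions out of n trials, the posterior is Beta(1 + k, 1 + n - k).
k, n = int((y_pred == y_true).sum()), len(y_true)
posterior = np.random.beta(1 + k, 1 + n - k, size=100_000)
print(accuracy, balanced_accuracy, np.percentile(posterior, [2.5, 50, 97.5]))
```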
Research on dynamic spectrum access has proposed tiering radios into two groups: Primary Users (PUs) and Secondary Users (SUs). PUs are assumed to have reserved spectrum available to them, while SUs (operating in overlay mode) mu...
A hybrid abstraction of a full system simulation platform can provide flexible hardware-and-software co-verification and co-simulation in the early stages of system-on-a-chip development. Being a hybrid abstraction for the...