Ms. Pac-Man was developed in the 1980s, becoming one of the most popular arcade games of its time. It still has a significant following today and has recently attracted the attention of artificial intelligence researc...
The presence of valley networks (VN) on Mars suggests that early Mars was warmer and wetter than at present. However, detailed geomorphic analyses of individual networks have not led to a consensus regarding their origin. An additional line of evidence can be provided by the global pattern of dissection on Mars, but the currently available global map of VN, compiled from Viking images, is incomplete and outdated. We created an updated map of VN by using a computer algorithm that parses topographic data and recognizes valleys by their morphologic signature. This computer-generated map was visually inspected and edited to produce the final updated map of VN. The new map shows an increase in total VN length by a factor of 2.3. A global map of dissection density, D, derived from the new VN map, shows that the most highly dissected region forms a belt located between the equator and mid-southern latitudes. The most prominent regions of high values of D are northern Terra Cimmeria and Margaritifer Terra, where D reaches 0.12 km^-1 over extended areas. The average value of D is 0.062 km^-1, only 2.6 times lower than the terrestrial value of D measured in the same fashion. These relatively high values of dissection density over extensive regions of the planet point toward precipitation-fed runoff erosion as the primary mechanism of valley formation. Assuming a warm and wet early Mars, the peculiarity of the global pattern of dissection is interpreted in terms of climate-controlling factors influenced by the topographic dichotomy.
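Dissection density as used here is simply total mapped valley length per unit area. A minimal sketch of the calculation (the function name and figures are illustrative, not from the paper):

```python
def dissection_density(valley_lengths_km, area_km2):
    """Dissection density D = total valley length / area, in km^-1."""
    return sum(valley_lengths_km) / area_km2

# Illustrative figures: 1200 km of mapped valleys over a 10,000 km^2 region
d = dissection_density([500.0, 400.0, 300.0], 10_000.0)
print(d)  # 0.12, matching the peak value reported for the most dissected areas
```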
Early detection of Alzheimer's Disease (AD), i.e. before symptom onset, would provide the opportunity for development and testing of interventions at earlier stages, when the disease process may still be altered or interrupted. Computer algorithms combining machine learning with non-invasive imaging and other biomarkers for AD have been developed in an effort to improve early detection methods. However, so far, none of the individual algorithms perform at a level that qualifies for clinical use. In this study, we investigated whether combining several existing AD prediction algorithms improves performance. State-of-the-art AD progression prediction algorithms were collected from the TADPOLE-SHARE project. The algorithms were trained on data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study and made forecasts of the clinical diagnosis (CN, MCI, or AD). These algorithms were combined using i) simple, unlearned fuser methods and ii) learned fuser methods. In total, seven experiments were conducted, exploring different combination strategies with increasing complexity of fusers. Finally, we implemented and added our own individual algorithm, a residual neural network (ResNet). All individual algorithms and ensembles were evaluated with the multiclass area under the curve (mAUC) and the balanced classification accuracy (BCA) performance metrics. Statistical significance was evaluated with the McNemar test. TADPOLE-SHARE resulted in the collection of eight algorithms, of which five were reused for combination. Overall, combining algorithms slightly improved performance (i.e. increased BCA and mAUC), although the improvements were not statistically significant (McNemar test). Both BCA and mAUC showed a trend of improved performance with increasing fuser complexity, i.e. data-learned fusers and re-entering original data features. DoubleResNet was the best performing ensemble (BCA = 0.809 [±0.026], mAUC = 0.902 [±0.020]) and performed sli...
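A "simple, unlearned fuser" in this sense needs no training of its own. A minimal sketch, assuming each base algorithm outputs per-class probabilities for the three diagnoses (the forecasts below are invented for illustration):

```python
import numpy as np

def mean_fuser(prob_list):
    """Unlearned fuser: average the per-class probability forecasts of
    several base algorithms and pick the highest-probability diagnosis."""
    classes = ["CN", "MCI", "AD"]
    mean_probs = np.mean(prob_list, axis=0)  # average over algorithms
    return classes[int(np.argmax(mean_probs))], mean_probs

# Forecasts from three hypothetical base algorithms for one subject:
# columns are P(CN), P(MCI), P(AD)
forecasts = [
    [0.2, 0.5, 0.3],
    [0.1, 0.6, 0.3],
    [0.3, 0.4, 0.3],
]
label, probs = mean_fuser(forecasts)
print(label)  # MCI
```

Learned fusers replace the fixed averaging rule with a model trained on the base algorithms' outputs (optionally re-entering the original data features, as the abstract describes).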
1. A new method of computing disks subjected to tension in the elastoplastic region-the initial-parameter method-is proposed. 2. A computer algorithm realizing this method is distinguished by a high operating rate and...
This research paper introduces a novel fractional Caputo-type simultaneous method for finding all simple and multiple roots of polynomial equations. Using a suitable correction, and without any additional polynomial or derivative evaluations, the order of convergence of the basic Aberth-Ehrlich simultaneous method is increased from three to α + 3. In terms of accuracy, residual graphs, computational efficiency and CPU time, the newly proposed families of simultaneous methods outperform existing methods in numerical applications.
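For context, the basic third-order Aberth-Ehrlich iteration that the paper builds on updates all root approximations simultaneously, combining a Newton correction with a repulsion term from the other approximations. A minimal sketch (initial guesses chosen arbitrarily; this is the classical method, not the paper's fractional variant):

```python
def aberth_ehrlich(p, dp, z, iters=100, tol=1e-12):
    """Basic Aberth-Ehrlich simultaneous iteration.
    p, dp: polynomial and its derivative; z: distinct initial guesses."""
    z = list(z)
    for _ in range(iters):
        new_z = []
        for k, zk in enumerate(z):
            nk = p(zk) / dp(zk)                      # Newton correction
            s = sum(1.0 / (zk - zj) for j, zj in enumerate(z) if j != k)
            new_z.append(zk - nk / (1.0 - nk * s))   # Aberth correction
        done = max(abs(a - b) for a, b in zip(new_z, z)) < tol
        z = new_z
        if done:
            break
    return z

# p(z) = z^3 - 1: the roots are the three cube roots of unity
p = lambda z: z**3 - 1
dp = lambda z: 3 * z**2
roots = aberth_ehrlich(p, dp, [0.9 + 0.2j, -0.5 + 0.8j, -0.4 - 0.9j])
```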
A simplified cylindrical model of an aircraft fuselage is used to investigate the mechanisms of interior noise suppression of the synchrophasing technique. This investigation allows isolation of important parameters t...
This paper examines iterative methods for estimating missing values in a general designed experiment having a single error term in the analysis of variance. Both the method of Healy and Westmacott and the improved Healy-Westmacott method of Pearce and others are identified as special cases of successive overrelaxation techniques used in the numerical solution of linear equations. The improved Healy-Westmacott method is shown to diverge under certain specified conditions. Optimal relaxation parameters are given which guarantee, and in some cases accelerate, convergence. Rates of convergence are compared for selected Latin square designs with missing data. A known disadvantage of iterative methods is their failure to give warning of the confounding which can arise from degenerative configurations of missing values. An extension of the iteration is suggested which enables such confounding to be detected.
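The basic scheme can be sketched for a two-way (rows + columns) additive layout: fill the missing cells with guesses, fit the model, replace the guesses with fitted values, and repeat, with a relaxation parameter omega (omega = 1 recovers the plain iteration; the example data are invented):

```python
import numpy as np

def iterate_missing(y, missing, omega=1.0, iters=200, tol=1e-10):
    """Iterative missing-value estimation for an additive rows + columns
    model, a sketch of the Healy-Westmacott scheme with relaxation.
    y: r x c data array; missing: boolean mask of missing cells."""
    r, c = y.shape
    # Design matrix for mean + row effects + column effects
    X = np.zeros((r * c, 1 + r + c))
    for i in range(r):
        for j in range(c):
            X[i * c + j, [0, 1 + i, 1 + r + j]] = 1.0
    z = np.where(missing, np.mean(y[~missing]), y)  # initial guesses
    for _ in range(iters):
        beta, *_ = np.linalg.lstsq(X, z.ravel(), rcond=None)
        fitted = (X @ beta).reshape(r, c)
        new = z + omega * np.where(missing, fitted - z, 0.0)
        if np.max(np.abs(new - z)) < tol:
            z = new
            break
        z = new
    return z

# 3 x 3 additive layout with one missing cell (marked True in the mask)
y = np.array([[10., 12., 14.],
              [11., 13., 15.],
              [12., 14., 0.]])   # value at (2, 2) is unknown
mask = np.zeros_like(y, dtype=bool)
mask[2, 2] = True
est = iterate_missing(y, mask)   # converges to 16 for this additive table
```

The convergence rate is governed by the leverage of the missing cell under the design, which is why the relaxation parameter can accelerate (or, if mischosen, break) the iteration, as the paper analyzes.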
One important step in the renormalization-group (RG) approach to a lattice sandpile model is the exact enumeration of all possible toppling processes of sandpile dynamics inside a cell for RG transformations. Here we propose a computer algorithm to carry out such exact enumeration for cells of planar lattices in the RG approach to the Bak-Tang-Wiesenfeld sandpile model [Phys. Rev. Lett. 59, 381 (1987)] and consider both the reduced-height RG equations proposed by Pietronero, Vespignani, and Zapperi (PVZ) [Phys. Rev. Lett. 72, 1690 (1994)], and the real-height RG equations proposed by Ivashkevich [Phys. Rev. Lett. 76, 3368 (1996)]. Using this algorithm, we are able to carry out RG transformations more quickly with larger cell sizes, e.g., the 3×3 cell for the square (SQ) lattice in the PVZ RG equations, which is the largest cell size to date, and find some mistakes in a previous paper [Phys. Rev. E 51, 1711 (1995)]. For SQ and plane triangular (PT) lattices, we obtain the only attractive fixed point for each lattice and calculate the avalanche exponent τ and the dynamical exponent z. Our results suggest that increasing the cell size in the PVZ RG transformation does not lead to more accurate results. The implication of this result is discussed.
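For readers unfamiliar with the model, the elementary BTW toppling rule underlying the enumeration can be sketched as follows (by the Abelian property the relaxation order does not affect the final configuration; the example cell is invented):

```python
import numpy as np

def topple(heights, critical=4):
    """Relax a BTW sandpile configuration on a square lattice: any site at or
    above the critical height topples, sending one grain to each nearest
    neighbour (grains leaving the lattice are lost). Returns the stable
    configuration and the number of topplings (the avalanche size)."""
    h = np.array(heights, dtype=int)
    n_topplings = 0
    while (h >= critical).any():
        i, j = np.argwhere(h >= critical)[0]
        h[i, j] -= critical
        n_topplings += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h.shape[0] and 0 <= nj < h.shape[1]:
                h[ni, nj] += 1
    return h, n_topplings

# A 2x2 cell with one supercritical site triggers a four-toppling avalanche
stable, size = topple([[4, 3],
                       [3, 3]])
```

The RG enumeration described in the abstract amounts to tracking all distinct relaxation processes of this kind inside a cell, weighted by how grains are transferred across the cell boundary.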
Background: Influenza is one of the oldest and deadliest infectious diseases known to man. Reassorted strains of the virus pose the greatest risk to both human and animal health and have been associated with all pandemics of the past century (with the possible exception of the 1918 pandemic), which together resulted in tens of millions of deaths. We have developed and tested new computer algorithms, FluShuffle and FluResort, which enable reassorted viruses to be identified by the most rapid and direct means possible. These algorithms enable reassorted influenza and other viruses to be rapidly identified, allowing prevention strategies and treatments to be more efficiently implemented. Results: The FluShuffle and FluResort algorithms were tested with both experimental and simulated mass spectra of whole virus digests. FluShuffle considers different combinations of viral protein identities that match the mass spectral data using a Gibbs sampling algorithm employing a mixed protein Markov chain Monte Carlo (MCMC) method. FluResort utilizes those identities to calculate the weighted distance of each across two or more different phylogenetic trees constructed through viral protein sequence alignments. Each weighted mean distance value is normalized by conversion to a Z-score to flag a reassorted strain. Conclusions: The new FluShuffle and FluResort algorithms can correctly identify the origins of influenza viral proteins and the number of reassortment events required to produce the strains from the high resolution mass spectral data of whole virus proteolytic digestions. This has been demonstrated in the case of constructed vaccine strains as well as common human seasonal strains of the virus. The algorithms significantly improve the capability of the proteotyping approach to identify reassorted viruses that pose the greatest pandemic risk.
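The final normalization step is an ordinary Z-score. A minimal sketch with invented distance values for eight gene segments (the threshold and data are illustrative, not from the paper):

```python
import statistics

def z_scores(values):
    """Convert weighted mean distances to Z-scores: (x - mean) / stdev.
    An unusually large Z-score flags a protein whose phylogenetic placement
    is inconsistent with the rest of the strain, i.e. a candidate
    reassortment event."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical weighted mean distances for eight gene segments;
# the last segment sits far from the others on the phylogenetic tree
distances = [0.10, 0.12, 0.11, 0.09, 0.13, 0.10, 0.11, 0.45]
zs = z_scores(distances)
print(zs[-1] > 2)  # True: the last segment stands out as an outlier
```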