Many observations in the biological field can adequately be fitted to an exponential regression equation, commonly of the form y = α + β·e^(γx). However, exact confidence limits for the parameters remain a problem. An algorithm to solve this problem is proposed here and tested with experimental data obtained from ventricular pressure measurements in isolated working small-animal hearts. A subroutine based on this algorithm is written in Borland's Turbo Pascal but is easily portable to other languages. (C) 1997 Elsevier Science Ireland Ltd.
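As an illustration of the model referred to above, the following is a minimal sketch of fitting y = α + β·e^(γx) with scipy and reporting approximate asymptotic confidence intervals from the covariance matrix. It is not the exact-limit algorithm of the paper; the data and starting values are hypothetical.

```python
# Minimal sketch: fit y = alpha + beta * exp(gamma * x) and report approximate
# (asymptotic) confidence limits from the covariance matrix. This is NOT the
# exact-limit algorithm of the paper; data and starting values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import t

def model(x, alpha, beta, gamma):
    return alpha + beta * np.exp(gamma * x)

# Hypothetical pressure-decay-like data
x = np.linspace(0.0, 1.0, 30)
rng = np.random.default_rng(0)
y = model(x, 10.0, 90.0, -3.0) + rng.normal(0.0, 1.0, x.size)

popt, pcov = curve_fit(model, x, y, p0=(5.0, 50.0, -1.0))
se = np.sqrt(np.diag(pcov))
dof = x.size - len(popt)
tval = t.ppf(0.975, dof)            # 95% two-sided critical value

for name, p, s in zip(("alpha", "beta", "gamma"), popt, se):
    print(f"{name} = {p:.3f}  approx. 95% CI [{p - tval*s:.3f}, {p + tval*s:.3f}]")
```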
Background. In large-scale epidemiological studies of stillbirths and neonatal deaths, a method is needed to replace detailed medical record audits in determining the cause of death. Methods. A computer-based method is presented for determining the cause of death in stillbirths and neonatal deaths. It utilizes information in the Swedish medical registries. The study comprises 6044 dead infants born in Sweden from 1983 to 1990. For each infant the program determines 31 basic characteristics that are important in deciding the cause of death. Based on these characteristics, a modified Wigglesworth classification is used to assign the cause of death. The validity of the method was checked by comparing the computer-generated information with information obtained by scrutinizing medical records for a 10% representative sample (603 infants). Results. Specificity and sensitivity varied across the basic characteristics, but for the modified Wigglesworth cause-of-death classification the concordance was 88%. The weakest data concern intrauterine deaths, for which pertinent information was often missing in the medical registries. Conclusion. The method can be used for large-scale epidemiological studies.
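A hierarchical, rule-based classification of this kind can be sketched as follows. The input fields, thresholds and rule order are illustrative assumptions only; they are not the 31 registry characteristics or the exact modified Wigglesworth rules used in the study.

```python
# Illustrative sketch of a hierarchical, Wigglesworth-style cause-of-death
# classification. The input fields, thresholds and rule order are assumptions
# for demonstration, not the study's 31 registry characteristics.
from dataclasses import dataclass

@dataclass
class Case:
    stillborn: bool
    macerated: bool             # signs of intrauterine death before labour
    major_malformation: bool
    gestational_age_weeks: int
    asphyxia_in_labour: bool
    severe_infection: bool

def classify(case: Case) -> str:
    if case.major_malformation:
        return "congenital malformation"
    if case.stillborn and case.macerated:
        return "antepartum fetal death"
    if case.asphyxia_in_labour:
        return "intrapartum asphyxia"
    if case.gestational_age_weeks < 28:      # illustrative cut-off
        return "immaturity"
    if case.severe_infection:
        return "infection"
    return "other specific condition"

print(classify(Case(False, False, False, 26, False, False)))  # -> immaturity
```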
In contrast with igneous and metamorphic rocks, classical petrochemical calculation methods cannot be used for tropical weathering components (saprolite, bauxitic, ferruginous, siliceous and calcareous laterite) to convert whole-rock chemical analyses into normative mineralogical weight percentages. Weathering profiles are characterized by a mixture of primary and secondary minerals, which are not considered in the classical methods of mineralogical norm calculation. A new petrochemical calculation algorithm is proposed for converting whole-rock chemical analyses into a weathering norm (WN) for several components of tropical weathering profiles. The normative minerals are represented by three primary minerals, six secondary minerals, four primary/secondary mineral pairs, and five minerals which can have both a primary and a secondary origin. This algorithm has been implemented in MINNOR, a Windows application written in Visual Basic, which calculates the mineralogical norm. To test the program, several different types of chemical weathering profiles from South America and Africa have been selected. Special attention is paid to the weathering profile from Omai, Guyana, South America.
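The general idea of a normative allocation can be sketched as below: oxide weight percentages are converted to molar amounts and assigned to minerals in a fixed order. The mineral set, allocation order and example analysis here are a deliberately simplified illustration and not the MINNOR weathering-norm scheme.

```python
# Simplified illustration of a normative allocation: convert oxide wt% to moles
# and assign them to a few secondary minerals in a fixed order. The mineral set
# and allocation order are assumptions, not the MINNOR weathering-norm scheme.
OXIDE_MW = {"SiO2": 60.08, "Al2O3": 101.96, "Fe2O3": 159.69}
MINERAL_MW = {"goethite": 88.85, "kaolinite": 258.16, "gibbsite": 78.00, "quartz": 60.08}

def weathering_norm(wt_pct):
    mol = {ox: wt_pct.get(ox, 0.0) / mw for ox, mw in OXIDE_MW.items()}
    norm = {}

    # All Fe2O3 -> goethite (2 FeOOH per Fe2O3)
    norm["goethite"] = 2.0 * mol["Fe2O3"] * MINERAL_MW["goethite"]

    # Kaolinite Al2Si2O5(OH)4 uses 1 Al2O3 + 2 SiO2 per formula unit
    kln = min(mol["Al2O3"], mol["SiO2"] / 2.0)
    norm["kaolinite"] = kln * MINERAL_MW["kaolinite"]

    # Excess Al2O3 -> gibbsite (2 Al(OH)3 per Al2O3), excess SiO2 -> quartz
    norm["gibbsite"] = 2.0 * (mol["Al2O3"] - kln) * MINERAL_MW["gibbsite"]
    norm["quartz"] = (mol["SiO2"] - 2.0 * kln) * MINERAL_MW["quartz"]
    return norm

# Hypothetical bauxitic laterite analysis (wt%)
print(weathering_norm({"SiO2": 10.0, "Al2O3": 45.0, "Fe2O3": 25.0}))
```

Note that the normative total can exceed the anhydrous oxide total because the hydrous minerals include structural water not present in the input oxides.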
We compare the number of species represented and the spatial pattern of reserve networks derived using five types of reserve selection algorithms on a set of vertebrate distribution data for the State of Oregon (USA). The algorithms compared are: richness-based heuristic algorithms (four variations), weighted rarity-based heuristic algorithms (two variations), progressive rarity-based heuristic algorithms (11 variations), simulated annealing, and a linear programming-based branch-and-bound algorithm. The linear programming algorithm provided optimal solutions to the reserve selection problem, finding either the maximum number of species for a given number of sites or the minimum number of sites needed to represent all species. Where practical, we recommend the use of linear programming algorithms for reserve network selection. However, several simple heuristic algorithms provided near-optimal solutions for these data. The near-optimality, speed and simplicity of heuristic algorithms suggest that they are acceptable alternatives for many reserve selection problems, especially when dealing with large data sets or complicated analyses. (C) 1997 Published by Elsevier Science Ltd
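As an illustration of the simplest of these approaches, the sketch below implements a greedy richness-based heuristic (repeatedly pick the site that adds the most not-yet-represented species) on a hypothetical site-by-species table. It is not any of the specific algorithm variants compared in the paper.

```python
# Sketch of a greedy richness-based reserve selection heuristic: repeatedly pick
# the site that adds the most not-yet-represented species. The site/species data
# are hypothetical; this is not any of the paper's specific algorithm variants.
def greedy_reserve_selection(sites):
    """sites: dict mapping site name -> set of species present."""
    covered, chosen = set(), []
    all_species = set().union(*sites.values())
    while covered != all_species:
        best = max(sites, key=lambda s: len(sites[s] - covered))
        if not sites[best] - covered:
            break                       # remaining species cannot be covered
        chosen.append(best)
        covered |= sites[best]
    return chosen

sites = {
    "A": {"elk", "newt", "owl"},
    "B": {"owl", "vole"},
    "C": {"vole", "newt", "frog"},
    "D": {"frog"},
}
print(greedy_reserve_selection(sites))   # -> ['A', 'C']
```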
NEXUS is a file format designed to contain systematic data for use by computer programs. The goals of the format are to allow future expansion, to include diverse kinds of information, to be independent of particular computer operating systems, and to be easily processed by a program. To this end, the format is modular, with a file consisting of separate blocks, each containing one particular kind of information, and consisting of standardized commands. Public blocks (those containing information utilized by several programs) house information about taxa, morphological and molecular characters, distances, genetic codes, assumptions, sets, trees, etc.; private blocks contain information of relevance to single programs. A detailed description of commands in public blocks is given. Guidelines are provided for reading and writing NEXUS files and for extending the format.
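As a rough illustration of the block structure described above, the following sketch splits a small NEXUS-style file into named blocks. The example content is made up, and the string handling is deliberately naive: it does not cover comments, quoting or the full command grammar of the format.

```python
# Rough sketch: split a NEXUS-style file into named blocks (BEGIN ... END;).
# The example content is hypothetical and the parsing is deliberately naive;
# it does not handle comments, quoting or the full command grammar.
import re

EXAMPLE = """#NEXUS
BEGIN TAXA;
    DIMENSIONS NTAX=3;
    TAXLABELS taxon_1 taxon_2 taxon_3;
END;
BEGIN TREES;
    TREE one = ((taxon_1,taxon_2),taxon_3);
END;
"""

def read_blocks(text):
    blocks = {}
    for match in re.finditer(r"BEGIN\s+(\w+)\s*;(.*?)END\s*;", text,
                             flags=re.IGNORECASE | re.DOTALL):
        name, body = match.group(1).upper(), match.group(2)
        # Commands within a block are terminated by semicolons
        blocks[name] = [cmd.strip() for cmd in body.split(";") if cmd.strip()]
    return blocks

for name, commands in read_blocks(EXAMPLE).items():
    print(name, commands)
```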
Sports anemia provoked by iron deficiency develops slowly, and although the existence of this condition is still under discussion, ferritin levels tend to fall with sports practice. This paper analyzes the variations in iron metabolism, including the body's stores of this metal, throughout a sports season in a group of professional sportsmen (soccer players) belonging to a team of the Spanish First Division. To determine the iron stores, a computer program was developed that takes into account the ferritin concentration, hemoglobin, and transferrin saturation. The results show that at the end of the season, when the iron supplementation given at the earlier sampling points was no longer administered, both the iron stores and the serum ferritin concentration decreased significantly, although these decreases cannot be considered a prelatent (grade I) anemia. (C) 1997 Elsevier Science Inc.
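A simple staging of iron status from the three measurements mentioned can be sketched as follows. The cut-off values are conventional textbook figures chosen purely for illustration; they are not the program or the reference ranges used in the study.

```python
# Illustrative staging of iron status from ferritin, transferrin saturation and
# hemoglobin. The cut-offs are conventional illustrative values, not the
# program or reference ranges used in the study.
def iron_status(ferritin_ng_ml, transferrin_sat_pct, hemoglobin_g_dl):
    if hemoglobin_g_dl < 13.0:           # illustrative cut-off for adult men
        return "iron-deficiency anemia (grade III)"
    if transferrin_sat_pct < 16.0:       # illustrative cut-off
        return "latent iron deficiency (grade II)"
    if ferritin_ng_ml < 20.0:            # illustrative cut-off
        return "prelatent iron deficiency (grade I): depleted stores"
    return "normal iron stores"

print(iron_status(ferritin_ng_ml=15.0, transferrin_sat_pct=25.0, hemoglobin_g_dl=14.5))
```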
A method is reviewed for converting a microphone signal into calibrated Sound Pressure Level (SPL) units. The method follows the American National Standards Institute (ANSI) S1.4 standard for SPL meters and requires an accurate SPL meter and an accurate calibration sound source for the conversion. Accuracy and validation data from test signals and human phonation are provided. The results indicate that under typical speech conditions, an absolute accuracy of ±1.6 dB (type 1 SPL meter) can be obtained with a miniature head-mounted microphone.
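The core of such a conversion can be sketched as follows: record the microphone signal while a calibrator of known SPL is applied, and use that recording as the reference level for later measurements. The signals and the 94 dB calibrator level below are illustrative assumptions, not the paper's protocol.

```python
# Sketch of microphone-to-SPL conversion using a calibrator of known level:
# the offset between the calibrator's RMS and its stated SPL is applied to all
# later recordings. The signals and the 94 dB calibrator level are illustrative.
import numpy as np

def rms(signal):
    return np.sqrt(np.mean(np.square(signal)))

def spl_offset(calibration_recording, calibrator_spl_db=94.0):
    """dB offset that maps the microphone's arbitrary units onto SPL."""
    return calibrator_spl_db - 20.0 * np.log10(rms(calibration_recording))

def to_spl(recording, offset_db):
    return 20.0 * np.log10(rms(recording)) + offset_db

# Hypothetical signals in arbitrary ADC units
fs = 16000
t = np.arange(fs) / fs
cal_tone = 0.2 * np.sin(2 * np.pi * 1000 * t)     # recorded during calibration
speech   = 0.05 * np.sin(2 * np.pi * 200 * t)     # later measurement

offset = spl_offset(cal_tone)
print(f"Estimated SPL: {to_spl(speech, offset):.1f} dB")
```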
The use of confidence intervals to estimate population parameters is briefly reviewed. Exact binomial confidence intervals can be calculated through the use of tables or statistical software packages. As an alternative, a microcomputer program has been developed to calculate sensitivity and specificity, as well as point estimates and binomial confidence intervals for the false-negative and false-positive rates, positive and negative predictive power, prevalence of cases and non-cases, correct classification rate, and misclassification rate. Characteristics of the computer program, 'AccuCon', which is available from the authors, are described. (C) 1997 Elsevier Science Ireland Ltd.
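As an example of the kind of quantities such a program computes, the sketch below derives sensitivity, specificity and exact (Clopper-Pearson) binomial confidence intervals from a 2x2 table. The counts are hypothetical and this is not the AccuCon code.

```python
# Sketch: sensitivity, specificity and exact (Clopper-Pearson) 95% binomial
# confidence intervals from a hypothetical 2x2 table. Not the AccuCon program.
from scipy.stats import beta

def exact_binomial_ci(successes, n, level=0.95):
    a = 1.0 - level
    lower = beta.ppf(a / 2, successes, n - successes + 1) if successes > 0 else 0.0
    upper = beta.ppf(1 - a / 2, successes + 1, n - successes) if successes < n else 1.0
    return lower, upper

# Hypothetical counts: true positives, false positives, false negatives, true negatives
tp, fp, fn, tn = 45, 5, 10, 90

print("sensitivity %.3f, 95%% CI %s" % (tp / (tp + fn), exact_binomial_ci(tp, tp + fn)))
print("specificity %.3f, 95%% CI %s" % (tn / (tn + fp), exact_binomial_ci(tn, tn + fp)))
```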
We discuss the use of regression diagnostics combined with nonlinear least-squares to refine cell parameters from powder diffraction data, presenting a method which minimizes residuals in the experimentally-determined quantity (usually 2θ(hkl) or energy, E(hkl)). Regression diagnostics, particularly deletion diagnostics, are invaluable in detection of outliers and influential data which could be deleterious to the regressed results. The usual practice of simple inspection of calculated residuals alone often fails to detect the seriously deleterious outliers in a dataset, because bare residuals provide no information on the leverage (sensitivity) of the datum concerned. The regression diagnostics which predict the change expected in each cell constant upon deletion of each observation (hkl reflection) are particularly valuable in assessing the sensitivity of the calculated results to individual reflections. A new computer program, implementing nonlinear regression methods and providing the diagnostic output, is described.
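For a cubic cell, the approach described above can be sketched in a few lines: refine the lattice parameter by nonlinear least squares on observed 2θ(hkl) values, then report, for each reflection, how much the refined parameter changes when that reflection is deleted. The wavelength, reflections and "observed" angles are hypothetical, and general cells, weighting and the paper's full diagnostics are not handled here.

```python
# Sketch: refine a cubic lattice parameter from observed 2-theta values by
# nonlinear least squares, then compute deletion diagnostics (change in the
# refined parameter when each reflection is removed). Wavelength, reflections
# and "observed" angles are hypothetical; general cells are not handled.
import numpy as np
from scipy.optimize import least_squares

WAVELENGTH = 1.5406                      # Cu K-alpha 1 (Angstrom), assumed

def two_theta(a, hkl):
    h, k, l = np.asarray(hkl).T
    d = a / np.sqrt(h**2 + k**2 + l**2)
    return 2.0 * np.degrees(np.arcsin(WAVELENGTH / (2.0 * d)))

def refine(hkl, obs, a0=4.0):
    res = least_squares(lambda p: two_theta(p[0], hkl) - obs, x0=[a0])
    return res.x[0]

hkl = [(1, 1, 1), (2, 0, 0), (2, 2, 0), (3, 1, 1), (2, 2, 2)]
obs = two_theta(4.0786, hkl) + np.array([0.01, -0.02, 0.01, 0.15, -0.01])  # one outlier

a_all = refine(hkl, obs)
print(f"refined a = {a_all:.5f} A")
for i, refl in enumerate(hkl):
    mask = np.arange(len(hkl)) != i
    a_del = refine([hkl[j] for j in range(len(hkl)) if mask[j]], obs[mask])
    print(f"delete {refl}: a changes by {a_del - a_all:+.5f} A")
```

The deletion diagnostic flags the deliberately perturbed (3, 1, 1) reflection, which a glance at raw residuals alone would weight no differently from the others.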
We have developed a least-squares refinement procedure that performs three-dimensional alignment and averaging of objects from multiple reconstructions in an automated way. The computer implementation aligns the three-dimensional structures by a two-step procedure that maximizes the density overlap for all objects. First, an initial average density is built by successive incorporation of individual objects, after a global search for their optimal three-dimensional orientations. Second, the initial average is refined by excluding individual objects one at a time, realigning them with the reduced average containing all other objects, and including them in the average again. The refinement is repeated until no further change of the average occurs. The resulting average model is therefore minimally biased by the order in which the individual reconstructions are incorporated into the average. The performance of the procedure was tested using a synthetic data set of randomly oriented objects with Poisson-distributed noise added. The program successfully aligned and averaged the objects at a signal-to-noise ratio of 1.0. In all investigated cases the increase in signal-to-noise ratio was almost equal to the expected square root of the number of objects. The program was also successfully tested on a set of authentic three-dimensional reconstructions from an in situ specimen containing Escherichia coli 70S ribosomes, where the immediate environment of the reconstructed objects may also contain variable amounts of other structures. (C) 1997 Academic Press.
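The two-step averaging idea can be sketched as below. For simplicity the orientation search here is restricted to the 24 axis-aligned rotations of a cubic volume, with no translations and no fine angular refinement, so it is a simplification of the global search described in the paper; the test data, noise model and convergence criterion (a fixed number of passes) are likewise assumptions.

```python
# Sketch of the two-step alignment/averaging idea: build an initial average by
# successive incorporation, then refine each object against the reduced average
# of all others. Orientation search is limited to the 24 axis-aligned rotations
# (no translations); data and convergence criterion are illustrative only.
import numpy as np

def cube_rotations(vol):
    """Yield the 24 axis-aligned proper rotations of a cubic 3-D array."""
    def spins(v):
        for k in range(4):
            yield np.rot90(v, k, axes=(1, 2))
    for face in (vol,
                 np.rot90(vol, 1, axes=(0, 2)), np.rot90(vol, 2, axes=(0, 2)),
                 np.rot90(vol, 3, axes=(0, 2)),
                 np.rot90(vol, 1, axes=(0, 1)), np.rot90(vol, 3, axes=(0, 1))):
        yield from spins(face)

def align(vol, reference):
    """Return the rotation of vol with the largest density overlap with reference."""
    return max(cube_rotations(vol), key=lambda v: np.sum(v * reference))

def average_objects(objects, n_passes=5):
    # Step 1: initial average by successive incorporation of aligned objects
    aligned = [objects[0].astype(float)]
    avg = aligned[0]
    for obj in objects[1:]:
        aligned.append(align(obj, avg))
        avg = np.mean(aligned, axis=0)
    # Step 2: leave-one-out refinement against the reduced average
    for _ in range(n_passes):
        for i in range(len(aligned)):
            reduced = np.mean([a for j, a in enumerate(aligned) if j != i], axis=0)
            aligned[i] = align(aligned[i], reduced)
        avg = np.mean(aligned, axis=0)
    return avg

# Synthetic test: one asymmetric object in random axis-aligned orientations + noise
rng = np.random.default_rng(1)
truth = np.zeros((8, 8, 8)); truth[2:6, 3:5, 1:4] = 1.0; truth[5, 4, 5] = 2.0
objs = [list(cube_rotations(truth))[rng.integers(24)] + rng.normal(0, 1.0, truth.shape)
        for _ in range(20)]

avg = average_objects(objs)
# Align the final average back to the known object before comparing
print("correlation with truth:", np.corrcoef(align(avg, truth).ravel(), truth.ravel())[0, 1])
```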