A new algorithm has been designed and tested to identify protein, or other macromolecular, complexes that have been widely reported in mass spectral data. The program takes advantage of the multiply charged ions common to electrospray ionization and, to a lesser extent, matrix-assisted laser desorption/ionization (MALDI) mass spectra. The algorithm, known as COMPLX (COMposition of Protein-Ligand compleXes), is capable of identifying complexes of any protein or macromolecule with a binding partner of molecular mass up to 100000 Da. It does so by identifying ion pairs present in a mass spectrum that, when they share a common charge, have an m/z difference that is an integer fraction of a ligand or binding-partner molecular mass. Several additional criteria must be met for a result to be ranked in the output file: all m/z values for ions of the protein or complex must decrease progressively as their assigned charge increases; the difference between the m/z values of adjacent charge states (z, z+1) must decrease as the assigned charge state increases; and the ratio of any two m/z values assigned to a protein or complex must equal the inverse ratio of their charges. Entries that satisfy these criteria are then ranked according to the appearance of ions associated with the binding partner in the mass spectrum, the length of a continuous series of charges across any set of ions for a protein and complex, and the lowest error recorded for the molecular mass of the ligand or binding partner. A diverse range of hypothetical and experimental mass spectral data was used to implement and test the program, including spectra recorded for antibody-peptide, protein-peptide and protein-heme complexes. Spectra of increasing complexity, in which the number of input m/z values far exceeds the few associated with a macromolecular complex, were also successfully analysed.
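The charge-state consistency rules described in this abstract lend themselves to a short sketch. The following Python is illustrative only (function names and the tolerance are my own, not from the COMPLX source): it checks a candidate ion series against the three ranking criteria and recovers a binding-partner mass from an ion pair sharing a common charge.

```python
PROTON = 1.00728  # Da, mass of a proton

def mz(mass, z):
    """m/z for a molecule of the given neutral mass carrying z protons."""
    return (mass + z * PROTON) / z

def consistent_series(mz_values, charges, tol=0.05):
    """Check the three ranking criteria from the abstract: m/z falls as
    charge rises, adjacent-state spacing shrinks as charge rises, and
    any two m/z values are in the inverse ratio of their charges."""
    pairs = sorted(zip(charges, mz_values))
    zs = [z for z, _ in pairs]
    ms = [m for _, m in pairs]
    # 1. m/z strictly decreases as assigned charge increases
    if any(ms[i] <= ms[i + 1] for i in range(len(ms) - 1)):
        return False
    # 2. spacing between adjacent charge states (z, z+1) decreases
    gaps = [ms[i] - ms[i + 1] for i in range(len(ms) - 1)]
    if any(gaps[i] <= gaps[i + 1] for i in range(len(gaps) - 1)):
        return False
    # 3. ratio of m/z values ~ inverse ratio of charges; exact up to the
    #    small proton-mass contribution, so a loose tolerance is used here
    for i in range(len(ms)):
        for j in range(i + 1, len(ms)):
            if abs(ms[i] / ms[j] - zs[j] / zs[i]) > tol:
                return False
    return True

def ligand_mass(mz_free, mz_complex, z):
    """Infer a binding-partner mass from an ion pair sharing charge z:
    the m/z difference times the charge is the ligand mass."""
    return (mz_complex - mz_free) * z
```

For example, ions of a 10000 Da protein at charges 8-10 pass all three checks, and the m/z shift between free protein and complex at charge 10 returns the ligand mass directly.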
ISBN: (Print) 0780381149
The reliability of network topology is examined using the Event Space Technique. An algorithm is developed to computerize the process of identifying optimal redundancy, and a relationship between link reliability and network reliability is formulated. A cost-effectiveness analysis is applied to determine the optimum redundancy of an Optical Fibre Network (OFN) serving as a backbone to a mobile telecommunication system. The TIME dotCom (TDC) network was taken as the model on which the study is based.
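As a sketch of the Event Space Technique under simple assumptions (equal link reliability, all-terminal connectivity as the success criterion; function names are my own, not from the paper), network reliability can be computed by enumerating every combination of link up/down states and summing the probabilities of the connected ones:

```python
from itertools import product

def connected(nodes, up_links):
    """Union-find check that the surviving links join all nodes."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in up_links:
        parent[find(u)] = find(v)
    return len({find(n) for n in nodes}) == 1

def network_reliability(nodes, links, p):
    """Event-space enumeration: sum the probability of every link-state
    combination in which the network stays connected.
    links: list of (u, v) pairs; p: probability that each link is up."""
    total = 0.0
    for state in product([True, False], repeat=len(links)):
        up = [link for link, s in zip(links, state) if s]
        prob = 1.0
        for s in state:
            prob *= p if s else (1 - p)
        if connected(nodes, up):
            total += prob
    return total
```

On a three-node ring with link reliability 0.9, this gives 0.9^3 + 3(0.9^2)(0.1) = 0.972, and adding a redundant link can only raise the figure, which is the gain the cost-effectiveness analysis weighs against cost. Note the enumeration is exponential in the number of links, so it suits small backbone topologies only.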
Purpose: To develop a computer algorithm to measure myocardial infarct size on gadolinium-enhanced magnetic resonance (MR) imaging and to validate this method against a canine histopathological reference. Materials and Methods: Delayed enhancement MR was performed in 11 dogs with myocardial infarction (MI) determined by triphenyltetrazolium chloride (TTC) staining. Infarct size on in vivo and ex vivo images was measured by a computer algorithm based on automated feature analysis and combined thresholding (FACT). For comparison, infarct size by human manual contouring and by simple intensity thresholding (based on two standard deviations [2SD] and full width at half maximum [FWHM]) was studied. Results: Both in vivo and ex vivo MR infarct size measured by the FACT algorithm correlated well with TTC (R = 0.95-0.97) and showed no significant bias on Bland-Altman analysis (P = not significant). Despite similar correlations (R = 0.91-0.97), human manual contouring overestimated in vivo MR infarct size by 5.4% of the left ventricular (LV) area (equivalent to 55.1% of the MI area) vs. TTC (P < 0.001). Infarct size measured by simple intensity thresholding was less accurate than the proposed algorithm (P < 0.001 and P = 0.007). Conclusion: The FACT algorithm accurately measured MI size on delayed enhancement MR imaging in vivo and ex vivo, and was more accurate than human manual contouring and simple intensity thresholding approaches.
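The two simple thresholding baselines against which FACT was compared are easy to state in code. This is a hedged sketch only (pixel lists and names are illustrative; the actual FACT feature analysis is more involved and is not reproduced here): the 2SD cut-off comes from remote normal myocardium, the FWHM cut-off from the peak enhanced intensity.

```python
import statistics

def threshold_2sd(remote_pixels):
    """2SD baseline: threshold = mean + 2 SD of intensities sampled
    from remote (normal) myocardium."""
    return statistics.mean(remote_pixels) + 2 * statistics.stdev(remote_pixels)

def threshold_fwhm(infarct_core_pixels):
    """FWHM baseline: half the peak enhanced intensity."""
    return max(infarct_core_pixels) / 2

def infarct_fraction(myocardium_pixels, threshold):
    """Fraction of myocardial pixels classified as hyperenhanced (infarct)."""
    hyper = sum(1 for v in myocardium_pixels if v >= threshold)
    return hyper / len(myocardium_pixels)
```

A usage sketch: with remote intensities near 100 and two bright pixels at 300 among ten, either threshold classifies 20% of the myocardium as infarct.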
Gene expression analysis by differential display (DD) is limited by the labor-intensive visual evaluation of the electrophoretic data traces. We describe a flexible method for computer-assisted ranking of expression patterns in data from DD experiments. The method is based on a pairwise alignment and comparison of the quantitative trace data with respect to specific expression patterns defined by the investigator. The observed patterns are ranked by a score that identifies the most promising findings, so that only these, rather than the vast amount of original results, need to be confirmed visually. This two-step approach, enabled by an efficient computer algorithm for gene expression pattern comparison, will increase the percentage of true-positive findings chosen for the tedious downstream processing, while minimizing the cost and labor involved in large-scale DD data analysis.
Terminal restriction fragment length polymorphism (tRFLP) is a potentially high-throughput method for the analysis of complex microbial communities. Comparison of multiple tRFLP profiles to identify shared and unique components of microbial communities, however, is done manually, which is both time-consuming and error-prone. This paper describes a freely accessible web-based program, T-Align (http://***/~talign/), which addresses this problem. Initially, replicate profiles are compared and used to generate a single consensus profile containing only terminal restriction fragments that occur in all replicate profiles. Subsequently, consensus profiles representing different communities are compared to produce a list showing whether a terminal restriction fragment (TRF) is present in a particular sample and its relative fluorescence intensity. The use of T-Align thus allows rapid comparison of numerous tRFLP profiles. T-Align is demonstrated by alignment of tRFLP profiles generated from bacterioplankton communities collected from the Irish and Celtic Seas in November 2000. Ubiquitous TRFs and site-specific TRFs were identified using T-Align. (c) 2005 Federation of European Microbiological Societies. Published by Elsevier B.V. All rights reserved.
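A minimal sketch of the two T-Align stages, assuming each profile is a mapping from TRF size (bp) to fluorescence intensity (the data layout and function names are my own, not taken from the tool itself):

```python
def consensus_profile(replicates):
    """Stage 1: keep only TRF sizes present in every replicate profile,
    carrying the mean intensity of each retained fragment."""
    shared = set.intersection(*(set(r) for r in replicates))
    return {trf: sum(r[trf] for r in replicates) / len(replicates)
            for trf in sorted(shared)}

def align_profiles(profiles):
    """Stage 2: tabulate which TRFs occur in which sample, with the
    relative fluorescence intensity (None marks absence).
    profiles: dict of sample name -> consensus profile."""
    all_trfs = sorted({t for p in profiles.values() for t in p})
    table = {}
    for name, prof in profiles.items():
        total = sum(prof.values())
        table[name] = {t: (prof[t] / total if t in prof else None)
                       for t in all_trfs}
    return table
```

The intersection step is what discards irreproducible peaks; the alignment step then makes ubiquitous versus site-specific TRFs visible as rows that are filled across all samples versus only one.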
In order to increase the productivity of turning processes, several attempts have been made in the recent past at tool wear estimation and classification in turning operations. Tool flank and crater wear can be predicted by a number of models, including statistical, pattern recognition, quantitative and neural network models. In this paper, a computer algorithm implementing new quantitative models for flank and crater wear estimation is presented. First, a quantitative model based on a correlation between increases in feed and radial forces and the average width of flank wear is developed. Then another model is presented which relates acoustic emission (AE(rms)) in the turning operation to the flank and crater wear developed on the tool. The flank wear estimated by the first model is then employed in the second model to predict the crater wear on the tool insert. The influence of flank and crater wear on AE(rms) generated during the turning operation has also been investigated. Additionally, the chip-flow direction and the tool-chip rake face interfacing area are examined. The experimental results indicate that the computer program developed, based on the algorithm mentioned above, estimates tool flank wear with high accuracy. (C) 2002 Elsevier Science Ltd. All rights reserved.
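The two-stage chaining of the models (forces give flank wear, which together with AE gives crater wear) can be sketched as follows. All coefficients and the linear AE form here are illustrative placeholders, not the regression values or model structure from the paper:

```python
def flank_wear(d_feed_force, d_radial_force, k1=0.012, k2=0.008):
    """First model (sketch): average flank-wear width VB (mm) estimated
    from the increases in feed and radial forces (N). k1 and k2 stand in
    for regression coefficients fitted to cutting experiments."""
    return k1 * d_feed_force + k2 * d_radial_force

def crater_wear(ae_rms, vb, a=0.5, b=2.0, c=10.0):
    """Second model (sketch): invert a hypothetical linear AE relation
    AE_rms = a + b*VB + c*KT to estimate crater depth KT (mm) from the
    measured AE_rms and the flank wear VB from the first model."""
    return max(0.0, (ae_rms - a - b * vb) / c)
```

The point of the chaining is that VB is observable indirectly through forces, so KT can be recovered from AE without direct crater measurement; with real coefficients the same two calls would apply.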
Objective: To construct and validate a computer instrument that identifies asthma patients receiving theoretically suboptimal drug therapy in community pharmacies, using patient medication records. This selection enables the pharmacist to assist these patients in using their medicines appropriately. Methods: Following the Dutch asthma guidelines, which describe a stepwise approach, the optimum use of drugs at each treatment level was expressed in defined daily doses (DDDs) per pharmacological drug group over a period of one year, in order to define correct medication-use profiles for each level. An algorithmic computer instrument was developed to select patients whose medication use deviates from these profiles. Using nine different selection profiles, the instrument stratified patients according to the medication records filed in the pharmacy computer. Patient medication records in four community pharmacies were investigated to validate the selection profiles as indicators of theoretically suboptimal drug use by asthma patients. Validation was performed by comparing the professional judgement of participating pharmacists with the selections made by the computer. Main outcome measures: Positive and negative predictive values of the selection made by the algorithmic computer instrument, and the rate of false-positive results. Results: The computer instrument identified asthma patients using theoretically suboptimal drug therapy with approximately 95% predictive value compared with the professional judgement of the pharmacists. The rate of false-positive results was 5%. Conclusion: The results of the algorithmic computer instrument and the professional judgement of the pharmacists are in close agreement. The instrument will be utilised in further research in the IPMP study (Interventions on the principle of Pulmonary Medication Profiles) investigating the role of Dutch community pharmacists in counselling patients.
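A hedged sketch of the DDD-based selection logic: yearly totals per drug group are compared against acceptable ranges for a guideline level. The nine real selection profiles and the actual guideline ranges are not reproduced here; all names and numbers below are illustrative.

```python
def yearly_ddd(dispensings, ddd_per_unit):
    """Total defined daily doses dispensed per drug group over one year.
    dispensings: list of (drug_group, units_dispensed);
    ddd_per_unit: DDDs contained in one dispensed unit of each group."""
    totals = {}
    for group, units in dispensings:
        totals[group] = totals.get(group, 0.0) + units * ddd_per_unit[group]
    return totals

def flag_patient(totals, profile):
    """Flag a patient whose yearly use deviates from a guideline profile.
    profile: drug group -> (min_ddd, max_ddd) acceptable per year.
    Returns the list of deviant groups; an empty list means no flag."""
    deviant = []
    for group, (lo, hi) in profile.items():
        used = totals.get(group, 0.0)
        if not (lo <= used <= hi):
            deviant.append(group)
    return deviant
```

For instance, a hypothetical profile pairing heavy bronchodilator use with too little inhaled corticosteroid would flag both groups, which is exactly the kind of theoretically suboptimal pattern the pharmacist would then review.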
Researchers have identified several problems in measuring the strongest path connecting pairs of actors in valued graphs. To address these problems, it has been proposed that average path value be used to indicate optimal connections between dyads. However, the lack of a proper computer algorithm and implementation has hindered wide-ranging application of the proposed solution. In this paper we develop a computer algorithm and fully implement it with four Java programs, which are available on request. These programs produce an optimal connection matrix, which is subsequently input into UCINET for further multidimensional scaling and clustering analysis. We demonstrate this procedure with a data matrix containing 38 organizations in information technology. We discuss the methodological implications of the application of our algorithm to future social network studies.
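The average-path-value idea can be sketched in a few lines, assuming the value of a path is the mean of the tie values along it and the optimal connection for a dyad is the simple path maximizing that mean. Unlike the authors' Java implementation, this is a brute-force DFS sketch with names of my own choosing, suitable only for small graphs:

```python
def average_path_value(graph, start, end):
    """Best average path value between two actors in a valued graph:
    enumerate simple paths by depth-first search and return the highest
    mean tie value. graph: {node: {neighbor: tie_value}}."""
    best = 0.0
    def dfs(node, visited, values):
        nonlocal best
        if node == end:
            if values:
                best = max(best, sum(values) / len(values))
            return
        for nbr, v in graph.get(node, {}).items():
            if nbr not in visited:
                dfs(nbr, visited | {nbr}, values + [v])
    dfs(start, {start}, [])
    return best

def optimal_connection_matrix(graph):
    """Matrix of best average path values for every dyad, the structure
    that is then fed to MDS/clustering (e.g. via UCINET, as in the paper)."""
    nodes = sorted(graph)
    return {a: {b: (average_path_value(graph, a, b) if a != b else 0.0)
                for b in nodes} for a in nodes}
```

Note how the indirect path A-B-C with tie values 1 and 5 (mean 3) can outrank a direct A-C tie of value 2, which is precisely the behavior that distinguishes average path value from strongest-path measures.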
PURPOSE: Computer algorithms are often used for cardiac rhythm interpretation, with the result subsequently corrected by an overreading physician. The purpose of this study was to assess the incidence and clinical consequences of misdiagnosis of atrial fibrillation based on a 12-lead electrocardiogram (ECG). METHODS: We retrieved 2298 ECGs with the computerized interpretation of atrial fibrillation from 1085 patients. The ECGs were reinterpreted to determine the accuracy of the interpretation. In patients in whom the interpretation was incorrect, we reviewed the medical records to assess the clinical consequences resulting from misdiagnosis. RESULTS: We found that 442 ECGs (19%) from 382 (35%) of the 1085 patients had been incorrectly interpreted as atrial fibrillation by the computer algorithm. In 92 patients (24%), the physician ordering the ECG had failed to correct the inaccurate interpretation, resulting in a change in management and initiation of inappropriate treatment, including antiarrhythmic medications and anticoagulation in 39 patients (10%), as well as unnecessary additional diagnostic testing in 90 patients (24%). A final diagnosis of paroxysmal atrial fibrillation based on the initial incorrect interpretation of the ECGs was generated in 43 patients (11%). CONCLUSION: Incorrect computerized interpretation of atrial fibrillation, combined with the failure of the ordering physician to correct the erroneous interpretation, can result in the initiation of unnecessary, potentially harmful medical treatment as well as inappropriate use of medical resources. Greater efforts should be directed toward educating physicians about the electrocardiographic appearance of atrial dysrhythmias and the recognition of confounding artifacts. (C) 2004 by Elsevier Inc.
ISBN: (Print) 0819452831
Purpose: To study the frequency and severity of artifacts in optical coherence tomography (OCT) images and to develop a new algorithm for improved retinal thickness detection. Methods: We propose a new method to measure retinal thickness in OCT scans. We compared our modified edge detection (MED) method to the Markov method and the conventional OCT algorithm (cOCT) in 226 OCT macular scans. Results: We defined an error as a detected interface location offset of less than 100 μm affecting fewer than 10 A-scans; larger deviations were classified as artifacts. The frequency of errors was reduced from 32% (cOCT) to less than 2% with the MED method, while the Markov method had a frequency of 5%. Artifacts were reduced from 9.3% (cOCT) to 0.9% (MED), while the Markov method had a frequency of 11.5%. Conclusion: The results show that the MED method of detecting retinal thickness is superior to the other two methods, since the cOCT method is prone to both errors and artifacts and the Markov method is robust only for healthy retinas. Our MED method is robust for detection of normal retinas and effective even in eyes with pathological conditions. Use of an improved retinal thickness detection algorithm should significantly improve the clinical utility of optical coherence tomography.
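As an illustration of gradient-based interface detection on a single A-scan, the following is a much-simplified stand-in for the MED method (the smoothing window, the axial pixel pitch, and the choice of strongest rising/falling edges are all assumptions of this sketch, not details from the paper):

```python
def detect_thickness(ascan, pixel_um=4.0):
    """Retinal thickness from one OCT A-scan (a list of reflectivity
    values along depth). Smooths with a 3-sample moving average to
    suppress speckle, then takes the strongest positive gradient as the
    inner retinal boundary and the strongest negative gradient as the
    outer boundary. pixel_um is the assumed axial sampling pitch."""
    n = len(ascan)
    sm = []
    for i in range(n):
        window = ascan[max(0, i - 1):i + 2]
        sm.append(sum(window) / len(window))
    grad = [sm[i + 1] - sm[i] for i in range(n - 1)]
    top = max(range(len(grad)), key=lambda i: grad[i])     # strongest rise
    bottom = min(range(len(grad)), key=lambda i: grad[i])  # strongest fall
    return abs(bottom - top) * pixel_um
```

On a synthetic A-scan with a bright 20-pixel band, the two detected edges bracket the band and the returned thickness is close to 20 pixels times the pitch; the smoothing blurs each edge by about one sample, which is why a real method needs the artifact handling this sketch omits.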