Cancer immunoediting reflects the role of the immune system in eliminating tumor cells and shaping tumor immunogenicity, which leaves marks in the genome. In this study, we systematically evaluate four methods for quantifying immunoediting. In colorectal cancer samples from The Cancer Genome Atlas, these methods identified 78.41%, 46.17%, 36.61%, and 4.92% of samples as immunoedited, respectively, together covering 92.90% of all colorectal cancer samples. Comparison of 10 patient-derived xenografts (PDXs) with their original tumors showed that, depending on the method, reduced immune selection in PDXs was identified in 44.44% to 60.0% of pairs. The proportion of such PDX-tumor pairs increases to 77.78% when the results of multiple methods are combined, indicating that these methods are complementary. We find that observed-to-expected ratios depend strongly on neoantigen selection criteria and reference datasets, whereas the HLA-binding mutation ratio, immune dN/dS, and the enrichment score of cancer cell fraction are less affected by these factors. Our findings suggest that integrating multiple methods may benefit future immunoediting analyses.
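As a rough illustration of the observed-to-expected style of immunoediting metric named above, the sketch below computes an O/E ratio from hypothetical mutation counts; the function name, numbers, and the flat expected-binder rate are assumptions for demonstration, not values or procedures from the study.

```python
# Illustrative sketch of an observed-to-expected (O/E) immunoediting ratio.
# Counts and the expected-rate estimate are hypothetical; real analyses derive
# the expectation from reference datasets of unselected mutations.

def observed_to_expected_ratio(n_neoantigens_observed, n_nonsyn_mutations,
                               expected_neoantigen_rate):
    """O/E < 1 is commonly read as depletion of predicted neoantigens
    relative to expectation, i.e. evidence of immune selection."""
    expected = n_nonsyn_mutations * expected_neoantigen_rate
    return n_neoantigens_observed / expected if expected > 0 else float("nan")

# Hypothetical tumor: 120 nonsynonymous mutations, 18 predicted HLA binders,
# background rate of 0.22 predicted binders per nonsynonymous mutation.
ratio = observed_to_expected_ratio(18, 120, 0.22)
print(f"O/E ratio: {ratio:.2f}")  # < 1 suggests neoantigen depletion
```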
In an era inundated with vast amounts of information, the imperative for efficient news classification is paramount. This research explores the sophisticated integration of neural networks and convolutional neural networks (CNN) with Generative Pre-trained Transformers (GPT) to enhance the precision and efficacy of news categorization. The rapid digital dissemination of news necessitates advanced computational methodologies capable of accurate classification and event prediction, including financial and economic events. Leveraging recent advancements in machine learning and natural language processing (NLP), this study utilizes large language models (LLMs) such as GPT and BERT, known for their exceptional comprehension and generation of human-like text. Over 232 days, our methodology classified 33,979 news articles into Education & Learning, Health & Medicine, and Science & Technology, with further subcategorization into 32 distinct subcategories. For evaluation, a sample of 5000 articles was assessed using metrics such as True Positive (TP), True Negative (TN), False Positive (FP), False Negative (FN), Precision, Recall, and F1-Score. In comparison with existing studies, the proposed method achieved significantly higher scores, with averages of 0.986 (Precision), 0.987 (Recall), and 0.987 (F1-Score). This research offers substantial practical contributions, providing detailed insights into news source contributions, effective anomaly detection, and predictive trend analysis using neural networks. The theoretical contributions are profound, demonstrating the mathematical integration of GPT with CNNs and recurrent neural networks. This integration advances computational news classification and exemplifies how sophisticated mathematical frameworks enhance large-scale text data analysis, marking a pivotal advancement in applying advanced computational methods in real-world scenarios.
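The evaluation metrics listed in this abstract follow standard definitions; the sketch below shows how Precision, Recall, and F1-Score are derived from confusion-matrix counts, using hypothetical numbers rather than the paper's results.

```python
# Minimal sketch of the evaluation metrics named in the abstract
# (Precision, Recall, F1), computed from hypothetical confusion counts.

def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Hypothetical counts from a labeled evaluation sample of articles.
p, r, f1 = precision_recall_f1(tp=4870, fp=70, fn=65)
print(f"Precision={p:.3f} Recall={r:.3f} F1={f1:.3f}")
```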
New Testament studies has in recent years seen an increase in the use of digital methods, but some of the more advanced methods still lack proper integration. This article explores some of the advantages and disadvantages of employing computational/algorithmic approaches, such as so-called semantic models of word embedding and topic modelling analysis. The article is structured into three main parts. The first part (1) introduces the reader to the field of computational studies in literary, historical, and religious research areas and outlines the computational methods, namely topic modelling and word embedding. The second part (2) showcases two computational tools for analyzing New Testament narratives. The third part (3) discusses and compares how the methodology of applying computational techniques can maintain and advance a focus on the historical and literary context of New Testament texts. The specific problem the article addresses is how computational methods can be wielded without sacrificing contact with the text and its historical context. We argue that applying computational methods in New Testament hermeneutics necessarily involves methodological pros and cons. These computationally assisted analyses can be regarded as old wine in new wineskins: classic hermeneutical questions can be posed with new methods.
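As a hedged illustration of the topic-modelling approach discussed above, the snippet below fits a two-topic LDA model to a toy corpus with scikit-learn; the corpus, topic count, and library choice are assumptions for demonstration, not the article's actual texts or setup.

```python
# Toy topic-modelling sketch with scikit-learn's LDA on a tiny invented corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

corpus = [
    "jesus spoke in parables about the kingdom",
    "paul wrote letters to the early churches",
    "the disciples followed jesus through galilee",
    "grace and faith are central themes in the letters",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(corpus)          # document-term count matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top = [terms[j] for j in weights.argsort()[-4:][::-1]]   # top terms per topic
    print(f"topic {i}: {', '.join(top)}")
```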
This paper proposes an innovative approach to social science research based on quantum theory, integrating quantum probability, quantum game theory, and quantum statistical methods into a comprehensive interdisciplinary framework for both theoretical and empirical research. The study elaborates on how core quantum concepts such as superposition, interference, and measurement collapse can be applied to model social decision making, cognition, and interaction. Quantum computational methods and algorithms are employed to transition from theoretical model development to simulation and experimental validation. Through case studies in international relations, economic games, and political decision making, the research demonstrates that quantum models possess significant advantages in explaining irrational and context-dependent behaviors that traditional methods often fail to capture. The paper also explores the potential applications of quantum social science in policy formulation and public decision making, addresses the ethical, privacy, and social equity challenges posed by quantum artificial intelligence, and outlines future research directions at the convergence of quantum AI, quantum machine learning, and big data analytics. The findings suggest that quantum social science not only offers a novel perspective for understanding complex social phenomena but also lays the foundation for more accurate and efficient systems in social forecasting and decision support.
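The role of superposition and interference described above can be illustrated with a toy quantum-probability calculation: when two decision paths remain unresolved, their complex amplitudes add before squaring, so the resulting probability differs from the classical sum. The amplitudes and relative phase below are hypothetical, not drawn from the paper.

```python
# Toy illustration of quantum-probability interference for a binary decision.
import numpy as np

# Amplitudes for reaching the decision "accept" via two unresolved paths.
a1 = 0.6 * np.exp(1j * 0.0)
a2 = 0.5 * np.exp(1j * 2.0)   # relative phase drives the interference term

p_classical = abs(a1) ** 2 + abs(a2) ** 2   # paths resolved (measured separately)
p_quantum = abs(a1 + a2) ** 2               # paths held in superposition

print(f"classical (no interference): {p_classical:.3f}")
print(f"quantum (with interference): {p_quantum:.3f}")
```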
In this article, we develop a computational method for an algorithmic process first posed by Polyrakis in 1996 in order to check whether a finite collection of linearly independent positive functions in C[a, b] forms a lattice-subspace. Lattice-subspaces are closely related to a cost minimization problem in the theory of finance that ensures the minimum-cost insured portfolio and this connection is further investigated here. Finally, we propose a computational method in order to solve the minimization problem and to calculate the minimum-cost insured portfolio. All of the numerical work is performed using the Matlab high-level language. (C) 2007 Elsevier Inc. All rights reserved.
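One common way to pose a minimum-cost insured portfolio is as a linear program: minimize purchase cost subject to the portfolio payoff meeting a floor in every state. The sketch below shows that generic formulation (in Python with SciPy rather than the Matlab used in the article); payoffs, prices, and the floor are hypothetical, and this is not necessarily the lattice-subspace method the paper develops.

```python
# Generic minimum-cost insured portfolio as a linear program (illustrative only).
import numpy as np
from scipy.optimize import linprog

R = np.array([[1.0, 2.0, 0.0],     # payoffs of 3 securities in 4 states
              [1.0, 1.0, 1.0],
              [1.0, 0.5, 2.0],
              [1.0, 0.0, 3.0]])
prices = np.array([0.95, 1.10, 1.20])
floor = np.full(4, 1.0)            # insured payoff level required in every state

# minimize prices @ x  subject to  R @ x >= floor,  x >= 0
res = linprog(c=prices, A_ub=-R, b_ub=-floor, bounds=[(0, None)] * 3)
print("portfolio:", np.round(res.x, 4), "cost:", round(res.fun, 4))
```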
Treatment of many human diseases involves small-molecule drugs. Some target proteins, however, are not druggable with traditional strategies. Innovative RNA-targeted therapeutics may overcome such a challenge. Long noncoding RNAs (lncRNAs) are transcribed RNAs that do not translate into proteins. Their ability to interact with DNA, RNA, microRNAs (miRNAs), and proteins makes them an interesting target for regulating gene expression and signaling pathways. In the past decade, a catalog of lncRNAs has been studied in several human diseases. One of the challenges with lncRNA studies is their lack of coding potential, which makes it difficult to functionally characterize them in wet-lab experiments. Several computational tools have thus been designed to characterize the functions of lncRNAs, centered around lncRNA interactions with proteins and RNA, especially miRNAs. This review comprehensively summarizes the methods and tools for lncRNA-RNA interaction and lncRNA-protein interaction prediction. We discuss the tools related to lncRNA interaction prediction using commonly used models: ensemble-based, machine-learning-based, molecular-docking-based, and network-based computational models. In biology, genes that are co-expressed tend to have similar functions. Co-expression network analysis is, therefore, one of the most widely used methods for understanding the function of lncRNAs. A major focus of our study is to compile literature related to the functional prediction of lncRNAs in human diseases using co-expression network analysis. In summary, this article provides relevant information on the use of appropriate computational tools for the functional characterization of lncRNAs that help wet-lab researchers design mechanistic and functional experiments.
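As a minimal illustration of the co-expression ("guilt by association") approach highlighted above, the sketch below correlates a lncRNA's expression with a few protein-coding genes across simulated samples; the gene names and expression values are made up for demonstration.

```python
# Illustrative co-expression sketch: Pearson correlation between a lncRNA
# and protein-coding genes across samples, using simulated expression data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
samples = [f"S{i}" for i in range(20)]
expr = pd.DataFrame(rng.lognormal(size=(20, 4)),
                    index=samples,
                    columns=["lncRNA_X", "geneA", "geneB", "geneC"])

# Genes highly correlated with the lncRNA are candidate functional partners.
corr = expr.corr(method="pearson")["lncRNA_X"].drop("lncRNA_X")
print(corr.sort_values(ascending=False))
```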
Electron tomography (ET) has emerged as a powerful technique to address fundamental questions in molecular and cellular biology. It makes possible visualization of the molecular architecture of complex viruses, organelles and cells at a resolution of a few nanometres. In the last decade ET has allowed major breakthroughs that have provided exciting insights into a wide range of biological processes. In ET the biological sample is imaged with an electron microscope, and a series of images is taken from the sample at different views. Prior to imaging, the sample has to be specially prepared to withstand the conditions within the microscope. Subsequently, those images are processed and combined to yield the three-dimensional reconstruction or tomogram. Afterwards, a number of computational steps are necessary to facilitate the interpretation of the tomogram, such as noise reduction, segmentation and analysis of subvolumes. As the computational demands are huge in some of the stages, high performance computing (HPC) techniques are used to make the problem affordable in reasonable time. This article intends to comprehensively review the methods, technologies and tools involved in the different computational stages behind structural studies by ET, from image acquisition to interpretation of tomograms. The HPC techniques usually employed to cope with the computational demands are also briefly described. (C) 2012 Elsevier Ltd. All rights reserved.
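The reconstruction step described above (combining projection images from different views into a tomogram) can be sketched in miniature with a simulated tilt series and filtered back-projection; scikit-image's radon/iradon routines, the phantom, and the limited tilt range below are stand-ins for real ET data and software.

```python
# Conceptual sketch of tomographic reconstruction: a simulated tilt series over
# a limited angular range is combined by filtered back-projection into a slice.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.25)   # small 2D test object
angles = np.arange(-60, 61, 2)                    # limited tilt range, in degrees

tilt_series = radon(phantom, theta=angles)        # simulated projection images
slice_rec = iradon(tilt_series, theta=angles, filter_name="ramp")

print("reconstructed slice shape:", slice_rec.shape)
```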
This paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute component and system reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint probability density function of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally to reach a sampling domain that is slightly greater than the failure domain to minimize oversampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the previous AIS-based failure points. These sensitivities can be used to identify key random variables and to adjust a design to achieve reliability-based objectives. The proposed methodology is demonstrated using a turbine blade reliability analysis problem.
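As background for the importance-sampling idea used here, the sketch below estimates a failure probability P[g(X) <= 0] by drawing from a density shifted toward the failure region and reweighting the samples; the limit-state function, shift, and sample size are hypothetical, and the paper's adaptive, incremental growth of the sampling domain is not reproduced.

```python
# Simplified (non-adaptive) importance-sampling estimate of a failure probability.
import numpy as np

rng = np.random.default_rng(1)

def g(x):                          # toy limit-state function: failure when g <= 0
    return 3.0 - (x[:, 0] + x[:, 1])

mu_is = np.array([1.5, 1.5])       # sampling density shifted toward the failure region
n = 20000
x = rng.normal(loc=mu_is, scale=1.0, size=(n, 2))

# importance weights: standard-normal density / shifted sampling density
log_w = (-0.5 * (x ** 2).sum(axis=1)) - (-0.5 * ((x - mu_is) ** 2).sum(axis=1))
w = np.exp(log_w)

pf = np.mean((g(x) <= 0) * w)
print(f"estimated failure probability: {pf:.4e}")   # exact value is about 1.7e-2
```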
Electron tomography (ET) is a powerful imaging technique that enables thorough three-dimensional (3D) analysis of materials at the nanometre and even atomic level. The recent technical advances have established ET as an invaluable tool to carry out detailed 3D morphological studies and derive quantitative structural information. Originally developed in the life sciences, ET was rapidly adapted to materials science and has already provided new and unique insights into a variety of materials. The principles of ET are based on the acquisition of a series of images from the sample at different views, which are subsequently processed and combined to yield the 3D volume or tomogram. Thereafter, the tomogram is subjected to 3D visualization and post-processing for proper interpretation. Computation is of utmost importance throughout the process and the development of advanced specific methods is proving to be essential to fully take advantage of ET in materials science. This article aims to comprehensively review the computational methods involved in these ET studies, from image acquisition to tomogram interpretation, with special focus on the emerging methods. (C) 2013 Elsevier Ltd. All rights reserved.
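Two post-processing stages such reviews typically cover, noise reduction and segmentation, can be sketched as follows; the random volume, Gaussian filter, and threshold rule are illustrative placeholders rather than methods from the article.

```python
# Illustrative post-processing of a reconstructed tomogram slice: denoising
# followed by a crude threshold segmentation. Random data stands in for a
# real reconstruction.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
tomogram = rng.normal(size=(64, 64, 64))             # placeholder 3D volume

denoised = gaussian_filter(tomogram, sigma=1.5)       # basic noise reduction
threshold = denoised.mean() + 2 * denoised.std()      # simple density threshold
segmented = denoised > threshold                      # binary mask of dense features

print("voxels above threshold:", int(segmented.sum()))
```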
The cellular automaton model of computation has drawn the interest of researchers from different disciplines including computer science, biology, mathematics, economics, biochemistry and philosophy. Although a cellular automaton is based on a set of simple rules, over time complex patterns may evolve. We present computational methods for implementing and optimizing a well-known two-state cellular automaton, Conway's Game of Life, on a 16-core Intel Xeon. The evaluation is based on three multicore algorithms. The first algorithm is coherent and utilizes shared memory and barrier synchronization. The remaining two algorithms are distributed and utilize private memories and explicit core-to-core message passing. We provide a link to our open source simulation software. (C) 2013 Elsevier B.V. All rights reserved.
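For reference, a serial single-core update of Conway's Game of Life looks like the sketch below; the paper's shared-memory and message-passing multicore algorithms partition this grid update across cores, which is not shown here, and the grid size and wrap-around boundary are illustrative choices.

```python
# Serial sketch of Conway's Game of Life with numpy (wrap-around edges).
import numpy as np

def step(grid):
    """Apply Conway's rules once to a 2D array of 0/1 cells."""
    neighbors = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    born = (grid == 0) & (neighbors == 3)
    survive = (grid == 1) & ((neighbors == 2) | (neighbors == 3))
    return (born | survive).astype(np.uint8)

rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(64, 64), dtype=np.uint8)
for _ in range(10):
    grid = step(grid)
print("live cells after 10 generations:", int(grid.sum()))
```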