ISBN: (Print) 9798350324457
Graph-based data present unique challenges and opportunities for machine learning. Graph Neural Networks (GNNs), and especially those algorithms that capture graph topology through message passing for neighborhood aggregation, have been a leading solution. However, these networks often require substantial computational resources and may not optimally leverage the information contained in the graph's topology, particularly for large-scale or complex graphs. We propose the Topology Coordinate Neural Network (TCNN) and the Directional Virtual Coordinate Neural Network (DVCNN) as novel and efficient alternatives to message-passing GNNs that directly leverage the graph's topology, sidestepping the computational challenges presented by competing algorithms. Our proposed methods can be viewed as a reprise of classic graph-embedding techniques for neural network feature engineering, but they are novel in that our embeddings leverage ideas from Graph Coordinates (GC) that are lacking in current approaches. Our results, benchmarked against the Open Graph Benchmark Leaderboard, demonstrate that TCNN and DVCNN achieve performance competitive with or superior to message-passing GNNs. For similar levels of accuracy and ROC-AUC, TCNN and DVCNN need far fewer trainable parameters than contenders on the OGBN Leaderboard. The proposed TCNN architecture requires fewer parameters than any neural network method currently listed on the OGBN Leaderboard for both the OGBN-Proteins and OGBN-Products datasets. Conversely, our methods achieve higher performance for a similar number of trainable parameters. These results hold across diverse datasets and edge features, underscoring the robustness and generalizability of our methods. By providing an efficient and effective alternative to message-passing GNNs, our work expands the toolbox of techniques for graph-based machine learning. A significantly lower number of tunable parameters for a given evaluation metric makes TCNN and DVCNN especially...
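The graph-coordinate idea behind these architectures can be illustrated in miniature: embed each node by its shortest-path distances to a handful of anchor nodes, then feed those coordinate vectors to an ordinary feed-forward network instead of doing message passing. The sketch below is a generic, assumption-laden illustration of graph coordinates (pure Python, random anchor selection, a toy ring graph), not the authors' exact TCNN/DVCNN construction.

```python
from collections import deque
import random

def bfs_distances(adj, source):
    """Hop distances from source to every node via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def virtual_coordinates(adj, num_anchors=2, seed=0):
    """Embed each node as its shortest-path distances to randomly chosen
    anchor nodes -- a generic graph-coordinate feature, not the paper's
    exact embedding."""
    rng = random.Random(seed)
    nodes = sorted(adj)
    anchors = rng.sample(nodes, num_anchors)
    coords = {n: [] for n in nodes}
    for a in anchors:
        dist = bfs_distances(adj, a)
        for n in nodes:
            coords[n].append(dist.get(n, -1))  # -1 marks unreachable nodes
    return coords

# A small 6-node ring graph as a toy example.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
coords = virtual_coordinates(adj, num_anchors=2)
print(len(coords), len(coords[0]))  # 6 nodes, 2-dimensional coordinates
```

The resulting fixed-length coordinate vectors can be consumed by any standard classifier or MLP, which is what makes the approach cheap relative to repeated neighborhood aggregation.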
Recently, the research on daily health monitoring using a wearable sensor has been continually evolving. In the future, when this system is actually implemented, a vast amount of data transmission will be conducted fr...
The field of dermatology faces considerable challenges when it comes to early detection of skin cancer. Our study focused on using different datasets, including original data, augmented data, and SMOTE oversampled dat...
The advancements in technology during the 20th century resulted in the onset of the digital computer era. This study investigates the relative importance of earlier algorithms in comparison to more recent ones. Various research projects suggest improving vectorization by combining conventional and modern techniques, while others suggest optimizing it through algorithmic methods. This study primarily focuses on employing Bayesian optimization to tune hyperparameters, thereby improving the performance of TF-IDF FastText sentiment classification models. The study presents four models: the first uses FastText with a preprocessing step that removes stopwords; the second uses FastText without Bayesian optimization; the third uses FastText tuned with Bayesian optimization; and the fourth combines FastText with TF-IDF, further tuned with Bayesian optimization. The Support Vector Machine (SVM) technique is employed to evaluate all of the models. The findings suggest that the model's performance remains unchanged when stopwords are removed (Precision, Recall, F1: 0.9007). The model shows a slight improvement with Bayesian optimization, a 0.02% increase, reaching a final score of 0.9019. The performance of Bayesian optimization is affected by the magnitude of the learning rate, and models that do not use Bayesian optimization perform worse. Both the TF-IDF FastText model and the FastText model yielded comparable outcomes, attaining an F1 score of 0.9019 after tuning with Bayesian optimization.
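The TF-IDF half of the feature pipeline is easy to show concretely. The sketch below is a minimal pure-Python implementation of smoothed TF-IDF (the common tf x (log((1+N)/(1+df)) + 1) formulation, which is an assumption; the abstract does not specify the exact weighting). A real pipeline would use a library vectorizer, concatenate FastText embeddings, and hand the features to an SVM whose hyperparameters are tuned by Bayesian optimization.

```python
import math
from collections import Counter

def tfidf(docs):
    """Smoothed TF-IDF weights for a list of tokenized documents.

    Terms that appear in many documents get a lower idf, so frequent
    but uninformative words are down-weighted.
    """
    n = len(docs)
    df = Counter()  # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * (math.log((1 + n) / (1 + df[term])) + 1.0)
            for term, count in tf.items()
        })
    return weights

docs = [["good", "service"], ["bad", "service"], ["good", "food"]]
w = tfidf(docs)
# "bad" appears in only 1 of 3 docs, so it outweighs the common "service".
print(round(w[0]["good"], 4))
```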
The measurement of information security risk in a public sector organization is of utmost importance: it enables appropriate action in response to potential risks or damaging incidents. The objective of this study is to develop a straightforward yet smart decision model capable of evaluating risks, particularly within a public sector organization. The decision support modelling (DSM) concept is employed as a framework to construct a computer model that supports decision-making in a specific case. The study comprises five stages, which are integral parts of the DSM process: analyzing the case, examining relevant documents, designing the model, constructing the model, and evaluating the finalized model. The object-oriented method serves as the fundamental approach to model design, while fuzzy logic, an intelligent computational technique, plays a central role in the development of the decision model. The proposed model shows an average error of 0.05 compared with the actual risk measurements, and yields average risk values of 0.59 and 0.45 for the pre- and post-remediation scenarios, respectively.
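A fuzzy risk model of this kind typically fuzzifies crisp inputs with membership functions, fires a small rule base, and defuzzifies to a single risk value. The sketch below is a toy two-input Mamdani-style example with made-up membership functions and rules; the paper's actual parameters and rule base are not given in the abstract.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_score(likelihood, impact):
    """Toy fuzzy risk assessment: inputs on [0, 1], output on [0, 1]."""
    # Fuzzify each input into "low" and "high" memberships.
    lik_low, lik_high = tri(likelihood, -1, 0, 1), tri(likelihood, 0, 1, 2)
    imp_low, imp_high = tri(impact, -1, 0, 1), tri(impact, 0, 1, 2)
    # Rule base: (representative output value, firing strength).
    rules = [
        (0.2, min(lik_low, imp_low)),    # both low   -> low risk
        (0.5, min(lik_low, imp_high)),   # mixed      -> medium risk
        (0.5, min(lik_high, imp_low)),   # mixed      -> medium risk
        (0.9, min(lik_high, imp_high)),  # both high  -> high risk
    ]
    # Weighted-average (centroid-style) defuzzification.
    num = sum(out * s for out, s in rules)
    den = sum(s for _, s in rules)
    return num / den if den else 0.0

print(round(risk_score(0.2, 0.8), 3))  # a mid-range risk value
```

The same pattern extends to more inputs (asset value, control maturity, etc.) by adding membership functions and rules, which is what makes the approach attractive for a simple organizational risk model.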
This study discusses the development of smart precision farming systems using big data and cloud-based intelligent decision support systems. Big data plays an important role in collecting, storing, and analyzing large amounts of data from various agricultural sources, including weather stations, soil sensors, satellite imagery, crop yield records, and pest and disease reports. The study highlights the differences between smart farming and precision farming, and describes key techniques and the system architecture, including the data collection, processing, analysis, and decision support components. Utilizing a cloud platform enables scalability and optimized performance, lowering costs and making the system safer and easier to manage. The integration of big data and Alibaba Cloud computing in smart precision farming can improve farming productivity by providing timely information and recommendations to farmers for better decision-making. The resulting smart precision farming system provides cost-effective real-time monitoring and predictive analytics to increase agricultural production and sustainability.
We address the problem of learning new classes for semantic segmentation models from few examples, which is challenging because of the following two reasons. Firstly, it is difficult to learn from limited novel data t...
Recent advances in deep learning not only facilitate the implementation of zero-shot singing voice synthesis (SVS) and singing voice conversion (SVC) tasks but also provide the opportunity to unify these two tasks int...
An integrated approach is necessary for implementing a model that involves multiple actors, and one effective way to achieve this is through a service-oriented approach. The objective of the study was to develop a service-oriented fuzzy model that combines the functional-structural plant modelling (FSPM) method for the plant computational model (PCM) with the fuzzy logic method for both the PCM and the decision support model (DSM). This combined method models plant behaviour morphologically and proposes investment decisions in agriculture, specifically for the hydroponic cultivation of Bok Choy, a green-leaf vegetable. The interconnected, service-based model, termed the service-oriented fuzzy smart model (SOFSM), achieved an accuracy of 94.33%.
The insurance industry faces a significant challenge concerning insurance claims, particularly the prevalence of fraudulent ones. To address this issue, one potential solution is the implementation of a computer-based decision model. This research presents a fuzzy decision model developed with an object-oriented method. The study involves seven stages (case analysis, parameter analysis, object-parameter linking, detailed object-relation construction, parameter-exchange analysis, OOFDM construction, and model verification and validation), with the object-oriented approach serving as the foundational method for constructing the model and fuzzy logic as the primary method for assessing claim risk and proposing the best decision. The model can simulate insurance claims and offers objective decisions over 19,611 claim records, categorizing them into two decision categories: acceptance and pending.
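The object-oriented side of such a model can be sketched as a claim object that carries its own fuzzy-style risk grade and maps it to the two decision categories named in the abstract. Everything below is illustrative: the attributes, weights, and threshold are invented for the sketch, since the paper's actual parameters are not given.

```python
class Claim:
    """Minimal object-oriented claim with a fuzzy-style risk grade.

    The risk formula and its breakpoints are hypothetical, standing in
    for a full fuzzy rule base over the claim's parameters.
    """

    def __init__(self, amount, history_flags):
        self.amount = amount                # claimed amount, normalized to [0, 1]
        self.history_flags = history_flags  # count of prior suspicious claims

    def risk(self):
        # Risk rises linearly with the claimed amount; each prior flag
        # adds a fixed increment, with the total capped at 1.0.
        return min(1.0, 0.6 * self.amount + 0.2 * self.history_flags)

    def decision(self, threshold=0.5):
        # Two decision categories, as in the paper: acceptance or pending.
        return "accept" if self.risk() < threshold else "pending"

print(Claim(0.3, 0).decision())  # accept  (risk 0.18)
print(Claim(0.9, 2).decision())  # pending (risk 0.94)
```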