Classification and regression algorithms based on k-nearest neighbors (kNN) are often ranked among the top ten machine learning algorithms, due to their performance, flexibility, interpretability, non-parametric nature, and computational efficiency. Nevertheless, in existing kNN algorithms, the kNN radius, which plays a major role in the quality of kNN estimates, is independent of any weights associated with the training samples in a kNN neighborhood. This omission, besides limiting the performance and flexibility of kNN, causes difficulties in correcting for covariate shift (e.g., selection bias) in the training data, in taking advantage of unlabeled data, and in domain adaptation and transfer learning. We propose a new weighted kNN algorithm that, given training samples, each associated with two weights, called consensus and relevance (which may depend on the query at hand as well), and a request for an estimate of the posterior at a query, works as follows. First, it determines the kNN neighborhood as the training samples within the kth relevance-weighted order statistic of the distances of the training samples from the query. Second, it uses the training samples in this neighborhood to produce the desired estimate of the posterior (output label or value) via consensus-weighted aggregation, as in existing kNN rules. Furthermore, we show that kNN algorithms are affected by covariate shift, and that the commonly used sample reweighting technique does not correct covariate shift in existing kNN algorithms. We then show how to mitigate covariate shift in kNN decision rules by instead using our proposed consensus-relevance kNN algorithm with relevance weights determined by the amount of covariate shift (e.g., the ratio of sample probability densities before and after the shift). Finally, we provide experimental results, using 197 real datasets, demonstrating that the proposed approach is slightly better (in terms of F1 score) on average than competing benchmark approaches for mitigating covariate shift.
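As a rough illustration of the two-step rule sketched in this abstract, the following minimal Python sketch (not the authors' implementation; the Euclidean metric, NumPy-array inputs, and the majority-vote aggregation are assumptions) computes the kth relevance-weighted order statistic of the query distances and then aggregates labels with consensus weights:

import numpy as np

def consensus_relevance_knn(X_train, y_train, consensus, relevance, x_query, k):
    # Distances from the query to every training sample (Euclidean, assumed).
    d = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(d)
    # kNN radius: smallest distance at which the cumulative relevance weight reaches k.
    cum_rel = np.cumsum(relevance[order])
    idx = min(np.searchsorted(cum_rel, k), len(d) - 1)
    radius = d[order[idx]]
    in_hood = d <= radius                                  # relevance-weighted neighborhood
    # Consensus-weighted aggregation (weighted majority vote for classification).
    labels = np.unique(y_train[in_hood])
    scores = [consensus[in_hood][y_train[in_hood] == c].sum() for c in labels]
    return labels[int(np.argmax(scores))]

Setting every relevance and consensus weight to 1 recovers the ordinary unweighted kNN rule; for covariate-shift mitigation, the abstract suggests choosing relevance weights from the ratio of sample densities before and after the shift.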
Weather variability significantly impacts crop yield, posing challenges for large-scale agricultural operations. This study introduces a deep learning-based approach to enhance crop yield prediction accuracy. A Multi-...
The right partner and high innovation speed are crucial for a successful research and development (R&D) alliance in the high-tech industry. Does homogeneity or heterogeneity between partners benefit innovation spe...
Extreme events jeopardize power network operations, causing beyond-design failures and massive supply interruptions. Existing market designs fail to internalize and systematically assess the risk of extreme and rare e...
In pursuit of enhancing Wireless Sensor Networks' (WSNs) energy efficiency and operational lifespan, this paper delves into the domain of energy-efficient routing protocols. However, the limited energy resources of Sensor Nodes (SNs) are a big challenge for ensuring their efficient and reliable operation. Periodic data gathering involves the utilization of a mobile sink (MS) to mitigate the energy consumption problem through periodic network traversal. The mobile sink (MS) strategy minimizes energy consumption and latency by visiting the fewest nodes or predetermined locations, called rendezvous points (RPs), instead of all cluster heads (CHs); CHs subsequently transmit packets to neighboring RPs. A unique contribution of this study is determining the shortest path to reach the RPs. The mobile sink (MS) concept has emerged as a promising solution to the energy consumption problem in WSNs caused by multi-hop data collection with static sinks. In this study, we propose two novel hybrid algorithms, namely "Reduced k-means based on Artificial Neural Network" (RkM-ANN) and "Delay Bound Reduced k-means with ANN" (DBRkM-ANN), for designing a fast, efficient, and proficient MS path based on rendezvous points (RPs). The first algorithm optimizes the MS's latency, while the second considers the design of delay-bound paths, also characterized by the number of paths with delay over the bound. Both methods use a weight function and k-means clustering to choose RPs in a way that maximizes efficiency and guarantees network-wide coverage. In addition, a method of using MS scheduling for efficient data collection is presented. Extensive simulations and comparisons with several existing algorithms have shown the effectiveness of the suggested methodologies over a wide range of performance indicators.
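A minimal sketch of the RP-selection idea described above, assuming hypothetical per-node weights (e.g., data load or residual energy) and a greedy tour heuristic; it is not the RkM-ANN or DBRkM-ANN algorithm itself, which additionally involves an artificial neural network:

import numpy as np

def select_rps_and_tour(node_xy, node_weight, n_rps, sink_xy, iters=50, seed=0):
    # Weighted k-means: cluster node positions, pulling centroids (candidate RPs)
    # towards heavily weighted nodes.
    rng = np.random.default_rng(seed)
    rps = node_xy[rng.choice(len(node_xy), n_rps, replace=False)].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(node_xy[:, None, :] - rps[None, :, :], axis=2)
        assign = d.argmin(axis=1)                      # nearest RP for each node
        for j in range(n_rps):
            m = assign == j
            if m.any():
                rps[j] = np.average(node_xy[m], axis=0, weights=node_weight[m])
    # Greedy nearest-neighbour tour of the RPs starting from the sink position.
    tour, pos, left = [], np.asarray(sink_xy, dtype=float), list(range(n_rps))
    while left:
        nxt = min(left, key=lambda j: np.linalg.norm(rps[j] - pos))
        tour.append(nxt)
        pos = rps[nxt]
        left.remove(nxt)
    return rps, tour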
Advancements in smart applications highlight the need for increased processing and storage capacity at Smart Devices (SDs). To tackle this, Edge Computing (EC) enables offloading SD workloads to distant edge server...
The segmentation of head and neck (H&N) tumors in dual Positron Emission Tomography/Computed Tomography (PET/CT) imaging is a critical task in medical imaging, providing essential information for diagnosis, treatment planning, and outcome prediction. Motivated by the need for more accurate and robust segmentation methods, this study addresses key research gaps in the application of deep learning techniques to multimodal medical imaging. Specifically, it investigates the limitations of existing 2D and 3D models in capturing complex tumor structures and proposes an innovative 2.5D UNet Transformer model as a solution. The primary research questions guiding this study are: (1) How can the integration of convolutional neural networks (CNNs) and transformer networks enhance segmentation accuracy in dual PET/CT imaging? (2) What are the comparative advantages of 2D, 2.5D, and 3D model configurations in this context? To answer these questions, we aimed to develop and evaluate advanced deep-learning models that leverage the strengths of both CNNs and transformers. The proposed methodology involved a comprehensive preprocessing pipeline, including normalization, contrast enhancement, and resampling, followed by segmentation using 2D, 2.5D, and 3D UNet Transformer models. The models were trained and tested on three diverse datasets: HeckTor2022, AutoPET2023, and a third dataset. Performance was assessed using metrics such as Dice Similarity Coefficient, Jaccard Index, Average Surface Distance (ASD), and Relative Absolute Volume Difference (RAVD). The findings demonstrate that the 2.5D UNet Transformer model consistently outperformed the 2D and 3D models across most metrics, achieving the highest Dice and Jaccard values, indicating superior segmentation accuracy. For instance, on the HeckTor2022 dataset, the 2.5D model achieved a Dice score of 81.777 and a Jaccard index of 0.705, surpassing other model configurations. The 3D model showed strong boundary delineation performance but exhibited variability across datasets, while the...
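The overlap metrics reported above can be made concrete with a short sketch. This is not the study's evaluation code; it simply computes the Dice Similarity Coefficient and Jaccard Index for binary segmentation masks, with a small epsilon as an assumed guard against empty masks:

import numpy as np

def dice_and_jaccard(pred, truth, eps=1e-8):
    # pred and truth: binary NumPy masks of the same shape.
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    dice = 2.0 * inter / (pred.sum() + truth.sum() + eps)    # overlap vs. total size
    jaccard = inter / (union + eps)                          # overlap vs. union
    return dice, jaccard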
There are numerous energy minimisation plans adopted in today's data centres (DCs). The most important ones are those that depend on switching off unused physical machines (PMs). This is usually done by o...
All wireless communication systems are moving towards higher and higher frequencies, which are severely attenuated by rain in outdoor environments. To design a reliable RF system, an accurate prediction meth...
Delay/disruption tolerant networking (DTN) is proposed as a networking architecture to overcome challenging space communication characteristics and provide reliable data transmission service in the presence of long propagation delays and/or lengthy link disruptions. Bundle Protocol (BP) and Licklider Transmission Protocol (LTP) are the main key technologies for DTN. LTP red transmission offers a reliable transmission mechanism for space networks. One of the key metrics used to measure the performance of LTP in space applications is the end-to-end data delivery delay, which is influenced by factors such as the quality of space channels and the size of cross-layer packets. In this paper, an end-to-end reliable data delivery delay model of LTP red transmission is proposed using a roulette wheel algorithm, which better matches the typical random characteristics of space networks. The proposed models are validated through real data transmission experiments on a semi-physical testing platform. Furthermore, the impact of cross-layer packet size on the performance of LTP reliable transmission is analyzed, with a focus on bundle size, block size, and segment size. The analysis and study results presented in this paper offer valuable contributions towards enhancing the reliability of LTP transmission in space communication scenarios.
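The roulette wheel algorithm mentioned above is, at its core, a draw of one outcome with probability proportional to its weight. The sketch below shows plain roulette-wheel selection in Python; the mapping of outcomes to segment-loss events and the 10% loss probability are illustrative assumptions, not values from the paper:

import random

def roulette_wheel(outcomes, weights, rng=random):
    # Spin the wheel: pick an outcome with probability proportional to its weight.
    total = sum(weights)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for outcome, w in zip(outcomes, weights):
        acc += w
        if r <= acc:
            return outcome
    return outcomes[-1]   # numerical safety fallback

# Illustrative use: decide whether a red-part LTP segment is lost in one
# transmission round over a lossy space link (hypothetical 10% loss rate).
lost = roulette_wheel(["delivered", "lost"], [0.9, 0.1])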