Owing to the computational density and complexity of vehicle applications, unique vehicle mobility and limited edge server resources, Vehicle Edge Computing (VEC) faces significant challenges. Unmanned Aerial Vehicles...
Automatic grid (mesh) generation is widely used in finite element analysis, computational fluid dynamics and other fields. Its core lies in accurately dividing complex geometries into grids suitable for numerical calculation. To solv...
This paper improves the basic ant colony optimization algorithm for mobile robot path planning in response to its flaws, which include easy descent into a local optimum, a large number of infle...
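The snippet above is cut off, but the pheromone update rule at the heart of basic ant colony optimization, and the source of the premature-convergence flaw it mentions, is standard. A minimal sketch follows; the function name, evaporation rate rho and deposit constant Q are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Minimal sketch of the basic ACO pheromone update: global evaporation
# followed by deposits inversely proportional to path length. Strong
# deposits on early good paths are what drive descent into local optima.
def update_pheromone(tau, ant_paths, path_lengths, rho=0.5, Q=1.0):
    tau = (1.0 - rho) * tau                  # evaporation on every edge
    for path, length in zip(ant_paths, path_lengths):
        for i, j in zip(path[:-1], path[1:]):
            tau[i, j] += Q / length          # shorter paths deposit more
    return tau

tau = np.ones((4, 4))
tau = update_pheromone(tau, [[0, 1, 3], [0, 2, 3]], [5.0, 7.0])
print(tau)
```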
To address the issues of unknown target size, blurred edges, background interference and low contrast in infrared small target detection, this paper proposes a method based on density peaks searching and weighted multi-feature local difference. First, an improved high-boost filter is used for preprocessing to eliminate background clutter and high-brightness interference, thereby increasing the probability of capturing real targets in the density peak search. Next, a triple-layer window is used to extract features from the area surrounding candidate targets, addressing the uncertainty of small target size. By calculating multi-feature local differences between the triple-layer windows, the problems of blurred target edges and low contrast are mitigated. To balance the contribution of different features, intra-class distance is used to calculate weights, achieving weighted fusion of multi-feature local differences to obtain the weighted multi-feature local differences of candidate targets. The real targets are then extracted using the interquartile range. Experiments on datasets such as SIRST and IRSTD-1k show that the proposed method is suitable for various complex scene types and demonstrates good robustness and detection performance.
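For intuition about the preprocessing step, high-boost filtering sharpens an image by amplifying the difference between it and a blurred copy, which makes small bright targets stand out against smooth clutter. A minimal sketch follows; the window size and boost factor are assumptions, and this is the textbook filter, not the paper's improved variant:

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Textbook high-boost filtering: add a scaled high-frequency residue
# (image minus local mean) back onto the image, emphasizing small
# bright targets and edges. size and A are illustrative choices.
def high_boost(image, size=5, A=1.5):
    img = image.astype(float)
    blurred = uniform_filter(img, size=size)
    mask = img - blurred          # high-frequency components
    return img + A * mask         # boost them

img = np.random.rand(64, 64) * 0.2
img[30, 30] = 1.0                 # a toy bright "small target"
enhanced = high_boost(img)
```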
How to coordinate the design of sampling and Sparse-Dense Matrix Multiplication (SpMM) is important in Graph Neural Network (GNN) acceleration. However, existing methods suffer an imbalance between accuracy and speed in...
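The snippet above is truncated, but the SpMM kernel it names is concrete: in GNN aggregation, node features are updated by multiplying a (possibly sampled) sparse adjacency matrix with a dense feature matrix. A toy SciPy sketch, with an invented graph and shapes:

```python
import numpy as np
import scipy.sparse as sp

# Toy illustration of the SpMM kernel at the heart of GNN aggregation:
# H_next = A @ H, where A is a sparse adjacency matrix (here unsampled
# and unnormalized for brevity) and H a dense node-feature matrix.
n, d = 5, 4
rows = np.array([0, 1, 2, 3, 4, 0])
cols = np.array([1, 2, 3, 4, 0, 2])
A = sp.csr_matrix((np.ones(6), (rows, cols)), shape=(n, n))
H = np.random.rand(n, d)
H_next = A @ H    # sparse-dense matrix multiplication (SpMM)
print(H_next.shape)
```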
Scene text detection is an important task in computer vision. In this paper, we present YOLOv5 Scene Text (YOLOv5ST), an optimized architecture based on YOLOv5 v6.0 tailored for fast scene text detection. The primary goal is to enhance inference speed without sacrificing significant detection accuracy, thereby enabling robust performance on resource-constrained devices like drones, closed-circuit television cameras, and other embedded systems. To achieve this, we propose key modifications to the network architecture to lighten the original backbone and improve feature aggregation, including replacing standard convolution with depth-wise convolution, adopting the C2 sequence module in place of C3, employing Spatial Pyramid Pooling Global (SPPG) instead of Spatial Pyramid Pooling Fast (SPPF), and integrating the Bi-directional Feature Pyramid Network (BiFPN) into the neck. Experimental results demonstrate a remarkable 26% improvement in inference speed compared to the baseline, with only marginal reductions of 1.6% and 4.2% in mean average precision (mAP) at intersection over union (IoU) thresholds of 0.5 and 0.5:0.95, respectively. This work represents a significant advancement in scene text detection, striking a balance between speed and accuracy, making it well-suited for performance-constrained environments.
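The first listed modification, swapping standard convolution for depth-wise convolution, is the classic parameter/FLOP reduction lever. A generic PyTorch sketch of a depth-wise separable block follows; channel counts are illustrative and this is not the authors' exact layer definition:

```python
import torch
import torch.nn as nn

# Generic depth-wise separable convolution: a per-channel 3x3 spatial
# conv (groups == channels) followed by a 1x1 pointwise conv, cutting
# parameters and FLOPs versus a standard conv of the same shape.
class DWConv(nn.Module):
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.dw = nn.Conv2d(c_in, c_in, k, s, k // 2, groups=c_in, bias=False)
        self.pw = nn.Conv2d(c_in, c_out, 1, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.pw(self.dw(x))))

x = torch.randn(1, 64, 80, 80)
print(DWConv(64, 128)(x).shape)   # torch.Size([1, 128, 80, 80])
```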
Temporal knowledge graph (TKG) reasoning has seen widespread use for modeling real-world events, particularly in extrapolation settings. Nevertheless, most previous studies are embedding models, which require both entity and relation embeddings to make predictions, ignoring the semantic correlations among different entities and relations within the same timestamp. This can lead to random and nonsensical predictions when unseen entities or relations occur. Furthermore, many existing models exhibit limitations in handling highly correlated historical facts with extensive temporal depth. They often either overlook such facts or overly accentuate the relationships between recurring past occurrences and their current counterparts. Because of the dynamic nature of TKGs, effectively capturing the evolving semantics between different timestamps can be challenging. To address these shortcomings, we propose the recurrent semantic evidence-aware graph neural network (RE-SEGNN), a novel graph neural network that can learn the semantics of entities and relations simultaneously. For the former challenge, our model can predict a possible answer to missing quadruples based on semantics when facing unseen entities or relations. For the latter problem, we build on the well-established observation that both the recency and frequency of semantic history tend to confer higher reference value on the current prediction. We use the Hawkes process to compute the semantic trend, which allows the semantics of recent facts to gain more attention than those of distant facts. Experimental results show that RE-SEGNN outperforms all SOTA models in entity prediction on six widely used datasets, and on five datasets in relation prediction. Furthermore, a case study shows how our model deals with unseen entities and relations.
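The Hawkes-style weighting described above, where recent and frequent history earns more attention, can be sketched as an exponentially decaying sum over past occurrence times. A minimal illustration follows; the base rate mu, excitation alpha, decay beta and the timestamps are invented, not the paper's parameterization:

```python
import numpy as np

# Hawkes-process intensity sketch: each past occurrence of a semantic
# fact contributes an exponentially decaying excitation, so both recency
# and frequency raise the trend score for the current timestamp.
def semantic_trend(t_now, event_times, mu=0.1, alpha=1.0, beta=0.5):
    dt = t_now - np.asarray(event_times, dtype=float)
    dt = dt[dt >= 0]                       # only past events excite the present
    return mu + alpha * np.sum(np.exp(-beta * dt))

print(semantic_trend(10.0, [2.0, 7.0, 9.0]))   # recent events dominate
```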
Code review is a critical process in software development, contributing to the overall quality of the product by identifying errors early. A key aspect of this process is the selection of appropriate reviewers to scrutinize changes made to source code. However, in large-scale open-source projects, selecting the most suitable reviewers for a specific change can be a challenging task. To address this, we introduce the Code Context Based Reviewer Recommendation (CCB-RR) model, which leverages information from changesets to recommend the most suitable reviewers. The model takes into consideration the paths of modified files and the context derived from the changesets, including their titles and descriptions. Additionally, CCB-RR employs KeyBERT to extract the most relevant keywords and compare semantic similarity across changesets. The model integrates the paths of modified files, keyword information, and the context of code changes to form a comprehensive picture of the changeset. We conducted extensive experiments on four open-source projects, demonstrating the effectiveness of CCB-RR. The model achieved a Top-1 accuracy of 60%, 55%, 51%, and 45% on the Android, OpenStack, QT, and LibreOffice projects, respectively. For Mean Reciprocal Rank (MRR), CCB-RR achieved 71%, 62%, 52%, and 68% on the same projects, respectively, highlighting its potential for practical application in code reviewer recommendation.
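The keyword-extraction step can be reproduced with the public keybert package. A minimal sketch follows; the changeset text is invented, and the parameters shown are generic defaults rather than the authors' exact configuration:

```python
from keybert import KeyBERT

# Extract the most relevant keywords from a changeset title/description,
# as CCB-RR does before comparing semantic similarity across changesets.
# The document text below is a made-up example.
doc = "Fix null pointer dereference in network buffer pool during reconnects"
kw_model = KeyBERT()
keywords = kw_model.extract_keywords(doc, keyphrase_ngram_range=(1, 2), top_n=5)
print(keywords)   # list of (phrase, relevance score) pairs
```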
Electronic auctions (e-auctions) remove the physical limitations of traditional auctions and bring this mechanism to the general public. However, most e-auction schemes involve a trusted auctioneer, which is not always credible in practice. Previous studies have applied cryptographic tools to solve this problem by distributing trust, but they ignore the existence of collusion. In this paper, a blockchain-based Privacy-Preserving and Collusion-Resistant scheme (PPCR) for double auctions is proposed by employing both cryptography and blockchain technology; it is the first decentralized and collusion-resistant double auction scheme that guarantees bidder anonymity and bid privacy. A two-server-based auction framework is designed to support off-chain allocation with privacy preservation and on-chain dispute resolution for collusion resistance. A Dispute Resolution agreement (DR) is provided to the auctioneer to prove that they have conducted the auction correctly and that the result is fair and correct. In addition, a Concise Dispute Resolution protocol (CDR) is designed to handle situations where the number of accused winners is small, significantly reducing the computation cost of dispute resolution. The experimental results confirm that PPCR can indeed achieve efficient collusion resistance and verifiability of auction results with low on-chain and off-chain computational overhead.
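For intuition about what the off-chain allocation computes, a generic double auction clearing rule matches the highest buy bids with the lowest sell asks while a bid still meets an ask. The toy sketch below shows only that plain allocation; PPCR's cryptographic protections, anonymity and dispute machinery are not represented:

```python
# Toy double auction clearing for intuition only: sort buy bids
# descending and sell asks ascending, then trade while a bid still
# covers an ask, pricing each trade at the bid/ask midpoint.
def clear_double_auction(bids, asks):
    bids = sorted(bids, reverse=True)
    asks = sorted(asks)
    trades = []
    for b, a in zip(bids, asks):
        if b >= a:
            trades.append((b, a, (b + a) / 2))   # (bid, ask, trade price)
        else:
            break
    return trades

print(clear_double_auction([9, 7, 5, 3], [2, 4, 6, 8]))
```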
This study examines the effectiveness of artificial intelligence techniques in generating high-quality environmental data for species introduction site selection systems. A network framework model (SAE-GAN) that combines Strengths, Weaknesses, Opportunities, Threats (SWOT) analysis data with a Variational Autoencoder (VAE) and a Generative Adversarial Network (GAN) is proposed for environmental data generation. The model combines the two popular generative models, GAN and VAE, to generate features conditioned on categorical data embeddings after SWOT analysis. It is capable of generating features that resemble real feature distributions and of adding sample factors to more accurately track individual sample characteristics. SWOT data is used to retain more semantic information in the generated features. The model was applied to species in Southern California, USA, using SWOT analysis data for training. Results show that the model can integrate data from more comprehensive analyses than traditional methods and generate high-quality reconstructed data from them, effectively alleviating insufficient data collection in development areas. The model is further validated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) classification assessment commonly used in the environmental data field. This study provides a reliable and rich source of training data for species introduction site selection systems and makes a significant contribution to ecological and sustainable development.
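The TOPSIS assessment used for validation is a standard multi-criteria ranking procedure. A compact NumPy sketch follows; the decision matrix and weights are invented, and all criteria are assumed to be benefit criteria for simplicity:

```python
import numpy as np

# Standard TOPSIS: vector-normalize the decision matrix, apply weights,
# measure Euclidean distance to the ideal and anti-ideal solutions,
# then score each alternative by relative closeness to the ideal.
def topsis(X, weights):
    N = X / np.linalg.norm(X, axis=0)           # normalize each criterion column
    V = N * weights                              # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)   # assumes benefit criteria only
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)               # higher = closer to ideal

X = np.array([[7.0, 9.0, 9.0], [8.0, 7.0, 8.0], [9.0, 6.0, 8.0]])
print(topsis(X, np.array([0.5, 0.3, 0.2])))      # closeness score per alternative
```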