Functional dependencies (FDs) are the most common constraints in the design theory for relational databases, generalizing the concept of a key for a relation. Given an attribute subset X and an attribute A in relation...
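As a concrete illustration of the X → A dependency this abstract begins to define, here is a minimal check over a toy relation; the relation and attribute names are invented for the example and are not from the paper.

```python
def holds_fd(rows, X, A):
    """Check whether the functional dependency X -> A holds in `rows`:
    any two tuples that agree on all attributes in X must agree on A."""
    seen = {}
    for row in rows:
        key = tuple(row[attr] for attr in X)
        if key in seen and seen[key] != row[A]:
            return False  # two tuples agree on X but differ on A
        seen.setdefault(key, row[A])
    return True

# Toy relation: employee -> dept holds, but dept -> employee does not.
r = [
    {"employee": "ann", "dept": "db"},
    {"employee": "bob", "dept": "db"},
    {"employee": "ann", "dept": "db"},
]
assert holds_fd(r, ["employee"], "dept") is True
assert holds_fd(r, ["dept"], "employee") is False
```

A key is the special case where X functionally determines every attribute of the relation.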
With the continuous evolution of cloud computing technology, the memory requirements of cloud servers are increasing. Insufficient memory has become the performance bottleneck of many applications. Memory disaggregati...
With the rapid development of Internet technology, network asset detection and vulnerability warning have become hot topics of concern. However, most existing detection tools operate in a single-node mode and cannot process large-scale tasks in parallel, which cannot meet current needs. To address these issues, this paper proposes a distributed network asset detection and vulnerability warning platform (Dis-NDVW) based on distributed systems and multiple detection tools. Firstly, this paper proposes a distributed message subscription and publication system based on Zookeeper and Kafka, which endows Dis-NDVW with the ability to process large-scale tasks in parallel. Secondly, Dis-NDVW combines the RangeAssignor, RoundRobinAssignor, and StickyAssignor algorithms to achieve load balancing of task nodes in a distributed detection cluster. In terms of large-scale task processing, this paper proposes a task partitioning method based on First-In-First-Out (FIFO) queues. This method realizes the parallel operation of task producers and task consumers by dividing pending tasks into different queues according to task type. To ensure the data reliability of the task cluster, Dis-NDVW provides a redundant storage strategy for master-slave partition replicas. In terms of distributed storage, Dis-NDVW utilizes a distributed elastic storage service based on ElasticSearch to achieve distributed storage and efficient retrieval of big data. Experimental verification shows that Dis-NDVW can meet the basic requirements of ultra-large-scale detection tasks.
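The FIFO task-partitioning idea described in this abstract can be sketched as follows; the task types, queue structure, and class name are illustrative assumptions, not the paper's actual implementation.

```python
from collections import defaultdict, deque

class FifoTaskPartitioner:
    """Illustrative sketch: route pending tasks into per-type FIFO queues
    so producers and consumers can operate on each queue independently."""

    def __init__(self):
        # One FIFO queue per task type (e.g. port scan, vulnerability check).
        self.queues = defaultdict(deque)

    def submit(self, task_type, task):
        """Producer side: append the task to its type's queue."""
        self.queues[task_type].append(task)

    def take(self, task_type):
        """Consumer side: pop the oldest pending task of the given type."""
        queue = self.queues[task_type]
        return queue.popleft() if queue else None

# Hypothetical usage: two task types are processed independently.
p = FifoTaskPartitioner()
p.submit("port_scan", {"host": "10.0.0.1"})
p.submit("port_scan", {"host": "10.0.0.2"})
p.submit("vuln_check", {"host": "10.0.0.1"})
assert p.take("port_scan")["host"] == "10.0.0.1"  # FIFO order preserved
```

In the actual platform, per-type ordering and consumer-side load balancing would be handled by Kafka partitions and the assignor strategies named above; the sketch only shows the queue-per-type dispatch pattern.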
As an essential tool for realistic description of the current or future debris environment, the Space Debris Environment Engineering Model (SDEEM) has been developed to provide support for risk assessment of spacecraft. In contrast with SDEEM2015, SDEEM2019, the latest version, extends the orbital range from Low Earth Orbit (LEO) to Geosynchronous Orbit (GEO) for the years ***. In this paper, the improved modeling algorithms used by SDEEM2019 for propagation simulation, spatial density distribution, and spacecraft flux evaluation are described. The debris fluxes of SDEEM2019 are compared with those of three typical models, i.e., SDEEM2015, Orbital Debris Engineering Model 3.1 (ORDEM 3.1), and Meteoroid and Space Debris Terrestrial Environment Reference (MASTER-8), in terms of two assessment modes. Three orbital cases, including the Geostationary Transfer Orbit (GTO), Sun-Synchronous Orbit (SSO), and International Space Station (ISS) orbit, are selected for the spacecraft assessment mode, and the LEO region is selected for the spatial density assessment mode. The analysis indicates that, compared with previous algorithms, the variable step-size orbital propagation algorithm based on semi-major axis control is more precise, the spatial density algorithm based on the second zonal harmonic of the non-spherical Earth gravity (J_2) is more applicable, and the result of the position-centered spacecraft flux algorithm is more accurate. The comparison shows that SDEEM2019 and MASTER-8 have consistent trends due to similar modeling processes, while the differences between SDEEM2019 and ORDEM 3.1 are mainly caused by different modeling approaches for uncatalogued debris.
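For context, the second zonal harmonic J_2 mentioned in this abstract enters the standard non-spherical Earth geopotential in the following textbook form (a general expression, not necessarily SDEEM2019's exact implementation):

$$ V(r,\varphi) = -\frac{\mu}{r}\left[1 - J_2\left(\frac{R_E}{r}\right)^{2}\,\frac{3\sin^{2}\varphi - 1}{2}\right] $$

where $r$ is the geocentric distance, $\varphi$ the geocentric latitude, $R_E$ the Earth's equatorial radius, and $\mu$ the Earth's gravitational parameter. The $J_2$ term is the dominant perturbation to the point-mass potential and drives the secular drift of the orbital node and perigee that a spatial density model must account for.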
Missing data in multivariate time series are common issues that can affect the analysis and downstream applications. Although multivariate time series data generally consist of the trend, seasonal and residual terms, ...
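As a quick illustration of the trend/seasonal/residual structure this abstract refers to, here is a minimal additive decomposition using a centered moving average; the synthetic series, period, and window length are assumptions for the sketch, not the paper's method.

```python
import numpy as np

# Synthetic series: linear trend + period-12 seasonality + noise.
rng = np.random.default_rng(1)
n, period = 120, 12
t = np.arange(n)
series = 0.5 * t + 10 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, n)

# A moving average over one full period estimates the trend component.
kernel = np.ones(period) / period
trend = np.convolve(series, kernel, mode="same")

# Average the detrended values at each seasonal position.
detrended = series - trend
seasonal = np.array([detrended[i::period].mean() for i in range(period)])
seasonal_full = np.tile(seasonal, n // period)

# Whatever remains after trend and seasonality is the residual term.
residual = series - trend - seasonal_full
```

The three components sum back to the original series by construction; imputation methods of the kind the abstract discusses typically exploit exactly this structure to fill gaps.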
This paper addresses the limited interpretability of current deep learning-based fake news identification methods, which often fail to incorporate background knowledge embedded in the news. It leverages commonsense kn...
With the escalation of global warming and human activities, large-scale wildfires have become increasingly frequent, posing significant threats to both ecological environments and human societal safety. Satellite remo...
Real-world data always exhibit an imbalanced and long-tailed distribution, which leads to poor performance for neural network-based classification. Existing methods mainly tackle this problem by reweighting the loss function or rebalancing the classes. However, one crucial aspect overlooked by previous research is the imbalanced feature space caused by the imbalanced angle distribution. In this paper, the authors shed light on the significance of the angle distribution in achieving a balanced feature space, which is essential for improving model performance under long-tailed distributions. Nevertheless, it is challenging to effectively balance both the classifier norms and the angle distribution due to problems such as the low feature norm. To tackle these challenges, the authors first thoroughly analyse the classifier and feature space by decoupling the classification logits into three key components: classifier norm (i.e., the magnitude of the classifier vector), feature norm (i.e., the magnitude of the feature vector), and cosine similarity between the classifier vector and the feature vector. In this way, the authors analyse the change of each component during training and reveal three critical problems that should be solved: the imbalanced angle distribution, the lack of feature discrimination, and the low feature norm. Motivated by this analysis, the authors propose a novel loss function that incorporates hyperspherical uniformity, additive angular margin, and feature norm regularisation. Each component of the loss function addresses a specific problem and synergistically contributes to achieving a balanced classifier and feature space. The authors conduct extensive experiments on three popular benchmark datasets: CIFAR-10/100-LT, ImageNet-LT, and iNaturalist 2018. The experimental results demonstrate that the authors' loss function outperforms several previous state-of-the-art methods in addressing the challenges posed by imbalanced and long-tailed datasets.
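The logit decoupling described in this abstract can be sketched numerically; the weight matrix and feature vector here are random stand-ins, not trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 64))   # classifier vectors, one row per class
f = rng.normal(size=(64,))      # a single feature vector

# Decouple each logit into classifier norm, feature norm, and cosine
# similarity:  logit_j = ||w_j|| * ||f|| * cos(theta_j)
w_norms = np.linalg.norm(W, axis=1)      # classifier norm per class
f_norm = np.linalg.norm(f)               # feature norm
cosines = (W @ f) / (w_norms * f_norm)   # cosine similarity per class

logits = W @ f
reconstructed = w_norms * f_norm * cosines
assert np.allclose(logits, reconstructed)  # the decomposition is exact
```

Tracking `w_norms`, `f_norm`, and the angle distribution implied by `cosines` separately over training is what lets each term of the proposed loss target one of the three problems identified above.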
In Collaborative Intelligence (CI), the Artificial Intelligence (AI) model is divided between the edge and the cloud, with intermediate features being sent from the edge to the cloud for inference. Several deep learni...
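The edge/cloud division described in this abstract can be illustrated with a toy split model; the layer shapes and split point are arbitrary assumptions, and no feature compression (the usual practical concern in CI) is shown.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-layer model split between edge and cloud (illustrative only).
W1 = rng.normal(size=(16, 8))   # edge-side layer
W2 = rng.normal(size=(8, 4))    # cloud-side layer

def edge(x):
    # The edge device runs the first layers and produces intermediate
    # features, which are what gets transmitted to the cloud.
    return np.maximum(W1.T @ x, 0.0)  # ReLU features

def cloud(feat):
    # The cloud receives the features and completes the inference.
    return W2.T @ feat

x = rng.normal(size=16)
output = cloud(edge(x))  # end-to-end split inference
assert output.shape == (4,)
```

In a real CI deployment the intermediate features would be quantized and entropy-coded before transmission, which is where the compression methods surveyed by such papers come in.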
In visual object tracking via unmanned aerial vehicle (UAV), discriminative correlation filtering (DCF) is one of the major methods owing to circulant samples which can be utilized not only for computing economically ...
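The "circulant samples" mentioned in this abstract are what make DCF computationally economical: a circulant data matrix diagonalizes in the Fourier basis, so filter responses to all cyclic shifts cost one frequency-domain multiply. A minimal 1-D sketch (not the paper's tracker):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 32
x = rng.normal(size=n)   # base sample (1-D for simplicity)
h = rng.normal(size=n)   # correlation filter

# Dense route: responses of the filter to every cyclic shift of x.
C = np.stack([np.roll(x, i) for i in range(n)])  # circulant data matrix
dense = C @ h

# FFT route: the circulant structure turns all n responses into a
# single element-wise product in the frequency domain.
fast = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(h)).real
assert np.allclose(dense, fast)
```

Real DCF trackers apply the same identity in 2-D to both filter training (ridge regression in the Fourier domain) and detection, which is what makes them fast enough for onboard UAV tracking.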