Nowadays, the Internet of Things (IoT) is widely deployed and brings great opportunities to change people's daily lives. To realize more effective human-computer interaction in IoT applications, the Question Answering (QA) systems embedded in IoT services are expected to improve the ability to understand natural language. Recently, the distributed representation of words, which captures richer semantic and syntactic information, has been playing an increasingly important role in QA systems. However, learning high-quality distributed word vectors requires substantial storage and computing resources, so the training cannot be deployed on resource-constrained IoT devices. Outsourcing the data and computation to cloud servers is a natural choice. However, directly uploading private data to untrusted clouds poses privacy risks. Therefore, realizing the word vector learning process over untrusted cloud servers without privacy leakage is an urgent and challenging problem. In this paper, we present a novel, efficient word vector learning scheme over encrypted data. We first design a series of arithmetic computation protocols. Then we use two non-colluding cloud servers to learn high-quality word vectors over encrypted data. The proposed scheme allows word vectors to be trained on remote cloud servers while protecting privacy. Security analysis and experiments on real data sets demonstrate that our scheme is more secure and efficient than existing privacy-preserving word vector learning schemes.
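The abstract does not spell out the arithmetic protocols, but schemes of this kind are commonly built on additive secret sharing between the two non-colluding servers, with Beaver triples enabling multiplication on shares. The sketch below only illustrates that standard building block; the field modulus, the trusted-dealer triple generation, and all function names are assumptions, not the paper's construction.

```python
# Minimal two-party additive secret sharing with Beaver-triple multiplication
# (illustrative assumption, not the scheme from the paper).
import random

P = 2**61 - 1  # prime field modulus for the shares (an assumption)

def share(x):
    """Split x into two additive shares, one per non-colluding server."""
    s0 = random.randrange(P)
    return s0, (x - s0) % P

def reconstruct(s0, s1):
    return (s0 + s1) % P

def beaver_triple():
    """Trusted-dealer triple (a, b, c) with c = a*b, handed out as shares."""
    a, b = random.randrange(P), random.randrange(P)
    return share(a), share(b), share((a * b) % P)

def secure_mul(x_sh, y_sh):
    """Multiply two shared values without revealing them."""
    (a0, a1), (b0, b1), (c0, c1) = beaver_triple()
    # Each server masks its own share; only the masked sums d, e are opened,
    # and they leak nothing because a and b are uniformly random.
    d = ((x_sh[0] - a0) + (x_sh[1] - a1)) % P   # d = x - a
    e = ((y_sh[0] - b0) + (y_sh[1] - b1)) % P   # e = y - b
    z0 = (c0 + d * b0 + e * a0 + d * e) % P     # server 0's share of x*y
    z1 = (c1 + d * b1 + e * a1) % P             # server 1's share of x*y
    return z0, z1

x_sh, y_sh = share(6), share(7)
print(reconstruct(*secure_mul(x_sh, y_sh)))      # 42
```

In a real deployment the triples would be produced offline and each server would hold only its own shares; the reconstruction call here is just for checking the result.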
The rapid accumulation of big data in the Internet era has gradually decelerated the progress of Artificial Intelligence (AI). As Moore's Law approaches its limit, it is imperative to break the constraints that are holding back AI. Quantum computing and artificial intelligence have been advancing along the highway of human civilization for many years, emerging as new engines driving economic and social development. This article delves into the integration of quantum computing and artificial intelligence in both research and application. It introduces the capabilities of universal quantum computers as well as special-purpose quantum computers that leverage quantum effects. The discussion further explores how quantum computing enhances classical artificial intelligence from four perspectives: quantum supervised learning, quantum unsupervised learning, quantum reinforcement learning, and quantum deep learning. In an effort to address the limitations of smart cities, the article examines the formidable potential of quantum artificial intelligence in this realm, covering aspects such as intelligent transportation, urban operation assurance, urban planning, and information communication, and showcasing a plethora of practical achievements in the field. In the foreseeable future, Quantum Artificial Intelligence (QAI) is poised to bring revolutionary development to smart cities. The urgent tasks are to develop quantum artificial intelligence algorithms compatible with quantum computers, construct an efficient, stable, and adaptive hybrid computing architecture that integrates quantum and classical computing, prepare quantum data as needed, and advance controllable qubit hardware that meets practical requirements. The ultimate goal is to shape the next generation of artificial intelligence with common-sense cognitive abilities, robustness, excellent generalization, and interpretability.
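To make the quantum supervised learning perspective concrete, here is a toy variational classifier simulated classically with NumPy: a scalar feature is encoded as a single-qubit RY rotation, a trainable RY follows, and the probability of measuring |1> serves as the prediction. The circuit, the synthetic data, and the training loop are illustrative assumptions and do not come from the article.

```python
# Toy single-qubit variational classifier, simulated with NumPy (assumption).
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def predict(x, theta):
    """Encode feature x as RY(x)|0>, apply trainable RY(theta), return P(|1>)."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])
    return float(state[1] ** 2)

# Tiny synthetic task: small angles -> label 0, angles near pi -> label 1.
xs = np.array([0.1, 0.2, 2.9, 3.0])
ys = np.array([0.0, 0.0, 1.0, 1.0])

def loss(theta):
    preds = np.array([predict(x, theta) for x in xs])
    return float(np.mean((preds - ys) ** 2))

theta, lr, eps = 1.5, 0.2, 1e-4          # deliberately poor initial angle
for _ in range(300):
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)  # finite differences
    theta -= lr * grad

print(round(theta, 2), [round(predict(x, theta), 2) for x in xs])
```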
Large-scale graphs usually exhibit global sparsity with local cohesiveness, and mining representative cohesive subgraphs is a fundamental problem in graph analysis. The k-truss is one of the most commonly studied cohesive subgraphs, in which each edge is contained in at least k - 2 triangles. A critical issue in mining a k-truss lies in computing the trussness of each edge, which is the maximum value of k for which the edge can appear in a k-truss. Existing works mostly focus on truss computation in static graphs using sequential algorithms. However, real-world graphs are constantly changing. In this paper, we study distributed truss computation in dynamic graphs. In particular, we compute the trussness of edges based on the local nature of the k-truss in a synchronized node-centric distributed framework. With the proposed distributed decomposition algorithm, the trussness of edges can be determined by relying only on local topological information. Moreover, the distributed maintenance algorithm only needs to update a small amount of dynamic information to complete the update. Extensive experiments have been conducted to show the scalability and efficiency of the proposed algorithm.
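For concreteness, the following is a compact sequential peeling sketch of truss decomposition, which assigns each edge the trussness defined above (the largest k for which the edge survives in a k-truss). The paper's algorithm is distributed and node-centric; this in-memory version is only meant to illustrate the quantity being computed.

```python
# Sequential truss decomposition by repeatedly peeling the lowest-support edge
# (illustrative sketch, not the paper's distributed algorithm).
def trussness(edge_list):
    adj = {}
    for u, v in edge_list:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    ekey = lambda a, b: (a, b) if a < b else (b, a)
    # support = number of triangles an edge currently participates in
    support = {ekey(u, v): len(adj[u] & adj[v]) for u, v in edge_list}
    truss, k = {}, 2
    while support:
        e = min(support, key=support.get)       # peel the weakest edge
        k = max(k, support[e] + 2)              # trussness never decreases along the peel order
        truss[e] = k
        u, v = e
        for w in adj[u] & adj[v]:               # triangles destroyed by removing e
            for f in (ekey(u, w), ekey(v, w)):
                if f in support:
                    support[f] -= 1
        adj[u].discard(v)
        adj[v].discard(u)
        del support[e]
    return truss

edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5)]
print(trussness(edges))  # triangle edges get trussness 3, the pendant edge (4, 5) gets 2
```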
As a form of future traffic system, a connected and automated vehicle (CAV) platoon is a typical nonlinear physical system. CAVs can communicate with each other and exchange information. However, communication failures can change the platoon system topology. To characterize this change, a dynamic topology-based car-following model and its generalized form are proposed in this paper. Then, a stability analysis method is presented. Finally, taking the dynamic cooperative intelligent driver model (DC-IDM) as an example, a series of numerical simulations is conducted to analyze platoon stability under different communication topology scenarios. The results show that communication failures reduce stability, but information from vehicles farther ahead and the use of a larger desired time headway can improve it. Moreover, the critical ratio of communication failures required to ensure stability under different driving parameters is studied in this work.
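As a reference point, the sketch below implements the standard Intelligent Driver Model (IDM) acceleration that the DC-IDM family extends, plus a hypothetical cooperative term that weights the IDM response to several preceding vehicles according to which communication links are alive. The weighting scheme, parameter values, and function names are assumptions for illustration, not the paper's DC-IDM.

```python
# Standard IDM plus a hypothetical multi-leader cooperative weighting (assumption).
import math

def idm_accel(v, gap, dv, v0=33.3, T=1.5, a_max=1.0, b=2.0, s0=2.0, delta=4):
    """IDM acceleration: v = own speed [m/s], gap = spacing to a leader [m],
    dv = approach rate (own speed minus leader speed) [m/s]."""
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0) ** delta - (s_star / max(gap, 0.1)) ** 2)

def cooperative_accel(v, gaps, dvs, weights):
    """Weighted IDM response to several leaders; the weights could reflect
    which communication links are currently working."""
    acc = sum(w * idm_accel(v, g, dv) for w, g, dv in zip(weights, gaps, dvs))
    return acc / sum(weights)

# Follower at 25 m/s; two leaders 30 m and 65 m ahead, both slightly slower.
print(cooperative_accel(25.0, gaps=[30.0, 65.0], dvs=[1.0, 0.5], weights=[0.7, 0.3]))
```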
AlphaFold2 (AF2) is an artificial intelligence (AI) system developed by DeepMind that can predict the three-dimensional (3D) structures of proteins from amino acid sequences with atomic-level accuracy. Protein structure prediction is one of the most challenging problems in computational biology and chemistry, and has puzzled scientists for 50 years. The advent of AF2 represents unprecedented progress in protein structure prediction and has attracted much attention. The release of structures of more than 200 million proteins predicted by AF2 further aroused great enthusiasm in the scientific community, especially in the fields of biology and medicine. AF2 is expected to have a significant impact on structural biology and on research areas that need protein structure information, such as drug discovery, protein design, and prediction of protein function. Although AF2 was developed only recently, there are already quite a few application studies of AF2 in biology and medicine, many of which have preliminarily demonstrated its potential. To better understand AF2 and promote its applications, we summarize in this article the principle and system architecture of AF2 as well as the recipe for its success, and particularly focus on reviewing its applications in the fields of biology and medicine. Limitations of current AF2 predictions will also be discussed.
For Internet forum Points of Interest (PoI), existing analysis methods usually lack usability analysis under different conditions and ignore long-term variation, which leads to blindness in method selection. To address this problem, this paper proposes a PoI variation prediction framework based on similarity analysis between long and short windows. Based on the framework, this paper presents five PoI analysis algorithms that fall into two categories: traditional sequence analysis methods, such as the autoregressive integrated moving average model (ARIMA) and support vector regression (SVR), and deep learning methods, such as the convolutional neural network (CNN), long short-term memory network (LSTM), and Transformer (TRM). Specifically, this paper first divides the observed data into long and short windows and extracts keywords as the PoI of each window. Then, the PoI similarities between long and short windows are calculated for training and prediction. Finally, a series of experiments is conducted on real Internet forum data. The results show that all five algorithms can predict PoI variations well, which indicates the effectiveness of the proposed framework. When the length of the long window is small, traditional methods perform better, and SVR is the best; otherwise, the deep learning methods show superiority, and LSTM performs best. These results provide useful references for selecting PoI variation analysis and prediction algorithms under different parameter configurations.
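A minimal sketch of the long/short-window similarity computation described above: keywords are extracted per window by term frequency, and the cosine similarity between the long-window and short-window keyword vectors is the quantity that the predictors (ARIMA, SVR, CNN, LSTM, TRM) would be trained on over time. The tokenizer, window lengths, and top-k cutoff are illustrative assumptions.

```python
# Long/short-window PoI similarity sketch (illustrative assumptions throughout).
import math
from collections import Counter

def keywords(posts, top_k=20):
    """Top-k term-frequency keywords over all posts in a window."""
    counts = Counter(word.lower() for post in posts for word in post.split())
    return dict(counts.most_common(top_k))

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def poi_similarity(posts, long_len, short_len):
    """Similarity between the PoI of the long (historical) window and the
    trailing short (recent) window."""
    return cosine(keywords(posts[-long_len:]), keywords(posts[-short_len:]))

posts = ["gpu price drop", "new gpu launch", "gpu benchmark leak", "phone camera review"]
print(round(poi_similarity(posts, long_len=4, short_len=2), 3))
```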
Accurately analyzing and predicting driver lane-changing intentions is of paramount importance, as it significantly enhances the safety of self-driving vehicles in their decision-making processes, holding great promis...
Biselection (feature and sample selection) enhances the efficiency and accuracy of machine learning models when handling large-scale data. Fuzzy rough sets, an uncertainty mathematics model known for its excellent int...
Pushing artificial intelligence (AI) from the central cloud to the network edge has reached broad consensus in both industry and academia for materializing the vision of the artificial intelligence of things (AIoT) in the sixth-generation (6G) era. This gives rise to an emerging research area known as edge intelligence, which concerns the distillation of human-like intelligence from the vast amount of data scattered at the wireless network edge. Typically, realizing edge intelligence corresponds to the processes of sensing, communication, and computation, which are coupled ingredients for data generation, exchange, and processing, respectively. However, conventional wireless networks design these three ingredients separately in a task-agnostic manner, which leads to difficulties in accommodating the stringent demands of ultra-low latency, ultra-high reliability, and high capacity in emerging AI applications such as autonomous driving and the metaverse. This prompts a new design paradigm of seamlessly integrated sensing, communication, and computation (ISCC) in a task-oriented manner, which comprehensively accounts for how the data are used in downstream AI tasks. In view of the growing interest, this study provides a timely overview of ISCC for edge intelligence by introducing its basic concept, design challenges, and enabling techniques, surveying state-of-the-art advancements, and shedding light on the road ahead.
Reasoning complex logical queries on incomplete and massive knowledge graphs (KGs) remains a significant challenge. The prevailing method for this problem is query embedding, which embeds KG units (i.e., entities and ...