Domain adaptation (DA) aims to find a subspace where the discrepancies between the source and target domains are reduced. Based on this subspace, a classifier trained on the labeled source samples can classify unlabeled target samples. Existing approaches leverage graph embedding learning to explore such a subspace. Unfortunately, due to 1) the interaction of the consistency and specificity between samples, and 2) the joint impact of degenerated features and incorrect labels in the samples, the existing approaches may assign unsuitable similarities, which restricts their performance. In this paper, we propose an approach called adaptive graph embedding with consistency and specificity (AGE-CS) to cope with these issues. AGE-CS consists of two methods: graph embedding with consistency and specificity (GECS), and adaptive graph embedding (AGE). GECS jointly learns the similarity of samples under geometric distance and semantic similarity metrics, while AGE adaptively adjusts the relative importance of geometric distance and semantic similarity during the iterations. Through AGE-CS, neighborhood samples with the same label are rewarded, while neighborhood samples with different labels are punished. As a result, compact structures are preserved, and advanced performance is achieved. Extensive experiments on five benchmark datasets demonstrate that the proposed method performs better than other graph embedding methods.
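To make the weighting idea concrete, here is a minimal Python sketch of how a geometric (distance-based) affinity and a semantic (label-agreement) affinity might be combined and re-balanced across iterations. The mixing rule, the update for the balance parameter, and all names are illustrative assumptions, not the authors' exact GECS/AGE formulation.

```python
import numpy as np

def combined_affinity(X, labels, alpha):
    """Mix a geometric (Gaussian-kernel) affinity with a semantic
    (label-agreement) affinity; alpha sets their relative importance."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    geometric = np.exp(-sq / (2.0 * np.median(sq) + 1e-12))
    # Reward neighborhood pairs with the same label, punish pairs with
    # different labels (the consistency/specificity intuition).
    semantic = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)
    return alpha * geometric + (1.0 - alpha) * semantic

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))      # toy features
y = rng.integers(0, 2, size=20)   # toy labels

alpha = 0.5                       # relative importance, re-estimated below
for _ in range(10):
    W = combined_affinity(X, y, alpha)
    # Hypothetical adaptive rule: trust the geometric view more when it
    # already ranks same-label pairs above different-label pairs.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    geo = np.exp(-sq / (2.0 * np.median(sq) + 1e-12))
    same = y[:, None] == y[None, :]
    margin = geo[same].mean() - geo[~same].mean()
    alpha = float(np.clip(alpha + 0.1 * np.sign(margin), 0.1, 0.9))
print("final alpha:", alpha)
```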
Most existing domain adaptation (DA) methods aim to explore favorable performance under complicated environments by sampling. However, there are three unsolved problems that limit their efficiency: i) they adopt global sampling but neglect to exploit global and local sampling simultaneously; ii) they transfer knowledge from either a global perspective or a local perspective, while overlooking the transmission of confident knowledge from both perspectives; and iii) they apply repeated sampling during iteration, which takes a lot of time. To address these problems, knowledge transfer learning via dual density sampling (KTL-DDS) is proposed in this study, which consists of three parts: i) dual density sampling (DDS), which jointly leverages two sampling methods associated with different views, i.e., global density sampling that extracts representative samples with the most common features, and local density sampling that selects representative samples with critical boundary information; ii) consistent maximum mean discrepancy (CMMD), which reduces intra- and cross-domain risks and guarantees high consistency of knowledge by shortening the distances between every two of the four subsets collected by DDS; and iii) knowledge dissemination (KD), which transmits confident and consistent knowledge from the representative target samples with global and local properties to the whole target domain by preserving the neighboring relationships of the target samples. Theoretical analyses show that DDS avoids repeated sampling during the iterations. With the above three actions, confident knowledge with both global and local properties is transferred, and the memory and running time are greatly reduced. In addition, a general framework named dual density sampling approximation (DDSA) is extended, which can be easily applied to other DA methods. Extensive experiments on five datasets in clean, label corruption (LC), feature missing (FM), and LC&FM environments demonstrate the encouraging performance of KTL-DDS.
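The CMMD component is described as shortening the distance between every two of the four subsets collected by DDS. A minimal sketch of that pairwise objective, using the standard RBF-kernel estimate of squared maximum mean discrepancy, is below; the subset names and toy data are placeholders, not the paper's actual construction.

```python
import numpy as np
from itertools import combinations

def mmd(X, Y, gamma=1.0):
    """RBF-kernel estimate of squared maximum mean discrepancy."""
    def k(A, B):
        d = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-gamma * d)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(1)
# Stand-ins for the four DDS subsets: globally and locally sampled
# representatives from the source and target domains.
subsets = {
    "source_global": rng.normal(0.0, 1.0, (30, 4)),
    "source_local":  rng.normal(0.2, 1.0, (30, 4)),
    "target_global": rng.normal(0.5, 1.0, (30, 4)),
    "target_local":  rng.normal(0.7, 1.0, (30, 4)),
}
# CMMD-style objective (as we read it): the sum of MMDs over every pair
# of subsets, i.e. all six distances among the four subsets.
total = sum(mmd(subsets[a], subsets[b]) for a, b in combinations(subsets, 2))
print("pairwise MMD objective:", round(total, 4))
```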
This paper presents a data-driven variable reduction approach to accelerate the computation of large-scale transmission-constrained unit commitment (TCUC). Lagrangian relaxation (LR) and mixed-integer linear programming (MILP) are popular approaches to solving TCUC. However, with many binary unit commitment variables, LR suffers from slow convergence and MILP presents a heavy computation burden. The proposed data-driven variable reduction approach consists of offline and online calculations to accelerate the computational performance of MILP-based large-scale TCUC problems. A database including multiple nodal net load intervals and the corresponding TCUC solutions is first built offline via the data-driven and all-scenario-feasible (ASF) approaches, and is then leveraged to efficiently solve new TCUC instances online. The on/off statuses of a considerable number of units can be fixed in the online calculation according to the database, which reduces the computation burden while guaranteeing good solution quality for new TCUC instances. A feasibility proposition is proposed to promptly check the feasibility of new TCUC instances with fixed binary variables, which can be used to dynamically tune the parameters of the binary variable fixing strategies and guarantee the existence of feasible UC solutions even when the system structure changes. Numerical tests illustrate the efficiency of the proposed approach.
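As an illustration of the online variable-fixing step, the sketch below fixes a unit's on/off status only when similar database instances agree unanimously, and leaves the rest as free binaries for the reduced MILP. The nearest-neighbor lookup and the unanimity threshold are our assumptions; the paper's ASF database construction and feasibility proposition are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy offline database: each entry pairs a 24-period net-load profile with
# the unit on/off schedule (units x periods) solved offline.
db_loads = rng.uniform(50.0, 150.0, size=(200, 24))
db_schedules = rng.integers(0, 2, size=(200, 10, 24))

def fix_binaries(new_load, k=5, agree=1.0):
    """Fix a unit-period status only when the k nearest database instances
    (by net-load distance) agree unanimously; everything else stays a free
    binary variable in the reduced MILP."""
    nearest = np.argsort(np.linalg.norm(db_loads - new_load, axis=1))[:k]
    votes = db_schedules[nearest].mean(axis=0)   # fraction of "on" votes
    fixed_on = votes >= agree
    fixed_off = votes <= 1.0 - agree
    free = ~(fixed_on | fixed_off)
    return fixed_on, fixed_off, free

on, off, free = fix_binaries(rng.uniform(50.0, 150.0, size=24))
print(f"fixed on: {on.sum()}, fixed off: {off.sum()}, free: {free.sum()}")
```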
Personalized recommender systems are becoming more popular as a way to reduce the issue of information overload. It is also observed that the recommendations provided by a multi-criteria recommender system (MCRS) are more accura...
Lightweight video representation techniques have advanced significantly for simple activity recognition, but they still encounter several issues when applied to complex activity recognition: (i) The presence of numero...
With the rapid evolution of Internet technology, fog computing has taken a major role in managing large amounts of data. The major concerns in this domain are security and privacy. Therefore, attaining a reliable level of confidentiality in the fog computing environment is a pivotal task. Among the different types of data stored in the fog, 3D point and mesh fog data have become increasingly popular in recent days due to the growth of 3D modelling and 3D printing technologies. Hence, in this research, we propose a novel scheme for preserving the privacy of 3D point and mesh fog data. Chaotic Cat map-based data encryption is a recently trending research area due to its unique properties, such as pseudo-randomness, deterministic nature, sensitivity to initial conditions, and ergodicity. To boost encryption efficiency significantly, in this work, we propose a novel Chaotic Cat map. The sequence generated by this map is used to transform the coordinates of the fog data. The improved range of the proposed map is depicted using bifurcation analysis. The quality of the proposed Chaotic Cat map is also analyzed using metrics such as the Lyapunov exponent and approximate entropy. We also demonstrate the performance of the proposed encryption framework against attacks such as brute-force and statistical attacks. The experimental results clearly show that the proposed framework produces the best results compared with previous works in the literature.
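For intuition, here is a sketch using the classic generalized Arnold cat map, the standard starting point for cat-map encryption, to scramble quantized point coordinates. The paper's improved chaotic map is not reproduced here, and the parameters below are arbitrary key material.

```python
import numpy as np

def cat_map(x, y, n_iter, N, p=1, q=1):
    """Generalized Arnold cat map, iterated n_iter times over the
    integer torus [0, N)^2; the transform matrix has determinant 1."""
    for _ in range(n_iter):
        x, y = (x + p * y) % N, (q * x + (p * q + 1) * y) % N
    return x, y

def inv_cat_map(x, y, n_iter, N, p=1, q=1):
    """Inverse map (inverse of the unimodular matrix, mod N)."""
    for _ in range(n_iter):
        x, y = ((p * q + 1) * x - p * y) % N, (-q * x + y) % N
    return x, y

# Scramble a toy 3D point cloud by mapping its quantized (x, y)
# coordinates; the key is (p, q, n_iter).
N = 1024
rng = np.random.default_rng(3)
points = rng.integers(0, N, size=(100, 3))
key = dict(p=3, q=5, n_iter=7)

enc = points.copy()
enc[:, 0], enc[:, 1] = cat_map(points[:, 0], points[:, 1], **key, N=N)

dec = enc.copy()
dec[:, 0], dec[:, 1] = inv_cat_map(enc[:, 0], enc[:, 1], **key, N=N)
assert np.array_equal(dec, points)  # the map is exactly invertible
```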
Pesticides have become more necessary in modern agricultural production. However, these pesticides have an unforeseeable long-term impact on people's wellbeing as well as the environment. Due to a shortage of basic pesticide exposure awareness, farmers typically utilize pesticides very close to harvest. Pesticide residues within foods, particularly fruits and vegetables, are a significant issue among farmers, merchants, and especially consumers. Most residual concentrations were far lower than the maximum allowable limits, with only a few surpassing the restrictions for such pesticides in food. There is an obligation to provide a warning about this level of pesticide use in agriculture. Earlier technologies failed to forecast the large number of pesticides that were dangerous to people, necessitating the development of improved detection and early warning systems. A novel methodology for verifying the status and evaluating the level of pesticides in regularly consumed vegetables and fruits is proposed, named the Hybrid Chronic Multi-Residual Framework (HCMF), in which the harmful level of pesticide residues is predicted for contamination in agro products using a Q-Learning-based recurrent neural network, and the predicted contamination levels are analyzed using complex event processing (CEP) over the given spatial and sequential data. The analysis results are used to minimize and effectively use pesticides in the agricultural field and also to ensure the safety of farmers and consumers. Overall, the technique is implemented in a Python environment, with the results showing that the proposed model achieves 98.57% accuracy and a training loss of 0.30.
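The CEP stage can be pictured as a stream rule that raises a warning when predicted residues exceed a maximum residue limit (MRL) repeatedly. The sketch below is a toy version of such a rule; the MRL values and window size are placeholders rather than regulatory figures, and the paper's Q-Learning RNN predictor is not reproduced.

```python
from collections import deque

# Placeholder maximum residue limits (mg/kg); real MRLs come from
# food-safety regulations, not from this sketch.
MRL = {"chlorpyrifos": 0.05, "malathion": 0.5}

def cep_warn(stream, window=3):
    """Toy complex-event-processing rule: raise a warning when the
    predicted residue exceeds the MRL in `window` consecutive readings."""
    recent = deque(maxlen=window)
    for pesticide, level in stream:
        recent.append(level > MRL[pesticide])
        if len(recent) == window and all(recent):
            yield f"WARN: {pesticide} above limit for {window} readings"

predicted = [("chlorpyrifos", 0.02), ("chlorpyrifos", 0.06),
             ("chlorpyrifos", 0.07), ("chlorpyrifos", 0.08)]
for alert in cep_warn(predicted):
    print(alert)
```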
The convolution layer in a convolutional neural network (CNN) is highly computationally intensive. It is crucial to design reusable low-cost hardware IP for the convolutional layer for enabling hardware-based feature extr...
In recent years, deep learning and machine learning methods have been extensively employed in almost every field due to their capability of data processing and analysis. These are the subdomains of Artificial intellig...
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose certain challenges to its rapid development. Blockchain technology has immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In the traditional blockchain, data is stored in a Merkle tree. As data continues to grow, the scale of the proofs used to validate it grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Therefore, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new vector commitment (VC) structure, Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Secondly, this paper uses PVC instead of the Merkle tree to store big data generated by the IIoT, which improves the efficiency of traditional VC in the commitment and verification processes. Finally, this paper uses PVC to build a blockchain-based IIoT data security storage mechanism and carries out a comparative experimental analysis. The mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
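To see why partitioning helps, the toy hash-based commitment below splits a vector into chunks, commits to the list of chunk digests, and opens a position with just its own chunk plus the other digests, so a proof touches one partition rather than the whole vector. This is only an intuition sketch under our own naming; real vector commitments (and the paper's PVC construction) use different cryptographic machinery and achieve constant-size openings.

```python
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

def commit(vector, parts=4):
    """Split the vector into chunks, digest each chunk, then digest the
    list of chunk digests to form the root commitment."""
    size = (len(vector) + parts - 1) // parts
    chunks = [vector[i:i + size] for i in range(0, len(vector), size)]
    digests = [h(*chunk) for chunk in chunks]
    return h(*digests), chunks, digests

def open_position(chunks, digests, i):
    """Proof for position i: its own chunk plus all chunk digests,
    instead of the whole vector."""
    size = len(chunks[0])
    return chunks[i // size], digests

def verify(root, value, i, proof):
    chunk, digests = proof
    size = len(chunk)
    ok_chunk = h(*chunk) == digests[i // size] and chunk[i % size] == value
    return ok_chunk and h(*digests) == root

data = [str(x).encode() for x in range(16)]
root, chunks, digests = commit(data)
proof = open_position(chunks, digests, 5)
print(verify(root, b"5", 5, proof))  # True
```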