Due to the inherently insecure nature of the Internet, it is crucial to ensure the secure transmission of image data over this medium. Moreover, given the limitations of computers, it becomes even more important to employ efficient and fast image encryption algorithms. Although 1D chaotic maps offer a practical approach to real-time image encryption, their limited flexibility and increased vulnerability restrict their practical use. In this research, we have utilized a 3D Hindmarsh-Rose model to construct a secure chaotic system. The randomness of the chaotic map is assessed through standard randomness tests. The proposed system enhances security by incorporating an increased number of system parameters and a wide range of chaotic parameters, as well as ensuring a uniform distribution of chaotic signals across the entire value range. Additionally, a fast image encryption technique utilizing the new chaotic system is developed. The novelty of the approach is confirmed through time complexity analysis. To further strengthen the resistance against cryptanalysis attacks and differential attacks, the SHA-256 algorithm is employed for secure key generation. Experimental results across a number of parameters demonstrate the strong cryptographic performance of the proposed image encryption approach, highlighting its exceptional suitability for secure communication. Finally, the security of the proposed scheme has been compared with state-of-the-art image encryption schemes, and all comparison metrics indicate the superior performance of the proposed scheme.
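The general pipeline described above (a chaotic map seeded by a SHA-256-derived key, whose output stream diffuses the image bytes) can be sketched in a few lines. This is a minimal illustration only: it substitutes a simple 1D logistic map for the paper's 3D Hindmarsh-Rose model, all names are hypothetical, and a bare XOR stream cipher like this is not cryptographically secure.

```python
import hashlib

def chaotic_keystream(key: bytes, length: int) -> bytes:
    """Derive a byte keystream from a logistic map seeded via SHA-256 of the key."""
    digest = hashlib.sha256(key).digest()
    # Map the first 8 digest bytes to an initial condition in (0, 1).
    x = (int.from_bytes(digest[:8], "big") % (10 ** 8)) / 10 ** 8 or 0.5
    r = 3.99  # control parameter chosen in the chaotic regime of the logistic map
    stream = bytearray()
    for _ in range(length):
        x = r * x * (1.0 - x)            # logistic-map iteration
        stream.append(int(x * 256) % 256)
    return bytes(stream)

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR the data with the chaotic keystream (encryption and decryption coincide)."""
    ks = chaotic_keystream(key, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

pixels = bytes(range(16))                # stand-in for flattened image data
cipher = xor_encrypt(pixels, b"secret-key")
recovered = xor_encrypt(cipher, b"secret-key")  # XOR is its own inverse
```

Because the keystream depends on the SHA-256 digest of the key, even a one-bit key change yields a completely different initial condition, which is the property the abstract invokes against differential attacks.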
The ability to capture fine spectral discriminative information enables hyperspectral images (HSIs) to observe, detect, and identify objects with subtle spectral discrepancies. However, the captured HSIs may not represent the true distribution of ground objects, and the reflectance received at imaging instruments may be degraded, owing to environmental disturbances, atmospheric effects, and sensors' hardware limitations. These degradations include, but are not limited to, complex noise, heavy stripes, deadlines, cloud/shadow occlusion, blurring, and spatial-resolution degradation. They dramatically reduce the quality and usefulness of HSIs. Low-rank tensor approximation (LRTA) is one such emerging technique, having gained much attention in the HSI restoration community, with an ever-growing theoretical foundation and pivotal technological innovation. Compared to low-rank matrix approximation (LRMA), LRTA characterizes more complex intrinsic structures of high-order data and offers more efficient learning abilities, and it has been established to address the convex and non-convex inverse optimization problems induced by HSI restoration. This survey presents a cutting-edge and comprehensive technical review of LRTA for HSI restoration, focusing on six topics: denoising, fusion, destriping, inpainting, deblurring, and super-resolution. For each topic, state-of-the-art restoration methods are introduced, with quantitative and visual performance assessments. Open issues and challenges are also presented, including model formulation, algorithm design, prior exploration, and applications concerning interpretation requirements.
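The low-rank idea behind LRTA can be illustrated in the simpler matrix (LRMA) case: projecting a corrupted patch onto its best rank-1 approximation suppresses sparse perturbations. The sketch below, a hypothetical pure-Python example rather than any method from the survey, computes the rank-1 approximation by power iteration.

```python
def rank1_approx(M, iters=50):
    """Best rank-1 approximation of a nonzero matrix M via power iteration."""
    rows, cols = len(M), len(M[0])
    v = [1.0] * cols
    for _ in range(iters):
        # Alternate u = M v and v = M^T u, normalizing each time.
        u = [sum(M[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        nu = sum(x * x for x in u) ** 0.5
        u = [x / nu for x in u]
        v = [sum(M[i][j] * u[i] for i in range(rows)) for j in range(cols)]
        nv = sum(x * x for x in v) ** 0.5   # converges to the top singular value
        v = [x / nv for x in v]
    return [[nv * u[i] * v[j] for j in range(cols)] for i in range(rows)]

# A rank-1 "image patch" corrupted at one pixel: the rank-1 projection
# spreads out the outlier, which is the essence of low-rank denoising.
clean = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [3.0, 6.0, 9.0]]
noisy = [row[:] for row in clean]
noisy[1][1] += 0.5
denoised = rank1_approx(noisy)
```

LRTA generalizes this decomposition from matrices to higher-order tensors, which is what lets it exploit the joint spatial-spectral structure of an HSI cube instead of unfolding it band by band.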
Advanced Encryption Standard (AES) is one of the most widely used symmetric ciphers for the confidentiality of data. It is also used for other security services, viz. integrity, authentication, and key establishment. Ho...
People with mobility limitations and impairments frequently receive repetitive remedial physiotherapy sessions to reduce functional deficits in the afflicted area. Even though therapy processes can enhance functional ...
The metaverse has rekindled human beings' desire to further break space-time barriers by fusing the virtual and real worlds. However, security and privacy threats hinder us from building a utopia. A metaverse embraces various techniques, while at the same time inheriting their pitfalls and thus exposing large attack surfaces. Blockchain, proposed in 2008, was regarded as a key building block of metaverses. It enables transparent and trusted computing environments using tamper-resistant decentralized ledgers. Furthermore, blockchain supports Decentralized Finance (DeFi) and Non-fungible Tokens (NFT) for metaverses. However, the power of a blockchain has not been sufficiently exploited. In this article, we propose a novel trustless architecture for a blockchain-enabled metaverse, aiming to provide efficient resource integration and allocation by consolidating hardware and software resources. To realize our design objectives, we provide an On-Demand Trusted Computing Environment (OTCE) technique based on local trust evaluation. Specifically, the architecture adopts a hypergraph to represent a metaverse, in which each hyperedge links a group of users with certain attributes. Then the trust level of each user group can be evaluated based on graph analytics techniques. Based on the trust value, each group can determine its security plan on demand, free from interference by irrelevant parties. In this way, OTCEs enable large-scale and flexible application environments (sandboxes) while preserving a strong security guarantee.
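The hypergraph-based group trust idea can be made concrete with a toy model. Everything below is a hypothetical illustration, not the OTCE design: user trust scores, the mean-based group trust, and the threshold-based plan choice are all assumptions made for the sketch.

```python
# Hypothetical toy model: users carry trust scores in [0, 1]; each hyperedge
# is a user group sharing some attribute, and a group's trust level is the
# mean of its members' scores.
user_trust = {"alice": 0.9, "bob": 0.6, "carol": 0.8, "dave": 0.3}

hyperedges = {                       # hyperedge name -> set of member users
    "defi_traders": {"alice", "bob"},
    "nft_collectors": {"bob", "carol", "dave"},
}

def group_trust(edge: str) -> float:
    """Aggregate member trust into a group-level trust value."""
    members = hyperedges[edge]
    return sum(user_trust[u] for u in members) / len(members)

def security_plan(edge: str, threshold: float = 0.7) -> str:
    """Each group picks its on-demand security plan from its own trust level."""
    return "lightweight-sandbox" if group_trust(edge) >= threshold else "strict-sandbox"

plan_a = security_plan("defi_traders")    # mean trust 0.75 -> lightweight
plan_b = security_plan("nft_collectors")  # mean trust ~0.57 -> strict
```

The point of the sketch is locality: each hyperedge decides its sandbox policy from its own members only, so a low-trust group elsewhere in the hypergraph does not force stricter controls on everyone.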
Domain adaptation (DA) aims to find a subspace where the discrepancies between the source and target domains are reduced. Based on this subspace, the classifier trained on the labeled source samples can classify unlabeled target samples accurately. Many approaches leverage Graph Embedding Learning to explore such a subspace. Unfortunately, due to 1) the interaction of the consistency and specificity between samples, and 2) the joint impact of degenerated features and incorrect labels in the samples, the existing approaches might assign unsuitable similarities, which restricts their performance. In this paper, we propose an approach called adaptive graph embedding with consistency and specificity (AGE-CS) to cope with these issues. AGE-CS consists of two methods, i.e., graph embedding with consistency and specificity (GECS) and adaptive graph embedding (AGE). GECS jointly learns the similarity of samples under geometric distance and semantic similarity metrics, while AGE adaptively adjusts the relative importance between the geometric distance and semantic similarity during the iterations. By AGE-CS, neighborhood samples with the same label are rewarded, while neighborhood samples with different labels are punished. As a result, compact structures are preserved, and advanced performance is achieved. Extensive experiments on five benchmark datasets demonstrate that the proposed method performs better than other Graph Embedding methods.
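The reward/punish behavior described above can be sketched by combining a geometric affinity with a label-based semantic term under a trade-off weight. The formulas and names below are illustrative assumptions, not the AGE-CS objective; in the paper the trade-off is learned adaptively rather than fixed.

```python
import math

def geometric_affinity(x, y):
    """Gaussian-style affinity: closer samples get a higher score."""
    dist2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-dist2)

def semantic_affinity(label_x, label_y):
    """Reward same-label neighbors (+1), punish different-label neighbors (-1)."""
    return 1.0 if label_x == label_y else -1.0

def combined_similarity(x, y, lx, ly, mu):
    """Blend the two views; mu plays the role of the adaptive trade-off weight."""
    return mu * geometric_affinity(x, y) + (1 - mu) * semantic_affinity(lx, ly)

a, b = [0.0, 0.0], [0.1, 0.0]   # two geometrically close samples
s_same = combined_similarity(a, b, "cat", "cat", mu=0.5)  # both terms agree
s_diff = combined_similarity(a, b, "cat", "dog", mu=0.5)  # semantic term pulls down
```

Even though the two samples are equally close geometrically, the label-disagreeing pair ends up with a much lower similarity, which is exactly the compact-structure effect the abstract describes.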
In a number of industries, including computer graphics, robotics, and medical imaging, three-dimensional reconstruction is essential. In this research, a CNN-based Multi-output and Multi-Task Regressor with deep learn...
In most scientific research, feature selection is a challenge for researchers. Using all available features is not an option, as it usually complicates the research and leads to performance drops when dealing with large datasets. On the other hand, ignoring some features can compromise the data integrity. Rough set theory presents a good technique to identify the redundant features, which can be dismissed without losing any valuable information; however, exploring all possible combinations of features leads to an NP-hard problem. In this research we propose adopting a heuristic algorithm to solve this problem. Polar Bear Optimization (PBO) is a metaheuristic algorithm that provides an effective technique for solving this kind of optimization problem. Unlike other heuristic algorithms, it proposes a dynamic birth-and-death mechanism that keeps investing in promising solutions and keeps dismissing hopeless ones. To evaluate its efficiency, we applied our proposed model to several datasets and measured the quality of the obtained minimal feature set, showing that redundant features were removed without data loss.
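The rough-set notion of a redundant feature can be shown on a tiny decision table: a feature subset is acceptable if rows that agree on the selected features never disagree on the decision. The greedy scan below is only an illustration of that consistency test; the paper searches feature subsets with PBO instead of this exhaustive-style loop.

```python
# Toy decision table: each row is ((f0, f1, f2), decision).
rows = [
    ((0, 0, 1), "no"),
    ((0, 1, 1), "no"),
    ((1, 0, 0), "yes"),
    ((1, 1, 0), "yes"),
]

def consistent(features):
    """True if projecting rows onto `features` never maps one key to two decisions."""
    seen = {}
    for attrs, decision in rows:
        key = tuple(attrs[i] for i in features)
        if seen.setdefault(key, decision) != decision:
            return False
    return True

# Greedily drop each feature whose removal keeps the table consistent.
reduct = [0, 1, 2]
for f in [2, 1, 0]:
    trial = [g for g in reduct if g != f]
    if trial and consistent(trial):
        reduct = trial
```

Here f0 alone already determines the decision, so f1 and f2 are dismissed as redundant, the kind of information-preserving reduction the abstract describes.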
Multimedia systems and the metaverse have been gaining increasing interest for education in virtual environments. With the wide adoption of these technologies, the data is expected to grow exponentially. The increasing rise of...
Most existing domain adaptation (DA) methods aim to explore favorable performance under complicated environments. However, there are three unsolved problems that limit their efficiencies: ⅰ) they adopt global sampling but neglect to exploit global and local sampling simultaneously; ⅱ) they either transfer knowledge from a global perspective or a local perspective, while overlooking transmission of confident knowledge from both perspectives; and ⅲ) they apply repeated sampling during iteration, which takes a lot of time. To address these problems, knowledge transfer learning via dual density sampling (KTL-DDS) is proposed in this study, which consists of three parts: ⅰ) dual density sampling (DDS), which jointly leverages two sampling methods associated with different views, i.e., global density sampling that extracts representative samples with the most common features, and local density sampling that selects representative samples with critical boundary information; ⅱ) consistent maximum mean discrepancy (CMMD), which reduces intra- and cross-domain risks and guarantees high consistency of knowledge by shortening the distances between every two of the four subsets collected by DDS; and ⅲ) knowledge dissemination (KD), which transmits confident and consistent knowledge from the representative target samples with global and local properties to the whole target domain by preserving the neighboring relationships of the target samples. Theoretical analyses show that DDS avoids repeated sampling during the iterations. With the above three actions, confident knowledge with both global and local properties is transferred, and the memory and running time are greatly reduced. In addition, a general framework named dual density sampling approximation (DDSA) is extended, which can be easily applied to other DA methods. Extensive experiments on five datasets in clean, label corruption (LC), feature missing (FM), and LC&FM environments demonstrate the encouraging performance of KTL-DDS.
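The distance-shortening ingredient of CMMD builds on the standard empirical maximum mean discrepancy (MMD). The sketch below computes a plain Gaussian-kernel MMD² between two small sample sets; it illustrates only the generic MMD, with a hypothetical bandwidth choice, not the consistent variant or the DDS subset pairing from the paper.

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two vectors."""
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    """Biased empirical MMD^2 between sample sets X and Y under the RBF kernel."""
    k = lambda a, b: gaussian_kernel(a, b, sigma)
    xx = sum(k(a, b) for a in X for b in X) / (len(X) ** 2)
    yy = sum(k(a, b) for a in Y for b in Y) / (len(Y) ** 2)
    xy = sum(k(a, b) for a in X for b in Y) / (len(X) * len(Y))
    return xx + yy - 2 * xy

source = [[0.0], [0.1], [0.2]]
near = [[0.05], [0.15], [0.25]]   # nearly the same distribution as source
far = [[5.0], [5.1], [5.2]]       # a shifted distribution
close_gap = mmd2(source, near)    # small: domains almost aligned
wide_gap = mmd2(source, far)      # large: domains far apart
```

Minimizing such a discrepancy over every pair of the four DDS subsets is what the abstract means by shortening distances to keep the transferred knowledge consistent across domains.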