One of the drastically growing and emerging research areas used in most information technology industries is Big Data analytics. Big Data is created from social websites like Facebook, WhatsApp, Twitter, etc. Opinions about products, persons, initiatives, political issues, research achievements, and entertainment are discussed on social media. A unique data analytics method cannot be applied to various social websites since the data formats are different. Many approaches, techniques, and tools have been used for big data analytics, opinion mining, or sentiment analysis, but the accuracy is yet to be improved. The proposed work is motivated to do sentiment analysis on Twitter data for clothing products using Simulated Annealing incorporated with the Multiclass Support Vector Machine (SA-MSVM) approach. SA-MSVM is a hybrid heuristic approach for selecting and classifying text-based sentimental words following the Natural Language Processing (NLP) process applied on tweets extracted from the Twitter dataset. A simulated annealing algorithm searches for relevant features and selects and identifies the sentimental terms that customers express. SA-MSVM is implemented and evaluated in MATLAB, and the results are analyzed. The results concluded that SA-MSVM has more potential in sentiment analysis and classification than the existing Support Vector Machine (SVM) approach. SA-MSVM obtained 96.34% accuracy in classifying the product reviews compared with the existing systems.
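As a rough illustration of the kind of pipeline this abstract describes, the sketch below couples simulated-annealing term selection with a multiclass linear SVM in Python. The bag-of-words features, the flip-one-term move, the cooling schedule, and the toy review texts are illustrative assumptions; the paper's actual SA-MSVM system is implemented in MATLAB and its exact operators are not reproduced here.

    # Sketch: simulated annealing selects a subset of terms, scored by the
    # cross-validated accuracy of a multiclass linear SVM on those terms.
    import math, random
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import LinearSVC

    def sa_feature_selection(X, y, n_iter=200, t0=1.0, cooling=0.95, seed=0):
        rng = random.Random(seed)
        n_features = X.shape[1]
        mask = np.ones(n_features, dtype=bool)      # start with every term selected

        def score(m):
            if not m.any():
                return 0.0
            clf = LinearSVC()                       # one-vs-rest multiclass SVM
            return cross_val_score(clf, X[:, m], y, cv=3).mean()

        best_mask, best_score = mask.copy(), score(mask)
        cur_mask, cur_score, temp = mask.copy(), best_score, t0
        for _ in range(n_iter):
            cand = cur_mask.copy()
            cand[rng.randrange(n_features)] ^= True  # flip one term in or out
            cand_score = score(cand)
            # accept improvements, or worse moves with temperature-dependent probability
            if cand_score > cur_score or rng.random() < math.exp((cand_score - cur_score) / temp):
                cur_mask, cur_score = cand, cand_score
                if cur_score > best_score:
                    best_mask, best_score = cur_mask.copy(), cur_score
            temp *= cooling                          # cool down
        return best_mask, best_score

    # Tiny illustrative run (real labelled tweets would replace these placeholders):
    texts = ["love this shirt", "great colour and quality", "perfect fit, very happy",
             "worst fabric ever", "terrible stitching", "awful, returned it",
             "okay fit nothing special", "average jeans", "acceptable but plain"]
    labels = np.array([2, 2, 2, 0, 0, 0, 1, 1, 1])   # 0=negative, 1=neutral, 2=positive
    X = TfidfVectorizer().fit_transform(texts).toarray()
    mask, acc = sa_feature_selection(X, labels, n_iter=30)
    print(f"selected {mask.sum()} of {mask.size} terms, CV accuracy {acc:.2f}")

The flip-one-term neighbourhood and geometric cooling are the standard simulated-annealing ingredients; a richer move set or a different scoring function can be dropped into the same loop.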
Human beings are often affected by a wide range of skin diseases, which can be attributed to genetic factors and environmental influences, such as exposure to sunshine with ultraviolet (UV) radiation. If left untreated, these diseases can have severe consequences and spread, especially among children. Early detection is crucial to prevent their spread and improve a patient's chances of recovery. Dermatology, the branch of medicine dealing with skin diseases, faces challenges in accurately diagnosing these conditions due to the difficulty in identifying and distinguishing between different diseases based on their appearance, type of skin, and ***. This study presents a method for detecting skin diseases using Deep Learning (DL), focusing on the most common diseases affecting children in Saudi Arabia due to the high UV value in most of the year, especially in the summer. The method utilizes various Convolutional Neural Network (CNN) architectures to classify skin conditions such as eczema, psoriasis, and ***. The proposed method demonstrates high accuracy rates of 99.99% and 97% using the well-known and effective transfer learning models MobileNet and DenseNet121, respectively. This illustrates the potential of DL in automating the detection of skin diseases and offers a promising approach for early diagnosis and treatment.
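A minimal transfer-learning sketch in the spirit of the approach described above is shown below, using Keras with a frozen MobileNet backbone. The dataset directory, image size, class count, and training settings are illustrative assumptions rather than the study's actual setup.

    # Sketch: transfer learning with a frozen MobileNet backbone for
    # skin-condition classification. Paths and class count are assumptions.
    import tensorflow as tf

    IMG_SIZE = (224, 224)
    NUM_CLASSES = 3                      # e.g. eczema, psoriasis, and one more condition

    # Assumed layout: one sub-folder per class under skin_dataset/train.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "skin_dataset/train", image_size=IMG_SIZE, batch_size=32)

    base = tf.keras.applications.MobileNet(
        include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
    base.trainable = False               # keep the pretrained features frozen

    model = tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNet expects [-1, 1]
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=10)
    # Swapping in tf.keras.applications.DenseNet121 follows the same pattern,
    # with that model's own input preprocessing.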
In recent years, numerous CNN-based light field (LF) image super-resolution (SR) methods have been developed. However, due to the downsampling inconsistency between low-resolution (LR) testing LF images and LR trainin...
Phishing attacks have risen like wildfire in the era of new digital technology, exploiting holes in the electronic communication process and deceiving users into revealing private information. This work provi...
Existing deep learning-based point cloud denoising methods are generally trained in a supervised manner that requires clean data as ground-truth labels. However, in practice, it is not always feasible to obtain clean point clouds. In this paper, we introduce a novel unsupervised point cloud denoising method that eliminates the need to use clean point clouds as ground-truth labels during training. We demonstrate that it is feasible for neural networks to only take noisy point clouds as input, and learn to approximate and restore their clean counterparts. In particular, we generate two noise levels for the original point clouds, requiring the second noise level to be twice the amount of the first noise level. With this, we can deduce the relationship between the displacement information that recovers the clean surfaces across the two levels of noise, and thus learn the displacement of each noisy point in order to recover the corresponding clean point. Extensive experiments demonstrate that our method achieves outstanding denoising results across various datasets with synthetic and real-world noise, obtaining better performance than previous unsupervised methods and competitive performance with current supervised methods.
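To make the two-noise-level idea concrete, the following is a simplified, Noise2Noise-style sketch: a per-point network is trained to displace a self-generated noisier copy of the cloud toward a less noisy copy, using only noisy data. The toy MLP, the Gaussian noise model, and the paired L2 loss are illustrative assumptions; the paper's actual architecture and the exact deduction across the two noise levels may differ.

    # Sketch: unsupervised displacement learning from two self-generated noise
    # levels; the network never sees clean labels.
    import torch
    import torch.nn as nn

    SIGMA = 0.01          # first noise level; the second is twice this amount

    class DisplacementNet(nn.Module):
        """Toy per-point MLP predicting a 3D displacement for each input point."""
        def __init__(self, hidden=128):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(3, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 3))

        def forward(self, pts):            # pts: (N, 3)
            return self.mlp(pts)

    net = DisplacementNet()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    def train_step(points):
        """points: (N, 3) original point cloud; no clean labels are ever used."""
        noisy_1 = points + torch.randn_like(points) * SIGMA          # first level
        noisy_2 = points + torch.randn_like(points) * 2 * SIGMA      # second level, twice the first
        # Displace the noisier copy onto the less noisy one. Because the two
        # noise samples are independent and zero-mean, the optimal displacement
        # points toward the underlying surface rather than toward the particular
        # noise realisation of noisy_1.
        disp = net(noisy_2)
        loss = ((noisy_2 + disp - noisy_1) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        return loss.item()

    # After training, denoising is a single displacement step:
    # denoised = noisy_points + net(noisy_points)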
Blockchain and the Internet of Things (IoT), two of the most rapidly emerging technologies, are already reconfiguring our digital future, as reflected in the drastic change in the current network architecture. The incorporati...
Today's deep learning models face an increasing demand to handle dynamic shape tensors and computation whose shape information remains unknown at compile time and varies in a nearly infinite range at runtime. This shape dynamism brings tremendous challenges for existing compilation pipelines designed for static models, which optimize tensor programs relying on exact shape values. This paper presents TSCompiler, an end-to-end compilation framework for dynamic shape models. TSCompiler first proposes a symbolic shape propagation algorithm to recover symbolic shape information at compile time to enable subsequent optimizations. TSCompiler then partitions the shape-annotated computation graph into multiple subgraphs and fine-tunes the backbone operators within each subgraph in a hardware-aligned search space to find a collection of high-performance schedules. Based on dependence analysis, TSCompiler can propagate the explored backbone schedules to other fusion groups within the same subgraph to generate a set of parameterized tensor programs for fused cases. At runtime, TSCompiler utilizes an occupancy-targeted cost model to select from pre-compiled tensor programs for varied tensor shapes. Extensive evaluations show that TSCompiler achieves state-of-the-art speedups for dynamic shape models. For example, it improves kernel efficiency by up to 3.97x on an NVIDIA RTX 3090 and up to 10.30x on an NVIDIA A100, and achieves up to five orders of magnitude speedups in end-to-end latency.
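To make compile-time symbolic shape propagation concrete, the toy sketch below pushes symbolic dimensions through a matrix multiply and a reshape using sympy. The symbol names, operator rules, and -1 reshape handling are illustrative stand-ins, not TSCompiler's actual intermediate representation or API.

    # Sketch: propagating symbolic shapes through a tiny operator graph so that
    # later passes can reason about dimensions unknown at compile time.
    import operator
    from functools import reduce

    import sympy as sp

    # Dimensions unknown at compile time become symbols.
    B, S = sp.symbols("batch seq_len", positive=True, integer=True)

    def matmul_shape(a, b):
        """Shape rule for a matrix multiply on the trailing two dimensions."""
        assert a[-1] == b[0], f"inner dims must agree: {a[-1]} vs {b[0]}"
        return a[:-1] + b[1:]

    def reshape_shape(a, new_shape):
        """Shape rule for reshape; a single -1 is solved from the element count."""
        total = reduce(operator.mul, a, sp.Integer(1))
        known = reduce(operator.mul, [d for d in new_shape if d != -1], sp.Integer(1))
        return tuple(total / known if d == -1 else d for d in new_shape)

    # Propagate symbolic shapes through a tiny graph:
    x = (B, S, 768)                       # input activations
    w = (768, 3072)                       # weight matrix
    h = matmul_shape(x, w)                # -> (batch, seq_len, 3072)
    y = reshape_shape(h, (-1, 3072))      # -> (batch*seq_len, 3072)
    print(h, y)

Keeping shapes as symbolic expressions like batch*seq_len is what lets later passes group operators and parameterize generated tensor programs instead of requiring one compiled variant per concrete shape.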
In linguistics, all languages can be considered as symbolic systems, with each language relying on symbolic processes to associate specific symbols with meanings. In the same language, there is a fixed correspondence ...
Online reviews play an integral part in making mobile applications stand out from the large number of applications available on the Google Play store. Predominantly, users consider posted reviews for appropriate app s...
Blockchain can realize the reliable storage of a large amount of data that is chronologically related and verifiable within the system. Blockchain technology has been widely used and has developed rapidly in big data systems across various fields. An increasing number of users are participating in application systems that use blockchain as their underlying technology. As the number of transactions and the capital involved in blockchain grow, ensuring information security becomes crucial. Therefore, the verification of transactional information security and privacy has emerged as a critical issue. Blockchain-based verification methods can effectively eliminate the need for centralized third-party institutions. However, the efficiency of nodes in storing and verifying blockchain data faces unprecedented challenges. To address this issue, this paper introduces an efficient verification scheme for transaction security. First, it presents a node evaluation module to estimate the activity level of user nodes participating in transactions, accompanied by a probabilistic analysis for all nodes. Subsequently, this paper optimizes the conventional transaction organization form, introduces a heterogeneous Merkle tree storage structure, and designs algorithms for constructing these heterogeneous trees. Theoretical analyses and simulation experiments conclusively demonstrate the superior performance of this scheme. When verifying the same number of transactions, the heterogeneous Merkle tree transmits less data and is more efficient than traditional methods. These findings indicate that the heterogeneous Merkle tree structure is suitable for various blockchain applications, including the Internet of Things. This scheme can markedly enhance the efficiency of information verification and bolster the security of distributed systems.
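As a rough illustration of why Merkle-style storage keeps verification traffic small, the sketch below builds a plain binary Merkle tree and verifies one transaction using only the sibling hashes along its root path. The heterogeneous tree construction and node-activity evaluation of the scheme above are not reproduced here; the hash function, odd-level padding rule, and proof format are ordinary Merkle-tree conventions chosen for illustration.

    # Sketch: a plain binary Merkle tree with membership proofs, showing why
    # verifying one transaction needs only O(log n) sibling hashes.
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def build_tree(leaves):
        """Return all levels, from hashed leaves up to the root."""
        level = [h(x) for x in leaves]
        levels = [level]
        while len(level) > 1:
            if len(level) % 2:                  # duplicate the last node on odd levels
                level = level + [level[-1]]
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            levels.append(level)
        return levels

    def prove(levels, index):
        """Collect the sibling hash at every level for the leaf at `index`."""
        proof = []
        for level in levels[:-1]:
            if len(level) % 2:
                level = level + [level[-1]]
            sibling = index ^ 1                 # sibling of an even/odd position
            proof.append((level[sibling], index % 2 == 0))
            index //= 2
        return proof

    def verify(leaf, proof, root):
        node = h(leaf)
        for sibling, leaf_is_left in proof:
            node = h(node + sibling) if leaf_is_left else h(sibling + node)
        return node == root

    txs = [f"tx-{i}".encode() for i in range(6)]
    levels = build_tree(txs)
    root = levels[-1][0]
    proof = prove(levels, 3)
    print(verify(txs[3], proof, root))          # True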