The proliferation of fake news and its detrimental impact have spurred significant research into fake news detection. Existing studies have primarily focused on classifying news content under a broad definition, for example as real or fake, or likely or unlikely, using machine learning algorithms and natural language processing techniques. However, this classification approach fails to account for the diverse nature of fake news, which can vary in its degree of falseness. Moreover, relying on textual features alone overlooks the multimedia elements often involved in fake news dissemination, such as images and videos. The emphasis on individual instances of fake news also neglects the broader dynamics of information diffusion and contextual factors. To address these limitations, we propose adopting the problem-understanding phase of data mining processes for formulating the fake news detection problem. This phase involves a comprehensive understanding of the characteristics of fake news, the credibility indicators across various modalities, and the social dynamics that shape its spread. This paper provides the literature background, explores problem understanding for fake news detection, discusses the challenges in achieving optimal detection solutions, and highlights research opportunities for future work in this critical area.
The world revolves around data. Storing data is therefore highly important, but storing it as-is requires a great deal of space, which brings up the role of data compression: it becomes necessary to compress the data before storing it, and compressing the data without losing any information is essential. Data compression can be lossy or lossless, and lossless compression should always be used for textual data. This project compresses textual data using two lossless compression techniques, Canonical Huffman Encoding and Golomb-Rice Encoding [1]. 64-bit input data is compressed using both algorithms, implemented in Verilog HDL, and the results are analyzed to determine which algorithm encodes the textual data in the fewest bits. The efficiency of the algorithms is analyzed using parameters such as compression ratio, area, power consumption, and delay. The better of the two algorithms is identified and implemented on an FPGA, and clock gating is added to that design to reduce power consumption. Data compression reduces the hardware cost drastically [2].
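The paper's designs are in Verilog HDL, but the Golomb-Rice bitstream itself can be sketched in a few lines. The following is a minimal software model of the encoding (unary quotient, stop bit, fixed-width remainder), not the paper's hardware implementation; the choice of parameter k is an assumption of the example.

```python
def rice_encode(n, k):
    """Golomb-Rice code of a non-negative integer n with parameter k
    (divisor M = 2**k): unary-coded quotient, a '0' stop bit, then the
    remainder as a k-bit binary field."""
    q, r = n >> k, n & ((1 << k) - 1)
    bits = "1" * q + "0"                 # quotient in unary, terminated by '0'
    if k:
        bits += format(r, "b").zfill(k)  # k-bit remainder
    return bits

# Example: 9 with k=2 -> quotient 2, remainder 1 -> "110" + "01"
code = rice_encode(9, 2)
```

Small values produce short codes, which is why the method suits text with skewed symbol statistics; a fixed 8-bit code would spend 8 bits where `rice_encode(9, 2)` spends 5.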
To solve the problem that an existing detection mechanism relying only on fall sensors easily causes the elderly care system to receive false fall help signals, an intelligent decision-making mechanism based on a fuzzy neural network is proposed. After receiving the fall help signal reported by the personal tag, the mechanism first reads the sensor data collected by the tag at that moment, then standardizes the sensor data, and finally inputs the standardized data into the fuzzy neural network model to analyze and judge the authenticity of the fall help signal. The results show that the intelligent decision-making mechanism based on the fuzzy neural network can effectively eliminate false fall help signals caused by misjudgments of the fall sensor, and the judgment accuracy for fall help signals reaches 97.5%.
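The abstract does not specify how the sensor data are standardized before entering the fuzzy neural network; a common choice, shown here purely as an assumed stand-in, is per-channel z-score standardization:

```python
from statistics import mean, pstdev

def standardize(samples):
    """Z-score standardization of one sensor channel: subtract the mean
    and divide by the (population) standard deviation, so all channels
    enter the network on a comparable scale. This is an assumed
    normalization, not one stated in the paper."""
    mu, sigma = mean(samples), pstdev(samples)
    return [(x - mu) / sigma for x in samples]

# e.g. accelerometer magnitudes from the tag at the reported moment
z = standardize([9.8, 9.7, 25.3, 9.9])
```

After standardization the transient spike stands out as a large positive z-score, which is the kind of feature a downstream model can weigh against the raw sensor's binary fall flag.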
The recognition of bird calls has been a challenging task in the field of bioacoustics. With the advancement of machine learning algorithms, automatic bird call recognition has become an active research area. In this paper, acoustic feature selection is performed, involving the extraction of relevant features from audio recordings of bird calls, and different classification algorithms are then explored for recognizing the calls. The approach is evaluated on a dataset consisting of recordings from multiple bird species and compared with other state-of-the-art machine learning algorithms. The data are classified into four classes (Astfly, Bulori, Warvir, and Woothr), all of which are found in the depths of the Amazon Rainforest. This research has the potential to contribute to the conservation of bird species and their habitats by enabling efficient monitoring of bird populations.
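To make the feature-then-classify pipeline concrete, here is a toy nearest-centroid classifier over pre-extracted acoustic feature vectors (e.g. per-recording MFCC means). The feature values and the classifier choice are illustrative assumptions, not the paper's actual features or algorithms:

```python
import math
from collections import defaultdict

CLASSES = ["Astfly", "Bulori", "Warvir", "Woothr"]

def centroids(train):
    """train: list of (feature_vector, label) pairs.
    Returns the mean feature vector per class."""
    sums, counts = {}, defaultdict(int)
    for vec, label in train:
        if label not in sums:
            sums[label] = [0.0] * len(vec)
        sums[label] = [s + v for s, v in zip(sums[label], vec)]
        counts[label] += 1
    return {c: [s / counts[c] for s in sums[c]] for c in sums}

def classify(vec, cents):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    return min(cents, key=lambda c: math.dist(vec, cents[c]))
```

A real system would replace the hand-made vectors with features extracted from audio, and the nearest-centroid rule with one of the stronger classifiers the paper compares.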
Distributed optical fiber sensing technology can achieve distributed measurement of temperature and strain at any position along buried pipelines, but the post-processing of the data is quite cumbersome. To analyze pipeline structural-state monitoring data more efficiently and reasonably, a buried-pipeline structural-state monitoring and data analysis system based on a Matlab GUI has been developed, which provides an intelligent and visual operating environment for the monitoring data-processing workflow. Practical application on a gas pipeline in a certain area shows that the system can effectively analyze and process monitoring data, evaluate the structural state of the pipeline, and greatly improve the efficiency of monitoring-data analysis, providing technical support for the processing and analysis of pipeline structural-state monitoring data.
By bringing digital technology and vast volumes of data together, the smart city has become an emerging paradigm for intelligent city management and operation. As one of the most popular artificial intelligence algorithms,...
This paper shows the distinction between quantum and classical machine learning techniques applied to a diabetes mellitus dataset. Diabetes mellitus is a group of diseases that affect the body and disrupt blood sugar regulation (insulin). In our bodies, glucose is an important source of energy, powering the mitochondria that keep muscles and tissues functioning. We use several machine learning classifiers, such as Support Vector Machine, Kernel Principal Component Analysis, Bayesian Network, and Decision Tree. These models can make predictions on a certain amount of data, but on large data they exhibit a significant amount of error and a low accuracy rate. Quantum machine learning (QML) is the combination of machine learning with quantum algorithms; machine learning algorithms executed on a quantum computer to analyze classical data are known as quantum-enhanced machine learning. With this method we predict diabetes mellitus using quantum machine learning algorithms, which are able to handle large data with a high accuracy rate of 97% and an F1-score of 68%. Our study implements both predictive and enhanced models.
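Since the abstract reports both accuracy (97%) and F1-score (68%), it may help to recall how the two metrics are computed; they can diverge sharply on imbalanced data such as disease datasets. The labels in the example are made up for illustration:

```python
def accuracy_and_f1(y_true, y_pred, positive=1):
    """Accuracy and binary F1-score from true and predicted labels."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    acc = sum(t == p for t, p in pairs) / len(pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return acc, f1
```

On a dataset where most patients are negative, a classifier can reach high accuracy while missing many positive cases, which is exactly the accuracy/F1 gap the reported 97%/68% figures suggest.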
Advancements in computing technologies such as the internet of things, cloud computing, and artificial intelligence are driving manufacturing industries to adopt and implement automation in production. One of the key technologies, and a preferred method for increasing productivity, is implementing prediction models or machine learning (ML) algorithms in production. This article aims to provide a comprehensive review of AI implementation in the machining of materials and to present a methodology for prediction model development. The characteristics of the experimental data and the key attributes in model development are presented and discussed with a case study.
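The prediction-model workflow the review describes (fit a model on experimental data, then predict an unseen operating point) can be sketched in miniature. The single-feature linear fit and the toy cutting-speed/tool-wear numbers below are illustrative assumptions, not data or a model from the article:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b to paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx          # (slope a, intercept b)

speed = [100, 150, 200, 250]               # cutting speed (m/min), toy values
wear = [0.10, 0.15, 0.20, 0.25]            # measured tool wear (mm), toy values
a, b = fit_line(speed, wear)
predict = lambda x: a * x + b              # predicted wear at an unseen speed
```

Real machining models typically use several attributes (feed rate, depth of cut, material hardness) and nonlinear learners, but the develop-fit-predict structure is the same.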
In view of the high-speed data transmission requirements of the new generation of power line carrier communication for distributed photovoltaic information access, data compressed sensing needs to be improved. In this paper, we study an AIC-based compressed sensing method for distributed photovoltaic data. Based on the adaptive characteristics of the K-singular value decomposition (K-SVD) algorithm, data sparsity is dynamically estimated to realize adaptive dynamic adjustment of data acquisition. The AIC algorithm is used to reconstruct the data, which greatly improves reconstruction accuracy. Information aggregation based on compressed sensing is constructed to achieve highly efficient data aggregation at low sampling rates. Simulation results show the superior performance of the proposed algorithm in data transmission volume and energy consumption.
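The abstract does not detail the AIC reconstruction step; as a stand-in, here is a standard greedy compressed sensing reconstruction, Orthogonal Matching Pursuit (OMP), which recovers a k-sparse signal from linear measurements y = A x. This is a generic textbook algorithm, not the paper's method:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily pick the dictionary column
    most correlated with the residual, re-fit by least squares on the
    selected support, and repeat k times."""
    residual = y.astype(float)
    support = []
    x_hat = np.zeros(A.shape[1])
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # best-matching atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # update residual
    x_hat[support] = coef
    return x_hat
```

Adaptively estimating the sparsity k, as the paper does via K-SVD, determines how many iterations such a reconstruction needs and hence how few samples must be transmitted.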
This paper studies a data-processing method, system, device, and computer-readable storage medium. The method includes: a user terminal sends a query request to a data storage blockchain; the data storage blockchain queries the target information corresponding to the request based on a smart contract; the user terminal displays the target information and sends a transaction request to the corresponding merchant terminal based on the smart contract. The merchant terminal executes the transaction operation corresponding to the transaction request, and when the transaction operation is completed, it issues incentive points to the account associated with the transaction request. This approach realizes transaction operations in an e-commerce platform system through blockchain technology and stores commodity data (the target information) on the data storage blockchain. It also ensures the security of the various data in the e-commerce platform system, avoids leakage of private data, and improves the platform's data credibility and data security.
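The settle-then-issue-points step of the flow above can be sketched as follows. The class, method names, and points rate are illustrative assumptions; in the described system these steps would run as smart-contract calls on the blockchain rather than in an off-chain object:

```python
class PointsLedger:
    """Toy model of the incentive-points issuance that follows a
    completed transaction."""

    def __init__(self, rate=0.01):
        self.rate = rate       # incentive points per currency unit (assumed)
        self.points = {}       # account -> accumulated points

    def settle(self, account, amount):
        """Complete a transaction of `amount`, then credit incentive
        points to the buyer's account and return the points earned."""
        earned = amount * self.rate
        self.points[account] = self.points.get(account, 0.0) + earned
        return earned
```

Keeping the point balance on-chain, as the paper proposes, is what makes the issuance auditable and tamper-resistant; the sketch only shows the bookkeeping rule itself.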