The PeRConAI workshop aims to promote the circulation of new ideas and research directions on pervasive and resource-constrained artificial intelligence, serving as a forum for practitioners and researchers working at the intersection of pervasive computing and machine learning, including deep learning and artificial intelligence. The workshop welcomes theoretical and applied research sharing the common objective of investigating solutions that advance towards a truly pervasive and liquid AI. In the long term, we envision a future where every device at the edge of the Internet, regardless of its computing capabilities, will play an active role in the AI process by processing its own data and collaborating with other devices to extract knowledge from them.
Meaning is the foundation stone of intercultural communication. Languages are continuously changing, and words shift their meanings for various reasons. Semantic divergence in related languages is a key concern of his...
The smart grid faces numerous challenges in continually monitoring participant usage, bidirectional data flow, and utility costs as a result of the rising use of renewable energy sources. To achieve a robust, dependable smart grid, specialists must detect system failures before they occur and ascertain whether the grid is stable. Conventionally, data on consumer demand is gathered and centrally evaluated against current supply conditions, and consumers are then given the resulting recommended pricing information so they can decide how much to use. Because this entire traditional process is time-sensitive, dynamically evaluating grid stability becomes not only a concern but an absolute necessity. Power supply and demand need to be balanced in order to maintain grid stability. Thus, machine learning can significantly aid in classifying whether or not the smart grid is stable by applying contemporary data mining approaches. In this research, different machine learning algorithms are applied to assess and predict grid stability: the XGBoost classifier, Random Forest, Neural Network, KNN, and SVM. The experimental results show that the XGBoost classifier achieved the highest performance, yielding classification accuracy, AUC, F1, and recall of 97.79%, 96.7%, 95.6%, 95.4%, 94.5%, respectively.
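A minimal sketch of the kind of pipeline this abstract describes: training a gradient-boosted classifier on tabular grid measurements and reporting the same metrics. The file name, column names, and hyperparameters are assumptions for illustration, not details from the paper.

```python
# Hedged sketch: binary grid-stability classification with XGBoost.
# "grid_measurements.csv" and the "stability" label column are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score, f1_score, recall_score
from xgboost import XGBClassifier

df = pd.read_csv("grid_measurements.csv")
X = df.drop(columns=["stability"])
y = (df["stability"] == "unstable").astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1,
                    eval_metric="logloss")
clf.fit(X_train, y_train)

proba = clf.predict_proba(X_test)[:, 1]
pred = (proba >= 0.5).astype(int)
print("accuracy:", accuracy_score(y_test, pred))
print("AUC:     ", roc_auc_score(y_test, proba))
print("F1:      ", f1_score(y_test, pred))
print("recall:  ", recall_score(y_test, pred))
```

The other algorithms named in the abstract (Random Forest, neural network, KNN, SVM) can be swapped into the same train/evaluate loop for comparison.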
This paper offers an approach for optimizing real-time data analysis using a variety of machine learning algorithms. The algorithms work by creating predictive models that extract useful information from datasets, applying appropriate preprocessing techniques to ensure the quality of the data, and modeling the information to derive meaningful features from the data. The predictive models are then used to discover relevant trends, find patterns, and summarize the data for easy analysis. Specifically, this paper outlines the technique of using supervised and unsupervised learning algorithms, including Random Forest, Naïve Bayes, and k-means clustering, to extract meaningful patterns and relationships from data points in real time. Additionally, the paper discusses several potential applications of the proposed technique, including fraud detection and prediction of stock prices. Such potential applications demonstrate the value of real-time data analysis for businesses and organizations.
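A hedged sketch of such a pipeline: batches of incoming records are scaled, clustered incrementally for unsupervised pattern discovery, and used to train the supervised models named in the abstract. The batch interface and model settings are assumptions, not the paper's implementation.

```python
# Illustrative batch-wise pipeline combining supervised and unsupervised models.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.cluster import MiniBatchKMeans

scaler = StandardScaler()
rf = RandomForestClassifier(n_estimators=200, random_state=0)
nb = GaussianNB()
kmeans = MiniBatchKMeans(n_clusters=5, random_state=0)  # incremental k-means for streaming batches

def process_batch(X_batch, y_batch=None):
    """Preprocess one batch of records, update the clusterer, and (re)train classifiers if labels arrive."""
    Xs = scaler.fit_transform(X_batch)      # in practice the scaler would be fit once and reused
    kmeans.partial_fit(Xs)                  # unsupervised pattern discovery on the stream
    if y_batch is not None:
        rf.fit(Xs, y_batch)                 # supervised models for e.g. fraud detection
        nb.fit(Xs, y_batch)
    return kmeans.predict(Xs)               # cluster ids summarizing the batch
```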
Smart devices can now communicate over short and long distances with one another and with the Internet or the cloud. The Internet of Things (IoT) brings a paradigm shift by coupling low-resource IoT smart systems with cloud computing. By employing cloud computing, resource-constrained IoT smart devices gain a number of advantages, offloading the burden of processing and storing data to the network cloud. However, implementing these functions at the network edge offers more merits than using the network cloud for IoT applications that need high data rates, mobility, and latency-sensitive real-time data processing. This paper mainly focuses on smart data transfer between the cloud and IoT devices. It suggests an authenticated search method to look for required information among one's personal or shared data on storage. Finally, by evaluating the processing-time performance of the suggested scheme, the outcomes discussed in the paper show that the strategy has a chance of working well in IoT applications.
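To make the idea of an authenticated search over outsourced data concrete, here is a generic sketch in which keywords are indexed under HMAC tags derived from a user key, so the storage side can match queries without seeing plaintext keywords. This is a standard textbook-style construction for illustration only, not the scheme proposed in the paper.

```python
# Hedged sketch of keyed-tag keyword search over outsourced documents.
import hmac, hashlib

def keyword_tag(key: bytes, keyword: str) -> str:
    """Derive an opaque tag for a keyword using a secret key."""
    return hmac.new(key, keyword.lower().encode(), hashlib.sha256).hexdigest()

def build_index(key: bytes, documents: dict) -> dict:
    """Map keyword tags to the ids of documents containing them (done client-side)."""
    index = {}
    for doc_id, keywords in documents.items():
        for kw in keywords:
            index.setdefault(keyword_tag(key, kw), set()).add(doc_id)
    return index

def search(index: dict, key: bytes, keyword: str) -> set:
    """The client derives the tag locally; the storage node only ever sees tags."""
    return index.get(keyword_tag(key, keyword), set())

key = b"device-shared-secret"  # hypothetical pre-shared key
idx = build_index(key, {"doc1": ["temperature", "humidity"], "doc2": ["humidity"]})
print(search(idx, key, "humidity"))  # {'doc1', 'doc2'}
```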
Nanobodies, also known as single-domain or VHH antibodies, are recombinant variable domains derived from heavy-chain-only antibodies. They exhibit desirable characteristics, including small size, high solubility, exceptional stability, rapid blood clearance, and deep tissue penetration, rendering them valuable tools for disease diagnosis and treatment. In recent years, several deep-learning-based methods for protein structure prediction have been developed, requiring only protein sequences as input. Notable examples include AlphaFold2, RoseTTAFold, DeepAb, NanoNet, and tFold, which have demonstrated remarkable performance in protein or antibody/nanobody prediction. In this study, we analyzed 60 nanobody samples with known experimental 3D structures from the Protein Data Bank (PDB). The accuracy of these algorithms was assessed using two metrics: RMSD and TM-score. Our findings revealed that NanoNet and tFold, particularly NanoNet, exhibit outstanding performance.
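For readers unfamiliar with the two metrics, the following sketch spells out their definitions on pre-aligned C-alpha coordinate arrays (shape N x 3). Real evaluations typically use dedicated tools such as TM-align; this is only to make the formulas concrete and is not the study's evaluation code.

```python
# Hedged sketch of RMSD and TM-score on matched, superimposed coordinates.
import numpy as np

def rmsd(pred: np.ndarray, ref: np.ndarray) -> float:
    """Root-mean-square deviation between matched atoms after superposition."""
    return float(np.sqrt(np.mean(np.sum((pred - ref) ** 2, axis=1))))

def tm_score(pred: np.ndarray, ref: np.ndarray) -> float:
    """TM-score of a fully aligned model against a reference of length L."""
    L = ref.shape[0]
    d0 = 1.24 * (L - 15) ** (1.0 / 3.0) - 1.8 if L > 21 else 0.5  # standard length scaling
    d = np.linalg.norm(pred - ref, axis=1)
    return float(np.mean(1.0 / (1.0 + (d / d0) ** 2)))
```

Lower RMSD and higher TM-score (closer to 1.0) indicate a predicted structure closer to the experimental one.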
This contribution presents an overview of engineering techniques to collect human behavior data, highlighting the main research trends and challenges. Our research shows that wearable and smartphone sensors are popular for monitoring movement and vital signs, although their accuracy can be influenced by environmental factors. Combining multiple sensors improves data accuracy, while machine learning algorithms enable pattern detection and behavior analysis. Non-invasive techniques, such as video monitoring and speech analysis, offer a comprehensive view of behavior. Despite these advancements, challenges remain that require further research, including the need to enhance sensor accuracy, develop sensor fusion methods, and refine machine learning algorithms for improved data analysis in human activity monitoring.
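A small illustration of the sensor-fusion idea mentioned above: features from two wearable signals are concatenated and fed to a classifier for activity recognition. The sensor choices, window size, and activity labels are assumptions for the sketch, not results from the surveyed studies.

```python
# Hedged sketch of feature-level sensor fusion for activity recognition.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signal: np.ndarray) -> np.ndarray:
    """Simple per-window statistics (mean, std, min, max) over each channel."""
    return np.concatenate([signal.mean(0), signal.std(0), signal.min(0), signal.max(0)])

def fuse(accel_window: np.ndarray, hr_window: np.ndarray) -> np.ndarray:
    """Feature-level fusion: concatenate features from accelerometer and heart-rate windows."""
    return np.concatenate([window_features(accel_window), window_features(hr_window)])

# Hypothetical training data: 50 windows of 3-axis acceleration and heart rate, with labels.
X = np.array([fuse(np.random.randn(128, 3), np.random.randn(128, 1)) for _ in range(50)])
y = np.random.choice(["walking", "sitting", "running"], size=50)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```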
With the development of mobile Internet technology, the technology in the field of medical image processing is constantly updated and iterated. Digital watermarking technology plays an important role in the field of m...
With the increasing growth of free and open data from the new generation of Earth-Observation (EO) satellites, Earth Observation has entered the Big Data era, and traditional data management methods face many challenges in keeping up with the growth rate of the data. The Open Data Cube (ODC) is an open-source geospatial data management and analysis tool designed to enable users to process, analyze, and visualize large amounts of satellite imagery and climate data. However, most ODCs are currently deployed independently, and due to difficulties with data transfer and the restrictions that managing domains place on algorithms, it is not easy for them to communicate with each other. This article proposes a Data Hub architecture for interconnecting multiple ODCs. The Data Hub enables data interconnection and algorithm interoperation between ODCs and supports the organization of workflows to build remote sensing applications based on multiple ODCs. To demonstrate the working mechanism of multiple-ODC interconnection, we organized an application that runs the NDVI algorithm on Shennongjia data with the architecture above.
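A minimal sketch of the NDVI step in such a workflow, assuming the imagery has already been loaded from an ODC instance as an xarray Dataset with "red" and "nir" measurement bands; product and band names vary by deployment and are placeholders here.

```python
# Hedged sketch: per-pixel NDVI on ODC-style xarray data.
import xarray as xr

def compute_ndvi(ds: xr.Dataset) -> xr.DataArray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel and time step."""
    red = ds["red"].astype("float32")
    nir = ds["nir"].astype("float32")
    return ((nir - red) / (nir + red)).rename("ndvi")

# Example use with an Open Data Cube client (product, bands, and extent are placeholders):
# import datacube
# dc = datacube.Datacube(app="ndvi_demo")
# ds = dc.load(product="landsat8_sr", measurements=["red", "nir"],
#              latitude=(31.4, 31.6), longitude=(110.2, 110.4))
# ndvi = compute_ndvi(ds)
```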
In today's realm of machine learning, the non-linearity of data is a problem often faced during data analysis. A well-known supervised learning algorithm known to handle this problem efficiently is the Support Vector Machine. The trick deployed in the background to overcome the problem of non-linearity is the transformation of data from the existing dimension to a higher dimension. To make this conversion of dimensions possible, kernel functions are utilized. These kernel functions are simply mathematical functions that have been available in the literature for a long time. The present study briefly describes the mathematical background of kernel functions with examples and gives a quick summarization of recent applications of the renowned kernel functions across various domains. Further, the numerous benefits, a few limitations of kernel functions, and future directions are also highlighted in the work.
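A short illustration of the kernel trick described above: a linear SVM struggles on data that is not linearly separable, while an RBF-kernel SVM handles it by implicitly mapping points to a higher-dimensional space. The dataset and parameters are generic examples, not drawn from the surveyed applications.

```python
# Hedged sketch: linear vs. RBF kernel on a non-linearly-separable dataset.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=500, factor=0.3, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

linear_svm = SVC(kernel="linear").fit(X_tr, y_tr)
rbf_svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)  # K(x, z) = exp(-gamma * ||x - z||^2)

print("linear kernel accuracy:", linear_svm.score(X_te, y_te))  # near chance on concentric circles
print("RBF kernel accuracy:   ", rbf_svm.score(X_te, y_te))     # close to 1.0
```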