The artificial neural network (ANN) is widely used for pattern recognition and has been applied to tasks such as prediction, classification, and engineering problems. However, the model faces challenges such as local minima and slow convergence, which have been addressed through strategies such as combining the ANN with optimisation methods like the cuckoo search (CS) algorithm. For large datasets, however, the hybrid CS-based ANN can lead to overfitting. To overcome this issue, the authors propose a new algorithm called Principal Component Analysis with Cuckoo Search Neural Network (PCACSNN). Its performance is compared with commonly used algorithms such as the ANN, the backpropagation neural network (BPNN), and cuckoo search backpropagation (CSBP), using mean square error (MSE) and accuracy on classification problems. The simulations were performed on the Student Performance dataset from the UCI Machine Learning Repository (UCIMLR). The results show that the proposed model performs better than the other models, achieving high accuracy and low MSE for both the mathematics and Portuguese student datasets. For the mathematics students, the model attained 99.32% accuracy with an MSE of 2.77E-07 for 70% training data and 98.52% accuracy with an MSE of 2.50E-04 for 30% training data. Similarly, for the Portuguese student dataset, it obtained 99.38% accuracy with an MSE of 1.09E-08 for 70% training data and 98.72% accuracy with an MSE of 1.01E-04 for 30% training data.
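The abstract's key preprocessing idea is reducing input dimensionality with PCA before the network is trained, which is what lets the hybrid model avoid overfitting on large datasets. A minimal sketch of that PCA step, using SVD on synthetic data (not the authors' PCACSNN implementation; the data and component count are illustrative assumptions):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X onto its top principal components (illustrative sketch)."""
    Xc = X - X.mean(axis=0)               # center each feature
    # SVD of the centered data; rows of Vt are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T       # scores in the reduced space

# Toy data: 100 samples, 10 features driven by 3 latent factors
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 3))
X = base @ rng.normal(size=(3, 10)) + 0.01 * rng.normal(size=(100, 10))

Z = pca_reduce(X, 3)
print(Z.shape)  # (100, 3)
```

In the paper's pipeline, the reduced matrix `Z` would then be fed to a neural network whose weights are tuned by cuckoo search rather than by backpropagation alone.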
The digital elevation model (DEM) is a critical data source for a variety of applications such as road extraction, hydrological modeling, flood mapping, and many geospatial studies. The usage of high-resolution DEMs as inpu...
Nowadays, screen-shooting resilient watermarking remains a predictive and challenging area of research for proactive data protection in consumer electronic applications. Present deep learning-based methodologi...
Named Data Networking (NDN) is an emerging networking approach that centres on information, in contrast to TCP/IP, which centres on hosts. This study proposes a solution to the growing complexity of computer networks, where fundamental problems persist, one of them being the difficulty of reaching a node as data traffic piles up. One solution is load balancing: a system that redirects packets to different service providers to ease the load and reduce the number of packets dropped due to timeouts. This paper analyses the impact of load balancing in an extreme-case emulated environment using Mini-NDN and a random load-balancing strategy. The results show that applying random load balancing to the consumer nodes decreases, and in some cases eliminates, dropped packets. In addition, load balancing stabilises packets' round-trip time (RTT) but increases the average RTT compared with best-route forwarding in an unloaded environment.
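The random strategy described above amounts to forwarding each Interest out of a uniformly chosen upstream face, so load spreads evenly over providers in expectation. A minimal sketch of that selection logic (the face names and request count are hypothetical, not taken from the paper's Mini-NDN setup):

```python
import random
from collections import Counter

rng = random.Random(42)                  # fixed seed for a reproducible demo
faces = ["faceA", "faceB", "faceC"]      # hypothetical upstream faces

def random_strategy(faces):
    """Forward each Interest out a uniformly random upstream face."""
    return rng.choice(faces)

# Simulate 9000 Interests; each face should receive roughly a third
sent = Counter(random_strategy(faces) for _ in range(9000))
print(dict(sent))
```

The even spread is what relieves any single provider under heavy traffic, at the cost of sometimes choosing a slower path than best-route forwarding would, which matches the reported increase in average RTT.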
This paper discusses an automated robot car using artificial intelligence: training its neural network using the AlexNet model, using YOLO (the "you only look once" algorithm) for the object-detection phase, and for...
As the Internet of Things (IoT) landscape grows, with estimates exceeding 75 billion devices by 2025, effective data management and processing become primary challenges. Traditional cloud-centric models may struggle under this large data volume. This research presents Edge AI as an innovative solution, integrating artificial intelligence directly at data sources like sensors and cameras. This ensures real-time analytics and decision-making, promoting responsive and tailored actions. Our literature review details Edge AI's distinct characteristics and applications. The interaction of Edge AI with large-scale IoT domains is critically examined, emphasizing their combined potential. Within big data infrastructures, a comparative study contrasts Edge AI and cloud-based AI, investigating processing speeds, optimization techniques, and essential metrics. The inherent limitations of Edge AI and current challenges are also discussed. In summary, Edge AI offers notable improvements in operational efficiency, data privacy, and bandwidth use. As IoT continues its rapid expansion, the strategic deployment of Edge AI becomes crucial, leading to a future where data is not just collected but smartly utilized.
As information technology and the Internet have progressed rapidly in people's lives, the privacy of information has become an important issue due to the accessibility of data. Therefore, to enhance information security...
Deep learning has been used in many applications where patterns extracted from past training data can be used to predict future outcomes. Deep learning is characterized by training and testing data with the identical input ...
ISBN: 9781665494113 (digital)
ISBN: 9781665494120 (print)
The gradient-based optimizer (GBO) is one of the most promising metaheuristic algorithms and has proved its efficiency in various fields. GBO combines two major search mechanisms: population-based search and the gradient-based Newton method. It therefore has a strong global-search ability, but it struggles with local search. In this paper, a new variant is introduced that integrates the simulated annealing (SA) method into GBO (GBOSA) to enhance its local-search technique. The proposed GBOSA is compared with various popular algorithms and improved variants on a set of real-world engineering problems. The experimental results show that GBOSA outperforms the other algorithms in the literature.
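The local-search enhancement described above rests on SA's Metropolis acceptance rule: improving moves are always kept, while worsening moves are kept with a probability that shrinks as the temperature cools, which lets the search escape local minima. A minimal one-dimensional sketch of that rule under a geometric cooling schedule (the test function, step size, and schedule are illustrative assumptions, not the paper's GBOSA):

```python
import math
import random

def sa_accept(f_cur, f_new, temp, rng):
    """Metropolis rule: always accept improvements; accept worse
    candidates with probability exp(-(f_new - f_cur) / temp)."""
    if f_new <= f_cur:
        return True
    return rng.random() < math.exp(-(f_new - f_cur) / temp)

def sa_local_search(f, x, step=0.5, temp=1.0, cooling=0.95, iters=200, seed=1):
    rng = random.Random(seed)
    best = x
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # perturb current solution
        if sa_accept(f(x), f(cand), temp, rng):
            x = cand                          # move, possibly uphill
        if f(x) < f(best):
            best = x                          # track best-so-far
        temp *= cooling                       # geometric cooling schedule
    return best

# Toy objective with two minima; start far from both
f = lambda x: (x**2 - 1)**2 + 0.3 * x
x_star = sa_local_search(f, 2.0)
print(round(f(x_star), 3))
```

In a GBO-style hybrid, this acceptance test would be applied to candidates produced by the population and Newton-based update rules rather than to random one-dimensional perturbations.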