During the service life of offshore jacket platforms, the harsh marine environment leads to severe structural corrosion and damage, necessitating structural health monitoring. Ensuring the accuracy of the numerical finite element model (FEM) in turn requires model updating. This study introduces an improved DNN-OOA model updating method that incorporates actual structural responses into the optimization objective function and accounts for non-uniform corrosion of the structure. Pyansys is used to automatically generate large-scale datasets, simplifying the simulation process. An accurate and responsive surrogate model is built with an improved deep neural network (DNN), and the optimal values of the parameters to be corrected are sought with the osprey optimization algorithm (OOA), completing the FEM updating. The main innovation of this study lies in incorporating the non-uniform corrosion caused by the real marine physical environment into the model updating process; this phenomenon is used to determine the updating range for different structural members. Furthermore, the parameters subject to updating include structural damage to the members and changes in the upper mass. Incorporating the structural response under static loading into the optimization objective function allows a more comprehensive reflection of the structure's dynamic and static behavior and addresses the regression confusion problem that arises when updating with modal frequencies alone. Experimental results demonstrate that the proposed improved DNN-OOA model updating method effectively eliminates inaccuracies in simulated structural responses and mitigates the local optimum problem inherent in pure modal frequency updating. In the updated scaled jacket platform FEM, the maximum relative error of the modal frequencies is reduced to 2.624%, and the maximum error in structural response is reduced to 3.510%. This approach provides a more accurate and reliable FEM for the maintenance of offshore jacket platforms.
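As a hedged sketch of how the OOA search could be wired to a surrogate objective mixing modal-frequency and static-response errors, the snippet below follows the two commonly published OOA update phases; the surrogate, the "measured" values, and the parameter bounds are illustrative stand-ins, not the paper's trained DNN or measurement data.

```python
import numpy as np

# Hypothetical "measured" responses of the scaled platform (illustrative values).
measured_freq = np.array([2.1, 5.8, 9.4])     # modal frequencies (Hz)
measured_disp = np.array([0.012, 0.021])      # static displacements (m)

def surrogate(theta):
    """Toy polynomial surrogate standing in for the trained DNN."""
    freq = np.array([2.0, 5.5, 9.0]) * (1.0 + 0.1 * theta[0])
    disp = np.array([0.011, 0.020]) * (1.0 + 0.2 * theta[1])
    return freq, disp

def objective(theta):
    """Relative-error objective combining modal and static terms."""
    freq, disp = surrogate(theta)
    return (np.sum(((freq - measured_freq) / measured_freq) ** 2)
            + np.sum(((disp - measured_disp) / measured_disp) ** 2))

def ooa(obj, lb, ub, pop=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = lb + rng.random((pop, dim)) * (ub - lb)
    fit = np.apply_along_axis(obj, 1, X)
    for t in range(1, iters + 1):
        best = X[np.argmin(fit)]
        for i in range(pop):
            # Phase 1 (exploration): move toward a randomly chosen better solution ("fish").
            better = X[fit < fit[i]]
            fish = better[rng.integers(len(better))] if len(better) else best
            I = rng.integers(1, 3)
            cand = np.clip(X[i] + rng.random(dim) * (fish - I * X[i]), lb, ub)
            if (f := obj(cand)) < fit[i]:
                X[i], fit[i] = cand, f
            # Phase 2 (exploitation): local move that shrinks with the iteration counter.
            cand = np.clip(X[i] + (lb + rng.random(dim) * (ub - lb)) / t, lb, ub)
            if (f := obj(cand)) < fit[i]:
                X[i], fit[i] = cand, f
    return X[np.argmin(fit)], fit.min()

theta_best, err = ooa(objective, lb=np.array([-1.0, -1.0]), ub=np.array([1.0, 1.0]))
print(theta_best, err)
```

In the actual method the surrogate would be the improved DNN and the bounds would come from the corrosion-informed updating ranges of the individual members.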
Crowd density detection in smart video surveillance involves advanced computer vision (CV) techniques to improve the efficiency and accuracy of crowd monitoring. The system assists in detecting and analyzing crowd density in real time by utilizing artificial intelligence and machine learning (ML) models on surveillance videos. It detects crowded areas, manages crowd flow, and combines automated analysis with human oversight for improved public safety and early intervention. Explainable Artificial Intelligence (XAI) improves the interpretability and transparency of crowd management methods; incorporating XAI models provides clear, understandable insights into predictions, ensuring more actionable and reliable crowd management. This study proposes an osprey optimization algorithm with Deep Learning Assisted Crowd Density Detection and Classification (OOADL-CDDC) technique for smart video surveillance systems. The aim of the OOADL-CDDC technique is to enable automated and efficient detection of distinct kinds of crowd densities. To achieve this, the OOADL-CDDC technique first applies a bilateral filtering (BF) approach for noise removal. It then employs an advanced DL method, the SE-DenseNet model, for feature extraction, while hyperparameter selection is performed using the OOA. Finally, the detection and classification of crowd density is accomplished with the attention bidirectional gated recurrent unit (ABiGRU) model. A series of experiments demonstrates the improved performance of the OOADL-CDDC method: its performance validation shows a superior accuracy of 98.30% over existing models across distinct measures.
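As a concrete illustration of the bilateral-filtering (BF) noise-removal step, the snippet below applies OpenCV's bilateral filter to a stand-in frame; the filter smooths noise while preserving edges, which is why it suits crowded scenes. The random frame and parameter values are illustrative, not taken from the paper.

```python
import cv2
import numpy as np

# Stand-in surveillance frame; a real pipeline would decode frames from the video stream.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
# Edge-preserving smoothing: d is the pixel neighborhood diameter, the sigmas
# control how strongly color similarity and spatial closeness are weighted.
denoised = cv2.bilateralFilter(frame, d=9, sigmaColor=75, sigmaSpace=75)
```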
To address the shortcomings of the dung beetle optimizer, such as low convergence precision and a tendency to fall into local optima, a multi-strategy improved dung beetle optimizer (IDBO) is proposed. Firstly, a Cubic chaos mapping strategy is introduced to enhance the diversity of the initial population; secondly, a global exploration strategy from the osprey optimization algorithm is incorporated, endowing the dung beetle algorithm with the ability to identify the best areas and escape from local optima, which preliminarily improves the convergence speed and optimization precision of the algorithm; finally, an adaptive t-distribution perturbation strategy is adopted to disturb the foraging behavior of the dung beetles, allowing the algorithm to further accelerate convergence while enhancing its global exploration and local exploitation capabilities. The effectiveness of the three improvement strategies is verified through testing and analysis on the CEC2021 and CEC2017 test functions, and a convergence analysis of the improved algorithm's optimization results is conducted against other algorithms. The Wilcoxon rank-sum test demonstrates that the IDBO algorithm has good convergence speed and optimization precision. Moreover, the IDBO algorithm is used to optimize the parameters of the HKELM prediction model and is applied to short-term photovoltaic power generation prediction in comparative simulation experiments. The experimental results show that, compared to the DBO-HKELM prediction model, the error metrics MAE and RMSE of IDBO-HKELM are reduced by 43.95% and 50.79%, respectively, further verifying the feasibility and effectiveness of the IDBO algorithm in solving practical application problems.
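A hedged sketch of two of the strategies named above: a Cubic chaotic map for population initialization and an adaptive t-distribution perturbation whose degrees of freedom grow with the iteration counter. The map coefficient, seed values, and bounds are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(42)

def cubic_chaos_init(pop, dim, lb, ub, rho=2.595):
    """Cubic chaotic map x_{k+1} = rho * x_k * (1 - x_k**2), mapped into [lb, ub].

    For rho = 2.595 the iterates stay in (0, 1), so each new iterate can be
    scaled directly to the search bounds; rho and the random seed are
    illustrative choices.
    """
    x = rng.uniform(0.1, 0.9, dim)          # chaotic seed per dimension
    pts = np.empty((pop, dim))
    for i in range(pop):
        x = rho * x * (1.0 - x ** 2)
        pts[i] = lb + x * (ub - lb)
    return pts

def t_perturb(position, iteration):
    """Adaptive t-distribution perturbation: degrees of freedom equal the
    iteration counter, so early steps are heavy-tailed (exploration) and late
    steps approach a Gaussian (exploitation)."""
    return position + position * rng.standard_t(df=iteration, size=position.shape)

pop = cubic_chaos_init(pop=30, dim=5, lb=np.full(5, -10.0), ub=np.full(5, 10.0))
perturbed = t_perturb(pop[0], iteration=3)
```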
A multi-level inverter (MLI) is essential to improve the performance and efficiency of the inverter and is widely accepted for high-power and high-voltage applications. Its performance is significantly superior due to higher DC link voltages, lower electromagnetic interference, and lower harmonic distortion. However, it has some drawbacks, such as voltage balancing problems, complex pulse width modulation methods, and an increased number of components. This paper investigates selective harmonic elimination in an optimization-based cascaded H-bridge (CHB) 27-level inverter for hybrid renewable energy sources. The method uses hybrid renewable energy sources such as solar photovoltaic, wind, and tidal turbines with a single maximum power point tracking (MPPT) scheme. A single MPPT with a Dwarf Mungo optimization (DMO) algorithm is implemented to obtain high performance from the hybrid sources simultaneously. The proposed inverter comprises ten semiconductor switches connected to three direct current (DC) sources. The osprey optimization algorithm (OOA) is used in the 27-level inverter to calculate the switching angles for selective harmonic elimination pulse width modulation (SHE-PWM). The proposed approach is implemented on a Simulink platform to assess performance, and its efficacy is demonstrated by contrasting it with an existing approach. The simulation obtained 1.98% THD using OOA with SHE-PWM. The total cost of the proposed method is $40.03, and the computational times for DMO and OOA are 1 s and 1.2 s, respectively, at the 100th iteration. Experimental validation is performed using a dSPACE RTI1104 controller, and the experimental results show 1.99% harmonic distortion with the same output voltage as the simulation.
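For the switching-angle search, a simplified SHE-PWM objective is sketched below for equal DC sources: the per-unit fundamental is pushed toward a target modulation index while selected low-order odd harmonics are driven to zero, and an optimizer such as the OOA would minimize this function over the angles. The harmonic set, penalty weight, and equal-source assumption are illustrative simplifications (the paper's 27-level CHB uses three unequal DC sources).

```python
import numpy as np

def she_objective(theta, m, harmonics=(5, 7, 11, 13)):
    """Penalty-style SHE-PWM objective for a staircase waveform with equal steps.

    theta: switching angles in radians, assumed sorted in (0, pi/2);
    m: target per-unit fundamental (modulation index);
    harmonics: low-order odd harmonics to suppress (illustrative set).
    """
    k = len(theta)
    fundamental = np.sum(np.cos(theta)) / k          # per-unit fundamental amplitude
    err = (fundamental - m) ** 2
    for n in harmonics:
        err += (np.sum(np.cos(n * theta)) / k) ** 2  # drive each target harmonic to zero
    # Enforce 0 < theta_1 < ... < theta_k < pi/2 with a simple penalty term.
    if np.any(np.diff(theta) <= 0) or theta[0] <= 0 or theta[-1] >= np.pi / 2:
        err += 1e3
    return err

theta0 = np.sort(np.random.uniform(0, np.pi / 2, 13))  # 13 angles for a 27-level waveform
print(she_objective(theta0, m=0.8))
```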
Colorectal cancer (CRC) is the second most common cancer in females and the third most common in males, with case numbers continuing to rise. Pathology diagnosis complemented with predictive and prognostic biomarker information is the first step toward personalized treatment. Histopathological image (HI) analysis is the benchmark for pathologists to grade colorectal cancer of various kinds. However, pathologists' diagnoses are highly subjective and susceptible to inaccuracies. The increased diagnostic load in the pathology laboratory, combined with the reported intra- and inter-observer variability in biomarker assessment, has prompted the quest for consistent machine-based techniques to be integrated into routine practice. Artificial intelligence (AI) has achieved extraordinary results in healthcare applications, and computer-aided diagnosis (CAD) based on HI has lately progressed rapidly with the rise of machine learning (ML) and deep learning (DL) models. This study introduces a novel Colorectal Cancer Diagnosis using the Optimal Deep Feature Fusion Approach on Biomedical Images (CCD-ODFFBI) method. The primary objective of the CCD-ODFFBI technique is to examine biomedical images to identify colorectal cancer (CRC). In the CCD-ODFFBI technique, a median filtering (MF) approach is initially utilized for noise elimination. The technique then fuses three DL models, MobileNet, SqueezeNet, and SE-ResNet, for feature extraction, with the DL models' hyperparameter selection performed using the osprey optimization algorithm (OOA). Finally, the deep belief network (DBN) model is employed to classify CRC. A series of simulations is performed to highlight the significant results of the CCD-ODFFBI method on the Warwick-QU dataset, where it showed a superior accuracy of 99.39% over existing techniques.
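A rough sketch of the pre-processing and deep-feature-fusion idea: a median filter cleans a stand-in patch, three torchvision backbones produce pooled feature vectors, and the vectors are concatenated. Here resnet18 stands in for SE-ResNet (which torchvision does not ship), weights are left uninitialized to keep the snippet offline, and all names and sizes are illustrative rather than the paper's configuration.

```python
import cv2
import numpy as np
import torch
import torchvision.models as models
from torchvision import transforms

# Stand-in histology patch; a real pipeline would load the biomedical images.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (300, 300, 3), dtype=np.uint8)
img = cv2.medianBlur(img, ksize=3)                       # median filtering (MF) step

to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224), antialias=True),
])
x = to_tensor(img).unsqueeze(0)

# MobileNet and SqueezeNet as named in the abstract; resnet18 stands in for SE-ResNet.
backbones = [
    models.mobilenet_v3_small(weights=None),
    models.squeezenet1_1(weights=None),
    models.resnet18(weights=None),
]
features = []
with torch.no_grad():
    for net in backbones:
        net.eval()
        trunk = torch.nn.Sequential(*list(net.children())[:-1])   # drop the classifier head
        feat = torch.nn.functional.adaptive_avg_pool2d(trunk(x), 1)
        features.append(torch.flatten(feat, 1))
fused = torch.cat(features, dim=1)                       # fused deep feature vector
print(fused.shape)
```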
Osteosarcoma is the most prevalent primary bone cancer. Preoperative chemotherapy followed by resection is the normal course of treatment, and patients' diagnosis and treatment are guided by their response to chemotherapy. Conversely, chemotherapy without surgery results in persistent cancer and osteosarcoma regrowth. Thus, osteosarcoma patients should receive comprehensive therapy, which includes tumor-free surgery and systemic chemotherapy, to improve their survival. Hence, early diagnosis and individualized care of osteosarcoma are essential, since they may lead to more effective therapies and higher survival rates. The main goal of this research is to use a novel deep learning approach to predict osteosarcoma from histology images. Initially, the data is collected from the UT Southwestern/UT Dallas osteosarcoma dataset. Next, pre-processing of the collected images is accomplished with the Wiener filter technique. Segmentation of the pre-processed images is then performed with the 2D Otsu method. From the segmented images, features are extracted by the linear discriminant analysis (LDA) approach. These extracted features undergo the final prediction phase, accomplished by the novel improved gated recurrent unit (IGRU), in which the parameter tuning of the GRU is carried out by the osprey optimization algorithm (OOA) with error minimization as the main objective function. Compared with various conventional methods, the simulation findings demonstrate the effectiveness of the developed model across numerous analyses.
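A hedged sketch of the pre-processing, segmentation, and feature-extraction chain described above: scipy's Wiener filter stands in for the Wiener step, plain single-threshold Otsu (via OpenCV) stands in for the 2D Otsu variant, and the feature matrix fed to LDA is random placeholder data; none of the file names, sizes, or parameter values come from the paper.

```python
import cv2
import numpy as np
from scipy.signal import wiener
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Stand-in grayscale histology image; a real pipeline would load the dataset images.
gray = rng.integers(0, 256, (256, 256)).astype(np.float64)
denoised = wiener(gray, mysize=5)                                  # Wiener filtering step
denoised = np.clip(denoised, 0, 255).astype(np.uint8)
# Otsu thresholding as a simplified stand-in for 2D Otsu segmentation.
_, mask = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Placeholder feature matrix (one row per image) reduced by LDA before the
# downstream GRU-based classifier would see it.
X = rng.random((40, 16))
y = np.repeat([0, 1], 20)
lda_features = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)
print(mask.shape, lda_features.shape)
```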
The hybrid approach presented in this study combines the osprey optimization algorithm (OOA) with Lévy flight optimization. The OOA mimics the natural hunting behavior of ospreys: an osprey locates its prey, hunts it, and then carries it to a position where it may be consumed. The stride length of a Lévy flight, a specific kind of random walk, follows a heavy-tailed probability distribution. OOA is the primary algorithm of the suggested approach, which is coupled with the Lévy flight optimization technique; we refer to this method as MOOA. The suggested approach is applied to optimizing the Proportional-Integral (PI) secondary control of droop control. This article compares the Aquila Optimizer (AO), Whale optimization algorithm (WOA), Marine Predator algorithm (MPA), Golden Jackal optimization algorithm (GJO), and Reptile Search algorithm (RSA) to assess the efficacy of the suggested approach, and performance on droop control is compared using convergence curves. The simulation results show that the proposed method performs strongly and promisingly when applied to secondary control in droop control.
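A small sketch of the Lévy-flight ingredient: Mantegna's algorithm generates the heavy-tailed step lengths, and a hybrid such as the MOOA described above could use such a step to perturb candidates produced by an OOA phase. The stability index and the scaling factor in the comment are common choices, not values taken from the paper.

```python
import numpy as np
from math import gamma

def levy_step(dim, beta=1.5, rng=np.random.default_rng()):
    """One Lévy-flight step via Mantegna's algorithm.

    beta is the stability index of the heavy-tailed distribution; 1.5 is a
    common choice in the metaheuristics literature.
    """
    sigma_u = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

# Example of how a candidate x could be nudged toward the current best x_best:
#   x_new = x + 0.01 * levy_step(len(x)) * (x - x_best)
print(levy_step(5))
```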
Nowadays, the demand for pepper keeps increasing with the growth of the human population. Accurate diagnosis, flawless identification, and early detection of lesions will improve the income of farmers. At present, deep learning (DL) based techniques assist farmers in identifying plant diseases at low cost and with minimal time complexity. Hence, this study proposes a novel optimized DL model for classifying the presence and absence of pepper leaf disease using an effective feature learning process. The proposed study comprises four major stages, namely pre-processing, segmentation, feature extraction, and classification. In the pre-processing stage, the input images are first resized, and the Improved Contrast Limited Adaptive Histogram Equalization (ICLAHE) technique is introduced to enhance the quality of the pepper leaf images. Then, the Kernelized Gravity-based Density Clustering (KGDC) technique is employed to segment the diseased portions from the leaf images. Finally, the Gated Self-Attentive Convoluted MobileNetV3 (GSAtt-CMNetV3) technique is proposed to extract the features and classify the pepper leaf disease accurately. Moreover, a novel osprey optimization algorithm (Os-OA) is introduced to tune the parameters of the proposed DL model to enhance classification performance. The proposed study is implemented on the Python platform, and the publicly available Plant-Village dataset is utilized for the simulation process. The accuracy, precision, and recall values achieved by the proposed pepper leaf disease classification at a training percentage of 80% are 97.87%, 96.87%, and 97.08%, respectively.
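As a hedged illustration of the resizing and contrast-enhancement stage, the snippet below applies standard CLAHE from OpenCV to the luminance channel of a stand-in leaf image; the paper's Improved CLAHE (ICLAHE) would replace this step, and the clip limit, tile size, and image are illustrative assumptions.

```python
import cv2
import numpy as np

# Stand-in leaf image; a real pipeline would load images from the Plant-Village dataset.
leaf = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
leaf = cv2.resize(leaf, (224, 224))                               # resizing step

# Contrast enhancement on the luminance channel only, so colors stay intact.
lab = cv2.cvtColor(leaf, cv2.COLOR_BGR2LAB)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
lab[:, :, 0] = clahe.apply(lab[:, :, 0])
enhanced = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
```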
In today's world, meta-heuristic methods are widely used to solve complex optimization problems in many different fields. Their most important advantage is their easy integration into different problems, allowing them to provide effective solutions to various issues. With the continuous development of algorithms, they have become an essential tool for optimization and design, offering efficient solutions to various problems. In particular, meta-heuristic algorithms are of significant importance in solving engineering design problems. This study focuses on obtaining the optimal design of the lower control arm using meta-heuristic methods. Topology optimization followed by shape optimization was performed to achieve the optimal design. To obtain the optimal dimensions, a genetic algorithm and the newly developed osprey optimization algorithm were used from among artificial intelligence optimization algorithms; this is the first application of the osprey optimization algorithm to this problem. When the optimization results are examined, the maximum stress of the lower control arm after topology optimization is 268 MPa, and its weight has been reduced from 1402 to 1281 g, a decrease of 8.6%. According to the genetic algorithm result, the maximum stress of the optimum control arm is 266.6 MPa, and its weight is calculated as 1201 g, which is 14.33% lighter than the current model. According to the osprey optimization algorithm results, the maximum stress value of the lower control arm is 250.2 MPa and its weight is 1197.5 g. Thus, a suspension arm design 14.6% lighter than the initial model has been achieved with the osprey optimization algorithm.
Currently, the Internet of Things (IoT) generates a huge amount of traffic data in communication and information technology. The diversification and integration of IoT applications and terminals make the IoT vulnerable to intrusion attacks. Therefore, it is necessary to develop an efficient Intrusion Detection System (IDS) that guarantees the reliability, integrity, and security of IoT systems. Intrusion detection is considered a challenging task because of inappropriate features in the input data and the slow training process. To address these issues, effective metaheuristic-based feature selection and deep learning techniques are developed for enhancing the IDS. Osprey optimization algorithm (OOA) based feature selection is proposed for selecting the most informative features from the input, which leads to effective differentiation between normal and attack network traffic. Moreover, the traditional sigmoid and tangent activation functions are replaced with the Exponential Linear Unit (ELU) activation function to propose a modified Bi-directional Long Short Term Memory (Bi-LSTM), which is used to classify the types of intrusion attacks. The ELU activation function keeps gradients from saturating during back-propagation and leads to faster learning. The proposed framework is evaluated on three different datasets: N-BaIoT, the Canadian Institute for Cybersecurity Intrusion Detection Dataset 2017 (CICIDS-2017), and ToN-IoT. The empirical investigation shows that the proposed framework obtains impressive detection accuracies of 99.98%, 99.97%, and 99.88% on the N-BaIoT, CICIDS-2017, and ToN-IoT datasets, respectively. Compared to peer frameworks, this framework obtains high detection accuracy with better interpretability and reduced processing time.
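A minimal Keras sketch of a Bi-LSTM whose gate and cell activations use ELU in place of the default sigmoid/tanh pair, in the spirit of the modified Bi-LSTM described above. The feature count, number of units, and class count are illustrative assumptions, and the OOA-selected feature subset would form the input; this is not the paper's exact architecture.

```python
import tensorflow as tf

n_features, n_classes = 40, 5   # illustrative sizes; n_features = OOA-selected features

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1, n_features)),                  # one time step of selected features
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, activation="elu", recurrent_activation="elu")),
    tf.keras.layers.Dense(n_classes, activation="softmax"), # attack-type probabilities
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```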