With the rapid development of information technology and the widespread adoption of the Internet, people enjoy the convenience and efficiency brought by new technologies while also suffering the harm caused by cyber attacks. Beyond the need to thwart network assaults efficiently, the sheer volume of complex security event data can inadvertently increase the burden on decision makers. Current Network Security (NS) threats mainly include network viruses, trojans, and Denial-of-Service (DoS) attacks. For increasingly complex NS problems, traditional rule-based network monitoring technology struggles to predict unknown attack behavior. Environment-based, dynamic, and integrated data fusion can integrate data from a macro perspective. In recent years, machine learning (ML) technology has developed rapidly, and existing third-party models can be readily trained, tested, and used for prediction. ML algorithms discover associations in the data rather than relying on manually defined rules. The support vector machine is a common ML method that, after training and testing, can predict network security well. To monitor the overall security status of the entire network, NS situation awareness reproduces network attacks accurately and in real time using a reconstruction approach. Situation awareness is a powerful network monitoring and security technology, but existing NS techniques still have many problems: the state of the network cannot be detected accurately, and its patterns of change are not well understood. To predict network attacks effectively, this paper adopted a technology based on ML and data analysis and constructed an NS situational awareness model. The results showed that the detection efficiency of the model based on ML and data analysis was 7.18% higher than that of the traditional NS state awareness model.
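A minimal sketch of the kind of SVM classifier the abstract describes for ML-based attack detection; the per-flow feature names and the synthetic data are illustrative assumptions, not the paper's dataset or implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Hypothetical per-flow features: packet rate, byte rate, SYN ratio, distinct ports
X_normal = rng.normal(loc=[100, 5e4, 0.1, 5], scale=[20, 1e4, 0.05, 2], size=(500, 4))
X_attack = rng.normal(loc=[900, 2e5, 0.7, 60], scale=[200, 5e4, 0.1, 20], size=(500, 4))
X = np.vstack([X_normal, X_attack])
y = np.r_[np.zeros(500), np.ones(500)]  # 0 = benign traffic, 1 = attack traffic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# Scale features, then fit an RBF-kernel SVM, as is typical for this task
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```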
In the preparation of high-performance polyurethane (PU) modified bitumen, the variety of PU modifiers means that the preparation process has numerous design parameters, and suitable performance response indexes must be selected. As a result, no preparation process for PU-modified bitumen is universally applicable. How to determine the process parameters of PU-modified bitumen accurately and efficiently for different application environments is therefore a key problem to be solved urgently. Based on the Kriging-PSO hybrid optimization algorithm, this paper proposed a novel design method for the preparation process of PU-modified bitumen. Response indicators with high relative sensitivity (softening point, rutting factor, Brookfield viscosity, and dispersion coefficient) were screened using range and variance analysis to improve the fitting accuracy of the trained Kriging-PSO model. Among them, the dispersion coefficient was obtained from fluorescence microscopy tests using the Christiansen coefficient method to evaluate the uniformity of the dispersed PU modifier phase. Through the Kriging-PSO algorithm, the main process parameters for preparing PU-modified bitumen in the laboratory were determined as follows: shearing time 86 min, shearing speed 2450 rpm, shearing temperature 148 °C, and PU content 18.6%. The prepared PU-modified bitumen was then placed in an oven at 100 °C for 2 h. Its performance indicators were: softening point 90 °C, rutting factor 30 kPa, Brookfield viscosity 80,000 Pa·s, and dispersion coefficient 0.92. The PU-modified bitumen prepared by this optimal process met the expected performance indicators. The results showed that the Kriging-PSO algorithm provides a new idea for designing a modified bitumen preparation process and achieves the goal of determining the optimum process parameters of PU-modified bitumen efficiently with fewer samples.
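An illustrative sketch of the Kriging + PSO idea, not the paper's code: a Gaussian-process (Kriging) surrogate is fit to a handful of runs, then a small particle swarm searches the process parameters (shear time, speed, temperature, PU content) on the surrogate. The design points and the "quality" objective are invented placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)
# Hypothetical design points: [shear time (min), speed (rpm), temp (C), PU (%)]
X = rng.uniform([60, 2000, 130, 10], [120, 3000, 170, 25], size=(20, 4))
y = -(np.abs(X[:, 0] - 90) + np.abs(X[:, 1] - 2500) / 50
      + np.abs(X[:, 2] - 150) + np.abs(X[:, 3] - 18))  # stand-in "quality" score

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True).fit(X, y)

def pso(objective, lb, ub, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm maximizer over box bounds."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pos = rng.uniform(lb, ub, size=(n_particles, lb.size))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), objective(pos)
    gbest = pbest[np.argmax(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lb, ub)
        val = objective(pos)
        better = val > pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[np.argmax(pbest_val)]
    return gbest

# Search the surrogate for the most promising process parameters
best = pso(lambda p: gp.predict(p), [60, 2000, 130, 10], [120, 3000, 170, 25])
print("Surrogate-optimal parameters:", best.round(1))
```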
Optimization methods using metaheuristic algorithms have been widely used in steel frame design to improve on the inefficient traditional design method, which requires repeated model tuning and massive mechanical analysis. However, their random search behavior can easily lead to poor performance. In this paper, combining metaheuristic algorithms and machine learning methods, a highly integrated method based on an online model training, updating, and parameter tuning process is proposed to improve the performance of optimization algorithms with general forms and parameters. It reduces the impact of the iterative mechanism and parameter settings of metaheuristic algorithms on their performance. The method is applied to the intelligent structural design of steel frames in three steps. First, the standard optimization process is conducted to search for the optimal design while simultaneously collecting the mechanical analysis data of the structure. Then the data are used to generate and dynamically update surrogate models of the structural responses, while analysis-based feature engineering and an automatic model tuning technique are employed to improve model accuracy. Finally, a much more efficient procedure obtains potential solutions that are used to improve the convergence rate and performance of the standard optimization. Four cases are used to study the effectiveness of the integrated method, and the influence of different settings is discussed, as well as its generality. In conclusion, the proposed method achieves structural safety and economic benefit for steel frames, and it is superior in terms of robustness, optimal results, and computational cost even in large-scale optimization problems of complicated frames.
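A hedged sketch of the surrogate-assisted loop the abstract outlines: expensive structural analyses are archived during the search, a regression surrogate is refit on the collected data, and the surrogate pre-screens candidate designs so only the most promising ones receive the true analysis. The "frame_analysis" objective below is a toy stand-in, not a real finite-element model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

def frame_analysis(x):
    """Placeholder for the costly mechanical analysis (e.g. steel weight plus
    a penalty for constraint violation)."""
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum()

dim, archive_X, archive_y = 6, [], []
best_x, best_y = None, np.inf
surrogate = RandomForestRegressor(n_estimators=100, random_state=0)

for gen in range(30):
    candidates = rng.uniform(0, 1, size=(200, dim))       # metaheuristic proposals
    if len(archive_y) >= 20:                              # surrogate pre-screening
        surrogate.fit(np.array(archive_X), np.array(archive_y))
        pred = surrogate.predict(candidates)
        candidates = candidates[np.argsort(pred)[:10]]    # keep 10 most promising
    else:
        candidates = candidates[:10]
    for x in candidates:                                  # true analyses only here
        y = frame_analysis(x)
        archive_X.append(x); archive_y.append(y)
        if y < best_y:
            best_x, best_y = x, y

print(f"best objective after {len(archive_y)} true analyses: {best_y:.4f}")
```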
In this study, an optimized decision-making system (OD-MS) algorithm in machine learning was used to optimize enzymatic hydrolysis saccharification, fermentation conditions, and the resulting yield. Two hundred fifty datasets were collected as training data from various studies and experimental results. Initially, data on various biomass and product conditions were collected, and their correlation coefficients were determined using a Pearson correlation matrix. Test datasets were analyzed in the optimized decision-making system to predict the data values, and the process was repeated until the desired result was achieved. 3-D surface analysis was performed to determine the product yield ranges based on biomass characteristics and process conditions. Maximum glucose yield (>50 g/L) and ethanol yield (>40 g/L) were achieved with increases in cellulose (>73%), S-temp (55-60 °C), S-pH (7-9), S-shaking speed (180-200 rpm), F-pH (4.5), F-time (<40 h), and F-shaking speed (120-150 rpm) and decreases in hemicellulose (<10%), lignin (<10%), S-time (<20 h), and F-time (20 h). Weighted ranks were assigned to the biomass characteristics and process conditions, using the correlation values, to find the optimum parameter conditions for obtaining a better yield. A two-step validation was performed to assess the accuracy for biomass characteristics and process conditions. An accuracy of 95% was found when comparing the actual dataset with the data predicted by the OD-MS algorithm for the various biomass characteristics and process conditions. The R² value of the model was 0.9762.
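A minimal illustration, not the paper's OD-MS code: compute a Pearson correlation matrix between biomass/process variables and yield, then fit a decision-tree regressor as a generic stand-in for the decision-making system. The column names and random data are assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "cellulose_pct": rng.uniform(50, 80, 250),
    "lignin_pct":    rng.uniform(5, 25, 250),
    "sacch_temp_C":  rng.uniform(45, 65, 250),
    "sacch_pH":      rng.uniform(4, 9, 250),
    "ferm_time_h":   rng.uniform(10, 60, 250),
})
# Synthetic yield with a plausible dependence on composition and temperature
df["glucose_g_L"] = (0.6 * df["cellulose_pct"] - 0.8 * df["lignin_pct"]
                     + 0.3 * df["sacch_temp_C"] + rng.normal(0, 2, 250))

# Pearson correlations of every variable with the glucose yield
print(df.corr(method="pearson")["glucose_g_L"])

X, y = df.drop(columns="glucose_g_L"), df["glucose_g_L"]
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
print("R^2 on training data:", round(tree.score(X, y), 3))
```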
Floods are one of the most devastating natural disasters, causing immense damage to life, property, and agriculture worldwide. Recurring floods in Bihar (a state in eastern India) during the monsoon season impact the agro-based economy, destroying crops and making it difficult for farmers to prepare for the next season. To mitigate the impact of floods on the agricultural sector, early warning systems are needed. Remote sensing technology is now used extensively for monitoring and managing flood events, and it is also used in the present study. The random forest (RF) machine learning (ML) algorithm was used for land-use classification, and its output served as an input for the flood impact assessment. Flood extents and their impact on agriculture were analysed using Sentinel-1 data. The present study shows that floods severely impacted a large part of Bihar during the monsoon seasons of 2020 and 2021: about 701,967 ha of land (614,706 ha of agricultural land) in 2020 and 955,897 ha (851,663 ha of agricultural land) in 2021 were severely flooded. The results were visualised to help government authorities prioritize relief and rescue operations.
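A simplified sketch of the workflow described here: a random forest classifies land use from hypothetical Sentinel-1 backscatter features, a flood mask is derived from a backscatter drop, and the two are intersected to estimate flooded agricultural area. Real work would use georeferenced rasters; plain arrays stand in for them, and all values are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n_pixels = 10_000
# Hypothetical pre-monsoon per-pixel features: VV and VH backscatter (dB)
X = rng.normal(loc=[-10, -17], scale=[3, 3], size=(n_pixels, 2))
land_use = (X[:, 0] > -11).astype(int)      # 1 = agriculture, 0 = other (toy labels)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, land_use)
pred_land_use = rf.predict(X)

# Flood mask: pixels whose monsoon VV backscatter dropped sharply (open water)
vv_monsoon = X[:, 0] - rng.exponential(2.0, n_pixels)
flooded = vv_monsoon < -15

pixel_area_ha = 0.01                         # a 10 m x 10 m pixel is 0.01 ha
flooded_agri_ha = np.sum(flooded & (pred_land_use == 1)) * pixel_area_ha
print(f"flooded agricultural area (toy data): {flooded_agri_ha:.1f} ha")
```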
In today's digital age, log files are crucial. However, the conversion of text log files into images has only recently been developed. The security of log files is a major concern, and in process mining the safety of a log file depends on the security of the systems in which the logs are stored. This calls first for converting a text log file into an image file. This research therefore aims to convert log files into images in a mugshot database and to detect illegal activity and criminals from the converted images using a novel Convolutional Neural Network (CNN). The developed model has three stages: pre-processing, feature extraction, and detection and matching. Pre-processing was performed with min-max normalization, and deep learning was used for feature extraction. In the detection phase, the CNN detects illegal activities, and the matching process links illegal activities identified in the converted images to criminals in the mugshot database. The model's performance was evaluated in terms of precision, F1-score, recall, and accuracy, with values of 99.6%, 98.5%, 98.7%, and 99.8%, respectively. A further comparison was performed to show the effectiveness of the suggested model over other methods.
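A hedged sketch of the two steps outlined above: (1) turn a text log line into a fixed-size, min-max-normalized image array, and (2) pass it through a small CNN classifier. The architecture, log format, and image size are illustrative assumptions, not the paper's model.

```python
import numpy as np
import torch
import torch.nn as nn

def log_to_image(line: str, size: int = 32) -> np.ndarray:
    """Encode log bytes as a size x size grayscale image, min-max normalized."""
    codes = np.frombuffer(line.encode("utf-8"), dtype=np.uint8).astype(np.float32)
    codes = np.resize(codes, size * size)                 # pad/repeat to fixed length
    codes = (codes - codes.min()) / (codes.max() - codes.min() + 1e-8)
    return codes.reshape(size, size)

class LogCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Hypothetical log line; an untrained network, so the scores are meaningless
img = log_to_image("2024-01-01 12:00:01 user=alice action=login status=failed")
x = torch.from_numpy(img).unsqueeze(0).unsqueeze(0)       # shape (1, 1, 32, 32)
logits = LogCNN()(x)
print("class scores (untrained, illustrative):", logits.detach().numpy())
```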
Background: Total laparoscopic anterior resection (tLAR) with natural orifice specimen extraction surgery (NOSES) has been widely adopted in the treatment of rectal cancer (RC). However, no study has used machine learning algorithms on a national database to predict the short-term outcomes of tLAR. Methods: Data from consecutive RC patients who underwent tLAR were collected from the China NOSES Database (CNDB). The random forest (RF), extreme gradient boosting (XGBoost), support vector machine (SVM), deep neural network (DNN), logistic regression (LR), and K-nearest neighbor (KNN) algorithms were used to develop risk models to predict short-term complications of tLAR. The area under the receiver operating characteristic curve (AUROC), Gini coefficient, specificity, and sensitivity were calculated to assess the performance of each risk model. The factors selected by the models were evaluated by relative importance. Results: A total of 4313 RC patients were identified, and 667 patients (15.5%) developed postoperative complications. The XGBoost machine learning model showed more promising results in predicting complications than the other models (AUROC 0.90, P < 0.001), with similar performance under internal and external validation. In the XGBoost model, the top four influential factors were the distance from the lower edge of the tumor to the anus, age at diagnosis, surgical time, and comorbidities. In the risk stratification analysis, the rate of postoperative complications in the high-risk group was significantly higher than in the medium- and low-risk groups (P < 0.001). Conclusion: The machine learning model shows potential benefits in predicting the risk of complications in RC patients after tLAR. This novel approach can provide reliable individual information for surgical treatment recommendations.
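This is not the CNDB analysis itself, only a small sketch of the modelling step reported above: an XGBoost classifier predicting postoperative complications from a few clinical predictors, evaluated by AUROC and feature importance. The variable names and synthetic cohort are assumptions.

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 4000
X = np.column_stack([
    rng.uniform(2, 15, n),     # tumor distance to anus (cm)
    rng.uniform(30, 85, n),    # age at diagnosis (years)
    rng.uniform(60, 300, n),   # surgical time (min)
    rng.integers(0, 4, n),     # number of comorbidities
])
# Synthetic complication labels loosely tied to the predictors
risk = 0.3 / X[:, 0] + 0.01 * (X[:, 1] - 30) / 55 + 0.001 * X[:, 2] / 300 + 0.05 * X[:, 3]
y = (rng.random(n) < risk / risk.max() * 0.3).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05, eval_metric="auc")
model.fit(X_tr, y_tr)
print("AUROC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
print("relative importance:", model.feature_importances_.round(3))
```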
Breast cancer is a significant global health concern, and early detection is crucial for improving survival rates. The article reviews studies investigating methods for detecting breast cancer using non-invasive imaging techniques, with a focus on thermal imaging, including feature extraction, image segmentation, and machine learning. It concludes that developing efficient and accurate breast cancer screening software requires an integrated approach to data collection, image processing, and machine learning algorithms. The article presents a novel technique for breast cancer screening software that combines Rotational Thermographic Imaging, dynamic temperature-based data collection, Colour-based Infrared Image Processing, and a machine learning algorithm to provide complete breast imaging in the sitting position and reduce the chance of missing abnormalities. Image processing and machine learning techniques are used to extract a comprehensive set of relevant features from the captured images, which is then used to train a machine learning model. The system was tested on an increasing patient population in a clinical setting deployed at a hospital. The algorithm's performance was evaluated using several metrics, including sensitivity (82.14%), specificity (98.33%), and accuracy (93.27%). The results demonstrate that the proposed algorithm achieves high accuracy and sensitivity, making it a promising tool for breast cancer screening.
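A quick illustration of how the reported screening metrics relate to a confusion matrix; the counts below are made up and are not the study's data.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical screening labels and model predictions
y_true = np.array([1]*28 + [0]*180 + [1]*5 + [0]*3)
y_pred = np.array([1]*28 + [0]*180 + [0]*5 + [1]*3)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)          # recall on malignant cases
specificity = tn / (tn + fp)          # recall on benign cases
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"sensitivity={sensitivity:.2%} specificity={specificity:.2%} accuracy={accuracy:.2%}")
```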
Rock thermal conductivity (TC) is a key parameter in geothermal, petroleum geology, and basin research. Although experimental analysis of core samples is currently an effective way to obtain rock TC, it is not always feasible because most boreholes have no or only limited cores, making it difficult to establish a TC model of a geological body efficiently and accurately. This study shows how machine learning algorithms can be used to accurately predict rock TC from easily accessible, high-resolution logging data. Based on 295 measured TC values, logging data, and vertical seismic profile (VSP) data collected from the fully cored CSDP-2 borehole, models including random forest (RF), convolutional neural network (CNN), support vector regression (SVR), and particle swarm optimization-SVR (PSO_SVR) were applied to predict the TC of the entire borehole section. Using a confusion matrix analysis, the primary-wave velocity (PWV) from the VSP survey, the measured density, and logging data including shallow lateral resistivity (LLS), compensated neutron log (CNL), density (DEN), gamma ray (GR), spontaneous potential (SP), and acoustic transit time (AC) were used as input variables for model training and TC prediction. The results showed that geophysical parameters reflecting properties related to the mineral composition, porosity, and reservoir fluids of the geological body can be used to predict TC well through machine learning algorithms. For both unconsolidated sediments and rocks, the RF model showed stronger applicability and higher accuracy in TC prediction than the PSO_SVR and CNN models, while the SVR model showed poor applicability in this case study. The PWV data effectively improved the TC prediction accuracy of all models, and the RF model ultimately yielded excellent performance, with a correlation coefficient above 0.86 and a root mean squared error of 8% between predicted and measured values.
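A hedged sketch, on synthetic data rather than the CSDP-2 logs, of the core modelling step: a random forest regressor maps well-logging variables and P-wave velocity to thermal conductivity, evaluated by a correlation coefficient and RMSE as in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(6)
n = 295
logs = {
    "LLS": rng.lognormal(2, 0.5, n), "CNL": rng.uniform(5, 40, n),
    "DEN": rng.uniform(2.0, 2.8, n), "GR":  rng.uniform(20, 150, n),
    "SP":  rng.uniform(-80, 20, n),  "AC":  rng.uniform(180, 400, n),
    "PWV": rng.uniform(1800, 5500, n),
}
X = np.column_stack(list(logs.values()))
# Synthetic TC loosely tied to density, velocity and porosity-related CNL
tc = 0.8 * logs["DEN"] + 0.0004 * logs["PWV"] - 0.003 * logs["CNL"] + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, tc, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)
r = np.corrcoef(y_te, pred)[0, 1]
rmse = mean_squared_error(y_te, pred) ** 0.5
print(f"correlation coefficient: {r:.2f}, RMSE: {rmse:.3f} W/(m*K)")
```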
Intelligent algorithms perform well in managing large databases and mining useful knowledge, and many researchers have introduced these algorithms to improve the performance of book management systems. This article introduced the application of machine learning algorithms in electronic book database management systems. A linear classifier (the linear regression algorithm, one of the machine learning algorithms) was used to analyze the empirical loss value. The paper studied the application of machine learning algorithms in the database management of library special collection resources, aiming to optimize the management of the special collection resources database and provide students with a better experience of using these resources. The effectiveness of the machine learning algorithm was verified from three aspects: the retrieval time before and after applying it to the special collection resources database, the number of crashes in a month, and the number of library staff required. The average retrieval time of the five test groups using the machine learning algorithm was about 150% faster than the average retrieval time of the five groups using the original special collection resources database. The numbers of crashes after using the machine learning algorithm were 2, 1, 1, 2, and 1, much lower than for the original database. After using the machine learning algorithm, the five groups required 7, 5, 9, 6, and 4 staff, respectively, far fewer than the five groups managing the original special collection resources. Finally, by comparing the linear regression model, the support vector machine model, and the random forest model, it was found that the accuracy of the linear regression model reached 98.2%, an increase of 8.2% over the random forest model; its precision reached 96.7%, its recall reached 97.8%, and its F1 value reached 97.5%. The experimental data show that the machine learning algorithm plays a good role in the database management of library special collection resources.
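A small sketch of the model comparison described above: a linear classifier (logistic regression here, standing in for the abstract's "linear regression" classifier), an SVM, and a random forest scored by accuracy, precision, recall, and F1 on synthetic retrieval-log features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic stand-in for retrieval-log features and a binary relevance label
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)

models = {
    "linear": LogisticRegression(max_iter=1000),
    "svm": SVC(),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=7),
}
for name, m in models.items():
    p = m.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:13s} acc={accuracy_score(y_te, p):.3f} "
          f"prec={precision_score(y_te, p):.3f} "
          f"rec={recall_score(y_te, p):.3f} f1={f1_score(y_te, p):.3f}")
```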