High-dimensional Feature Selection Problems (HFSPs) have grown in prominence but remain challenging. When faced with such complex problems, the majority of currently employed Feature Selection (FS) methods drastically underperform in terms of effectiveness. To address HFSPs, a new binary variant of the Ali Baba and the Forty Thieves (AFT) algorithm, known as Binary Adaptive Elite Opposition-based AFT (BAEOAFT), which incorporates historical information and dimensional mutation, is presented. The entire population is dynamically divided into two subpopulations in order to maintain population diversity, and knowledge about individuals is extracted to provide adaptive and dynamic strategies in both subpopulations. Based on the individuals' historical knowledge, Adaptive Tracking Distance (ATD) and Adaptive Perceptive Possibility (APP) schemes are presented for the exploration and exploitation subpopulations. A dynamic dimension mutation technique is applied in the exploration subpopulation to enhance BAEOAFT's capacity to solve HFSPs, while the exploitation subpopulation uses Elite Dynamic Opposite Learning (EDL) to promote individual diversity. Even if the exploitation group converges prematurely, the exploration subpopulation's diversity is still preserved. The proposed BAEOAFT-based FS technique was assessed using the k-nearest neighbor classifier on 20 HFSPs obtained from the UCI repository. BAEOAFT achieved classification accuracy rates higher than those of its competitors and the conventional binary AFT (BAFT) in more than 90% of the applied datasets. Additionally, BAEOAFT outperformed its rivals in terms of reduction rates while selecting the fewest features.
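The dynamic opposite learning step mentioned above can be sketched as follows. This is a minimal illustration of dynamic opposite learning in a continuous search space; the function name, the weight `w`, and the shared per-dimension bounds are assumptions for illustration, not details taken from the abstract:

```python
import random

def dynamic_opposite(x, lower, upper, w=3.0):
    """Hedged sketch of dynamic opposite learning (DOL).

    Generates a candidate that jumps toward a randomly scaled opposite
    point, which helps escape premature convergence. The weight w is an
    illustrative choice, not a value from the paper.
    """
    # Classic opposite point per dimension: x_opp = lower + upper - x.
    opposite = [lower + upper - xi for xi in x]
    # Dynamic opposite: move from x toward a randomly scaled opposite.
    candidate = [xi + w * random.random() * (random.random() * oi - xi)
                 for xi, oi in zip(x, opposite)]
    # Clamp the candidate back into the search bounds.
    return [min(max(ci, lower), upper) for ci in candidate]
```

In an opposition-based variant, such candidates are typically kept only when their fitness beats the original individual's, so the scheme adds diversity without discarding good solutions.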
Feature Selection (FS) aims to improve the classification rate of dataset models by selecting only a small set of appropriate features from the initial range of features. Consequently, a reliable optimization method is needed to deal with the issues involved in this problem. Traditional methods often fail to optimally reduce the high dimensionality of the feature space of complex datasets, which leads to weak classification models. Meta-heuristics can offer a favorable classification rate for high-dimensional datasets. Here, a binary version of a new human-based algorithm named Ali Baba and the Forty Thieves (AFT) was applied to tackle a pool of FS problems. Although AFT is an efficient meta-heuristic for many optimization problems, it sometimes exhibits premature convergence and low search performance. These issues were mitigated by proposing three enhanced versions of AFT, namely: (1) a Binary Multi-layered AFT (BMAFT), which uses hierarchical and distributed frameworks; (2) a Binary Elitist AFT (BEAFT), which uses an elitist learning strategy; and (3) a Binary Self-adaptive AFT (BSAFT), which uses an adaptive tracking-distance parameter. These versions, along with the basic Binary AFT (BAFT), were extensively assessed on twenty-four problems gathered from different repositories. The results showed that the proposed algorithms substantially enhance the performance of BAFT in terms of convergence speed and solution accuracy. Moreover, the overall results showed that BMAFT is the most competitive, providing the best results with excellent performance scores compared with the other competing algorithms.
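Binary variants of a continuous meta-heuristic such as AFT typically map each continuous position component to a bit through a transfer function. A minimal sketch of this step, assuming the standard S-shaped (sigmoid) transfer; the function names are illustrative and the paper's four transfer functions may differ:

```python
import math
import random

def s_transfer(v):
    # S-shaped (sigmoid) transfer: maps a continuous step/velocity
    # component to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-v))

def binarize(position):
    # Bit i is set to 1 (feature selected) with probability s_transfer(v_i);
    # otherwise the feature is dropped (bit 0).
    return [1 if random.random() < s_transfer(v) else 0 for v in position]
```

V-shaped transfer functions are a common alternative: they flip the current bit with a probability based on |v| rather than resampling it, which tends to favor exploitation.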
Learning neural networks (NNs) is one of the most challenging problems in machine learning and has lately drawn the attention of many academics. The nonlinear structure of NNs and the unknown optimal set of key govern...
Feature selection (FS) is generally associated with the process of using a probabilistic method to select optimal feature combinations during pre-processing steps in data mining. This technique can optimize the set of features that need to be considered, heightening classification performance on the basis of the selected optimal feature subset. In this paper, a hybrid model is developed and applied to select the optimal feature subset, based on a binary version of the Chameleon Swarm Algorithm (CSA), termed the Hybrid Memory Improved CSA (HMICSA), and the k-Nearest Neighbor (k-NN) classifier. In this FS model, the following are proposed and applied: (1) four kinds of transfer functions; (2) amendments to the velocity of the CSA's individuals; (3) the addition of internal memory to the CSA's individuals; and (4) hybridization of CSA with the Ali Baba and the Forty Thieves (AFT) algorithm. These measures aim to strike an adequate balance between the global exploration and local exploitation behaviors of the basic CSA over the search space, thereby mitigating early convergence and avoiding entrapment in local optima. The efficacy of the proposed FS algorithm was evaluated on 24 medical diagnosis benchmark datasets collected from different specialized repositories and compared with other k-NN-based FS methods. The comprehensive results obtained using various evaluation methods demonstrate the competence of the proposed method in improving classification performance compared with other methods, confirming its potential for exploring the feature space and selecting the most useful features for classification tasks.
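Wrapper-based FS methods of this kind usually score a candidate feature subset by combining the k-NN error rate with the fraction of features retained. A minimal sketch of such a fitness function; the weighting alpha = 0.99 is a common convention in the FS literature, not a value reported here:

```python
def fs_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Hedged sketch of a standard wrapper-FS objective (to minimize).

    Combines the classifier's error rate with the relative size of the
    selected subset, so solutions that are both accurate and compact
    score best. alpha = 0.99 is a typical literature setting, not taken
    from the paper.
    """
    return alpha * error_rate + (1.0 - alpha) * (n_selected / n_total)

# Example: 10% k-NN error using 5 of 10 features.
score = fs_fitness(0.10, 5, 10)  # ≈ 0.104
```

Because alpha is close to 1, accuracy dominates; the small subset-size term breaks ties in favor of fewer features, which matches the reduction-rate comparisons reported above.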