High-quality performance of image segmentation methods is one of the leading priorities in the design and implementation of image analysis systems. Incorporating the most important image data information into the segmentation process has led to the development of innovative frameworks such as fuzzy systems, rough systems and, more recently, rough-fuzzy systems. Data analysis based on rough and fuzzy systems is designed to capture the internal data structure in the presence of incomplete or uncertain information. The rough entropy framework proposed in [12, 13] is dedicated to clustering systems, especially image segmentation systems. We extend that framework into eight distinct rough entropy measures and related clustering algorithms. The introduced solutions adaptively incorporate the most important factors that contribute to the relation between data objects and make possible a better understanding of the image structure. In order to demonstrate the relevance of the proposed rough entropy measures, rough entropy segmentations are evaluated against human segmentations from the Berkeley and Weizmann image databases. At the same time, rough entropy based measures applied to image segmentation quality evaluation are compared with standard image segmentation indices. Additionally, the rough entropy measures appear to properly capture the properties assessed by different image segmentation quality indices.
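To make the notion of a rough entropy measure concrete, the sketch below computes one classical granule-based rough entropy for a thresholded image; the eight measures and the clustering algorithms of the paper are not reproduced, and the granule size and single-threshold setting are illustrative assumptions.

```python
import numpy as np

def rough_entropy(image, threshold, granule=4):
    """Granule-based rough entropy of the object/background split at `threshold`.

    Sketch of one classical rough entropy measure only; `granule` is the
    assumed side length of the square granules.
    """
    h, w = image.shape
    lo_obj = up_obj = lo_bg = up_bg = 0
    for i in range(0, h, granule):
        for j in range(0, w, granule):
            block = image[i:i + granule, j:j + granule]
            has_obj = bool(block.max() > threshold)
            has_bg = bool(block.min() <= threshold)
            up_obj += has_obj                  # granule meets the object
            up_bg += has_bg                    # granule meets the background
            lo_obj += has_obj and not has_bg   # granule entirely object
            lo_bg += has_bg and not has_obj    # granule entirely background

    def roughness(lower, upper):
        return 1.0 - lower / upper if upper else 0.0

    def term(r):                               # convention: 0 * log(0) = 0
        return r * np.log(r) if r > 0 else 0.0

    return -(np.e / 2.0) * (term(roughness(lo_obj, up_obj)) +
                            term(roughness(lo_bg, up_bg)))

# Usage: select the threshold that maximizes the rough entropy.
img = np.random.randint(0, 256, (64, 64))
best_t = max(range(1, 255), key=lambda t: rough_entropy(img, t))
```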
Interconnected renewable energy sources (RES) require fast and accurate fault ride through (FRT) operation in order to support the power grid when faults occur. This paper proposes an adaptive phase-locked loop (adaptive dαβ PLL) algorithm, which can be used for a faster and more accurate response of the grid-side converter (GSC) control of a RES, particularly under FRT operation. The adaptive dαβ PLL is based on modifying the tuning parameters of the dαβ PLL according to the type and voltage characteristics of the grid fault, with the purpose of accelerating the performance of the PLL algorithm. The proposed adaptive tuning mechanism adjusts the PLL parameters in real time, driven by the proposed fault classification unit, in order to accelerate the synchronization performance. The beneficial effect of the proposed adaptive tuning mechanism on the performance of the dαβ PLL is verified through simulation and experimental results. Furthermore, the benefits of using a faster synchronization method in the control of the GSC of a RES are also demonstrated. Additionally, a new synchronization technique named frequency-phase decoupling (FPD)-dαβ PLL is presented, which applies an FPD technique from the literature within the structure of the dαβ PLL in order to improve its performance. Finally, the adaptive tuning mechanism proposed in this paper is combined with the FPD-dαβ PLL to introduce the adaptive FPD-dαβ PLL, which achieves an even faster time response and is an ideal solution for the synchronization of RES under grid faults.
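As a rough illustration of the adaptive tuning idea, the sketch below implements a plain synchronous-reference-frame PLL whose PI gains are switched when the measured per-unit voltage magnitude indicates a sag. The dαβ PLL structure, the fault classification unit, and the actual gain values of the paper are not reproduced; all numerical settings here are assumptions.

```python
import numpy as np

class AdaptiveSrfPll:
    """Minimal SRF-PLL with gain scheduling keyed on a sag detection."""

    def __init__(self, fs=10e3, f_nom=50.0):
        self.dt = 1.0 / fs
        self.omega_nom = 2 * np.pi * f_nom
        self.theta = 0.0
        self.integ = 0.0
        self.kp, self.ki = 92.0, 4200.0        # nominal PI tuning (assumed values)

    def _schedule_gains(self, v_mag_pu):
        # Switch to a faster tuning while a voltage sag is detected.
        if v_mag_pu < 0.85:                    # assumed per-unit sag threshold
            self.kp, self.ki = 180.0, 16000.0
        else:
            self.kp, self.ki = 92.0, 4200.0

    def step(self, v_alpha, v_beta):
        # q-axis voltage from the Park transform at the estimated angle;
        # the PI regulator drives vq -> 0, locking theta to the grid phase.
        vq = -v_alpha * np.sin(self.theta) + v_beta * np.cos(self.theta)
        self._schedule_gains(np.hypot(v_alpha, v_beta))
        self.integ += self.ki * vq * self.dt
        omega = self.omega_nom + self.kp * vq + self.integ
        self.theta = (self.theta + omega * self.dt) % (2 * np.pi)
        return self.theta, omega
```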
Coordinate descent (CD) is a simple optimization technique suited to low-complexity requirements and to solving large problems. In its randomized version, CD has recently been shown to be very effective for solving least-squares (LS) and other optimization problems. We propose here an adaptive version of randomized coordinate descent (RCD) for finding sparse LS solutions, from which we derive two algorithms, one based on the lasso criterion, the other using a greedy technique. Both algorithms employ a novel way of adapting the probabilities for choosing the coordinates, based on a matching pursuit criterion. Another new feature is that, in the lasso algorithm, the penalty term values are built without knowing the noise level or using other prior information. The proposed algorithms use efficient computations and offer a tunable trade-off between complexity and performance through the number of CD steps per time instant. Besides a general theoretical convergence analysis, we present simulations that show good practical behavior, comparable to or better than that of state-of-the-art methods.
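The following sketch conveys the flavour of the lasso-based algorithm: randomized coordinate descent whose coordinate-selection probabilities are adapted from the residual correlations, in a matching-pursuit style. The paper's exact probability adaptation, its greedy variant, and its automatic construction of the penalty values are not reproduced; `lam` is treated as a fixed, assumed parameter.

```python
import numpy as np

def adaptive_rcd_lasso(A, y, lam=0.5, n_steps=200, seed=None):
    """Randomized coordinate descent for the lasso with correlation-adapted
    coordinate-selection probabilities (sketch of the general idea only)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    residual = y - A @ x
    col_norm2 = np.sum(A * A, axis=0)           # per-column squared norms

    for _ in range(n_steps):
        p = np.abs(A.T @ residual) + 1e-12       # matching-pursuit-style scores
        p /= p.sum()                             # selection probabilities
        j = rng.choice(n, p=p)

        # Exact coordinate minimization of 0.5*||y - Ax||^2 + lam*||x||_1.
        z = x[j] + A[:, j] @ residual / col_norm2[j]
        x_new = np.sign(z) * max(abs(z) - lam / col_norm2[j], 0.0)
        residual += A[:, j] * (x[j] - x_new)     # incremental residual update
        x[j] = x_new
    return x

# Usage on a small synthetic sparse recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50); x_true[[3, 17, 40]] = [1.5, -2.0, 0.7]
y = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = adaptive_rcd_lasso(A, y, lam=0.5, seed=1)
```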
In this paper a prototype of an active headrest system is studied. First, the plant is discussed and its basic characteristics are presented. Then, adaptive control techniques for acoustic noise control are employed. An efficient two-stage algorithm generating zones of highest noise attenuation at desired locations is proposed and compared with other control systems. Both SISO and MIMO systems are considered, and the latter are found to be more appropriate. The obtained results are illustrated in several forms; in particular, the distribution of the zones of quiet is presented and discussed. (C) 2004 Elsevier Ltd. All rights reserved.
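The paper's two-stage algorithm is not described in enough detail in the abstract to reproduce, so the sketch below shows only the standard single-channel filtered-x LMS building block on which such adaptive headrest controllers are commonly based; the secondary-path estimate `s_hat`, filter length, and step size are assumptions.

```python
import numpy as np

def fxlms(x, d, s_hat, L=64, mu=5e-4):
    """Single-channel filtered-x LMS sketch for active noise control.

    x: reference signal, d: disturbance at the error microphone,
    s_hat: assumed estimate of the secondary-path impulse response.
    Returns the residual error measured at the error microphone.
    """
    w = np.zeros(L)                         # adaptive control filter
    x_buf = np.zeros(L)                     # reference history for the controller
    fx_buf = np.zeros(L)                    # filtered-reference history
    xs_buf = np.zeros(len(s_hat))           # reference history for secondary-path filtering
    y_buf = np.zeros(len(s_hat))            # control-output history through the secondary path
    e = np.zeros(len(x))
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
        y = w @ x_buf                       # anti-noise sample
        y_buf = np.roll(y_buf, 1); y_buf[0] = y
        e[n] = d[n] + s_hat @ y_buf         # residual at the error mic
        xs_buf = np.roll(xs_buf, 1); xs_buf[0] = x[n]
        fx_buf = np.roll(fx_buf, 1); fx_buf[0] = s_hat @ xs_buf
        w -= mu * e[n] * fx_buf             # FxLMS coefficient update
    return e
```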
In this letter, we propose a novel adaptive reduced-rank strategy based on joint iterative optimization (JIO) of filters according to the minimization of the bit error rate (BER) cost function. The proposed optimization technique adjusts the weights of a subspace projection matrix and a reduced-rank filter jointly. We develop stochastic gradient (SG) algorithms for their adaptive implementation and introduce a novel automatic rank selection method based on the BER criterion. Simulation results for direct-sequence code-division-multiple-access (DS-CDMA) systems show that the proposed adaptive algorithms significantly outperform the existing schemes.
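For orientation, the sketch below shows the generic joint-iterative-optimization structure: alternating stochastic-gradient updates of an M x D projection matrix and a D x 1 reduced-rank filter. For simplicity it minimizes the mean-square error rather than the BER cost adopted in the letter, and the automatic rank selection is omitted; all step sizes are assumptions.

```python
import numpy as np

def jio_sg(r, d, M, D=4, mu_s=1e-3, mu_w=1e-3, seed=None):
    """Joint iterative optimization of a projection matrix S (M x D) and a
    reduced-rank filter w (D x 1) via stochastic-gradient updates.

    MSE-based sketch only (not the letter's BER cost).
    r: (N, M) received vectors, d: (N,) desired symbols.
    """
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((M, D)).astype(complex) / np.sqrt(M)
    w = np.zeros(D, dtype=complex)
    w[0] = 1.0
    for n in range(len(d)):
        x = r[n]                          # full-rank observation (length M)
        x_bar = S.conj().T @ x            # reduced-rank observation (length D)
        e = d[n] - w.conj() @ x_bar       # a-priori error
        # Alternating stochastic-gradient updates of S and w.
        S = S + mu_s * np.conj(e) * np.outer(x, w.conj())
        w = w + mu_w * np.conj(e) * x_bar
    return S, w
```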
With the increasing demand for higher data rates, modern communication systems have grown more complex. Equalization has become more and more important, as it is effective in mitigating the multipath fading that often occurs in high-data-rate communication systems. However, the implementation complexity of adaptive equalizers is usually too high for mobile communication applications. In this paper, a novel adaptive equalization algorithm and its low-complexity architecture are proposed. The algorithm employs a new grouped signed power-of-two (GSPT) number representation. The GSPT algorithm and several enhanced versions are simulated as adaptive equalizers in a phase-shift keying communication receiver for several practical channels, and the GSPT-based equalizers perform as well as the least mean square (LMS) equalizer. Moreover, for comparison, two GSPT-based equalizers and two other equalizers are implemented in field-programmable gate arrays. The GSPT-based equalizers require only about 25%-30% of the hardware resources needed by the LMS equalizer, and they are more than twice as fast as the LMS equalizer.
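The grouped SPT representation itself is not specified in the abstract, so the sketch below only illustrates the underlying signed power-of-two idea: quantizing the LMS update terms of an adaptive equalizer to single SPT values, so that each coefficient update reduces to shifts and adds in hardware. The grouping scheme, word lengths, and step size are assumptions.

```python
import numpy as np

def quantize_spt(value, min_exp=-12, max_exp=0):
    """Round a value to the nearest signed power of two, or to zero if it is
    too small to represent (plain SPT, not the paper's grouped SPT)."""
    if value == 0.0:
        return 0.0
    exp = int(np.round(np.log2(abs(value))))
    if exp < min_exp:
        return 0.0
    return float(np.sign(value)) * 2.0 ** min(exp, max_exp)

def spt_lms_equalizer(x, d, L=11, mu=2 ** -6):
    """LMS equalizer whose per-tap update terms are SPT-quantized (sketch)."""
    w = np.zeros(L)
    buf = np.zeros(L)
    y = np.zeros(len(x))
    for n in range(len(x)):
        buf = np.roll(buf, 1); buf[0] = x[n]
        y[n] = w @ buf
        e = d[n] - y[n]
        # Quantize each gradient component before the coefficient update.
        w += np.array([quantize_spt(mu * e * xi) for xi in buf])
    return w, y
```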
ISBN (Digital): 9798350368369
ISBN (Print): 9798350368376
Over recent years, many advancements in networks have taken place, now including the advent of 6G. With these growing advancements, challenges in managing networks also emerge. Network Digital Twins (NDTs) are one of the potential technologies for solving many of these challenges because of their capability to virtually replicate physical network elements. One of the key challenges is predicting network traffic, especially with the exponentially growing number of devices in the network. In this paper, we study and show how varying the “look-back period” and “forecast horizon” significantly affects traffic predictions. The look-back period, or window size, is how much of the historical data a prediction algorithm uses to make predictions; the forecast horizon is how far into the future we can predict. Both strongly influence how accurately traffic is predicted in networks and, in turn, how well Digital Twins (DTs) for networks reflect the traffic of the physical network.
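The two parameters under study map directly onto the windowing step of any traffic-prediction pipeline; the hypothetical helper below shows how a series is split into inputs of `look_back` samples and targets of `horizon` samples (the forecasting model itself is not shown, and the synthetic series is only a placeholder).

```python
import numpy as np

def make_windows(series, look_back, horizon):
    """Turn a 1-D traffic series into (input, target) pairs for forecasting.

    Each input holds `look_back` past samples; each target holds the next
    `horizon` samples to be predicted.
    """
    X, Y = [], []
    for t in range(len(series) - look_back - horizon + 1):
        X.append(series[t:t + look_back])
        Y.append(series[t + look_back:t + look_back + horizon])
    return np.array(X), np.array(Y)

# Example: 24-step look-back, 6-step-ahead forecast on synthetic traffic.
traffic = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * np.random.randn(1000)
X, Y = make_windows(traffic, look_back=24, horizon=6)
print(X.shape, Y.shape)   # (971, 24) (971, 6)
```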
We study the problem of Multi-Armed Bandits (MAB) with reward distributions belonging to a One-Parameter Exponential Distribution (OPED) family. In the literature, several criteria have been proposed to evaluate the p...
Given a graph G and a seed node vs, the objective of local graph clustering (LGC) is to identify a subgraph Cs ∈ G (a.k.a. local cluster) surrounding vs in time roughly linear with the size of Cs. This approach yield...
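For context, a common way to realize this locality is the approximate personalized PageRank push procedure (Andersen-Chung-Lang) followed by a conductance sweep; the sketch below shows the push step only and is not the method proposed in this paper.

```python
from collections import defaultdict

def ppr_push(graph, seed, alpha=0.15, eps=1e-4):
    """Approximate personalized PageRank via the classic push procedure.

    `graph` maps each node to a list of its neighbours. Only nodes near the
    seed are ever touched, which gives the running time roughly linear in the
    cluster size; sweeping nodes sorted by p[u]/deg(u) for the prefix of
    lowest conductance would then extract the local cluster (not shown).
    """
    p = defaultdict(float)
    r = defaultdict(float, {seed: 1.0})
    queue = [seed]
    while queue:
        u = queue.pop()
        deg_u = len(graph[u])
        if r[u] < eps * deg_u:
            continue                      # below the push threshold
        p[u] += alpha * r[u]
        share = (1 - alpha) * r[u] / (2 * deg_u)
        r[u] = (1 - alpha) * r[u] / 2
        queue.append(u)                   # re-check u after the push
        for v in graph[u]:
            r[v] += share
            queue.append(v)
    return p

# Usage on a toy graph: two triangles joined by one edge.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(ppr_push(g, seed=0))
```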
ISBN (Digital): 9798331508685
ISBN (Print): 9798331519476
The emergence of big data has completely changed the way data analysis is done, creating a pressing need for effective clustering algorithms that can handle large datasets from different fields. This research tackles some of the key difficulties of big data clustering: efficiency, scalability, and finding meaningful patterns in enormous amounts of data. We thoroughly examine and evaluate a range of clustering algorithms, from conventional ones such as hierarchical and K-Means clustering to more recent ones such as density-based spatial clustering (DBSCAN) and model-based clustering. To further improve the performance and scalability of clustering algorithms, we investigate methods that make use of distributed computing frameworks such as Apache Spark and MapReduce. By applying these methods to large-scale datasets, we offer a thorough assessment of their efficacy in terms of clustering quality, computational efficiency, and resource use. While conventional clustering algorithms can fail badly when faced with extremely big datasets, our research shows that newer methods built on optimization techniques and distributed processing greatly outperform their predecessors. The effect of feature selection and dimensionality reduction techniques on improving clustering results is also shown. By shedding light on how to choose and implement clustering algorithms for effective data mining and pattern identification, this study contributes to ongoing efforts in big data analytics. According to the study, progress in areas such as customer segmentation, anomaly detection, and social network analysis can be achieved with the help of adaptive algorithms that handle large datasets while keeping clustering results accurate and dependable.
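As a small illustration of the scalability trade-offs discussed above, the snippet below clusters a synthetic stand-in dataset with mini-batch K-Means and runs DBSCAN on a subsample; the Spark/MapReduce variants follow the same pattern at cluster scale but are not shown, and all dataset sizes and parameters are assumptions.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans, DBSCAN
from sklearn.metrics import silhouette_score

# Synthetic stand-in for a large dataset (the study's real datasets are not used here).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50_000, 8)) for c in (0, 5, 10)])

# Mini-batch K-Means scales to large data by updating centroids on small
# random batches instead of full passes over the dataset.
km = MiniBatchKMeans(n_clusters=3, batch_size=4096, random_state=0).fit(X)

# Evaluate clustering quality on a subsample to keep the metric tractable.
idx = rng.choice(len(X), 10_000, replace=False)
print("silhouette:", silhouette_score(X[idx], km.labels_[idx]))

# DBSCAN on the same subsample illustrates the density-based alternative
# (its worst-case cost makes the full dataset impractical in this sketch).
db = DBSCAN(eps=0.8, min_samples=10).fit(X[idx])
print("DBSCAN clusters:", len(set(db.labels_)) - (1 if -1 in db.labels_ else 0))
```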