Details
ISBN: (Print) 9780738131269
The Least Mean Square (LMS) algorithm has an inherent trade-off between convergence speed and steady-state error performance. One algorithm proposed to tackle this issue is the noise-constrained LMS (NCLMS) algorithm, which uses the noise variance to iteratively vary the step size. This work uses the q-derivative to propose an improved noise-constrained LMS algorithm. Simulation results show that the proposed algorithm outperforms the conventional algorithm at the cost of only a minimal increase in complexity. A steady-state analysis of the proposed algorithm is also presented.
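To illustrate the idea of a noise-variance-driven variable step size, the following is a minimal sketch of the conventional noise-constrained LMS recursion as it is commonly stated in the adaptive-filtering literature, applied to a toy system-identification problem. The plant, signal model, parameter values, and the step-size clip are illustrative assumptions, not taken from this paper, and the paper's q-derivative modification is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system identification: unknown FIR plant of length M (illustrative
# setup, not from the paper itself).
M = 8
w_true = rng.standard_normal(M)
N = 5000
x = rng.standard_normal(N)
noise_var = 0.01                       # assumed known noise variance sigma_v^2
v = np.sqrt(noise_var) * rng.standard_normal(N)

# Conventional noise-constrained LMS update, as commonly stated:
#   w(n+1)   = w(n) + mu*(1 + gamma*lam(n)) * e(n) * x(n)
#   lam(n+1) = (1 - beta)*lam(n) + (beta/2)*(e(n)^2 / sigma_v^2 - 1)
# so the step grows while e^2 exceeds the noise floor and shrinks back to
# mu as e^2 approaches sigma_v^2. Parameter values are illustrative.
mu, gamma, beta = 0.01, 0.1, 0.01
step_max = 0.05                        # safeguard clip for this toy setup
w = np.zeros(M)
lam = 0.0
for n in range(M, N):
    x_n = x[n - M + 1:n + 1][::-1]     # regressor (most recent sample first)
    d = w_true @ x_n + v[n]            # desired signal from unknown plant
    e = d - w @ x_n                    # a priori estimation error
    step = min(mu * (1.0 + gamma * max(lam, 0.0)), step_max)
    w = w + step * e * x_n
    lam = (1 - beta) * lam + 0.5 * beta * (e**2 / noise_var - 1.0)

print(np.linalg.norm(w - w_true))      # final misalignment (small)
```

Early in adaptation the error power is well above the noise floor, so the multiplier grows and convergence accelerates; near steady state the multiplier decays and the effective step returns to the small base value, which is the speed-versus-misadjustment compromise the abstract refers to.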