- The least mean squares (LMS) algorithms adjust the filter coefficients to minimize the cost function. Compared to recursive least squares (RLS) algorithms, the LMS algorithms do not involve any matrix operations.
- Therefore, the LMS algorithms require fewer computational resources and memory than the RLS algorithms.
- The LMS algorithms are also less complicated to implement than the RLS algorithms.
- However, the eigenvalue spread of the input correlation matrix (the correlation matrix of the input signal) might affect the convergence speed of the resulting adaptive filter.
The standard LMS algorithm performs the following operations to update the coefficients of an adaptive filter:
- Calculates the output signal $y(n)$ from the adaptive filter.
- Calculates the error signal $e(n)$ by using the following equation: $e(n) = d(n) - y(n)$.
- Updates the filter coefficients by using the following equation:
$\bar w(n+1) = \bar w(n) + \mu \, e(n) \, \bar u(n)$
where $\mu$ is the step size of the adaptive filter, $\bar w(n)$ is the filter coefficients vector, and $\bar u(n)$ is the filter input vector.
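The three update steps above can be sketched in a few lines of NumPy. The function name, tap count, and step size below are illustrative choices, not part of the original text; the loop follows the standard LMS recursion exactly as written in the equations.

```python
import numpy as np

def lms_filter(d, u, num_taps=4, mu=0.05):
    """Standard LMS adaptive filter.

    d: desired signal d(n), u: input signal (1-D arrays of equal length).
    Returns the output y(n), error e(n), and final coefficient vector w.
    """
    n_samples = len(u)
    w = np.zeros(num_taps)      # filter coefficients vector w(n)
    y = np.zeros(n_samples)     # output signal y(n)
    e = np.zeros(n_samples)     # error signal e(n)
    for n in range(num_taps - 1, n_samples):
        # Input vector u(n): the most recent num_taps samples, newest first
        u_vec = u[n - num_taps + 1 : n + 1][::-1]
        y[n] = w @ u_vec            # 1. calculate the output y(n)
        e[n] = d[n] - y[n]          # 2. calculate the error e(n) = d(n) - y(n)
        w = w + mu * e[n] * u_vec   # 3. update the coefficients
    return y, e, w
```

A typical use is system identification: feed the same input to an unknown FIR system and to the adaptive filter, use the unknown system's output as $d(n)$, and the coefficients converge toward the unknown impulse response. Note that only vector operations appear in the update, consistent with the point above that LMS avoids the matrix operations of RLS.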