This article is part of the series Emerging Machine Learning Techniques in Signal Processing.

Open Access Research Article

Kernel Affine Projection Algorithms

Weifeng Liu* and José C. Príncipe

Author Affiliations

Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA


EURASIP Journal on Advances in Signal Processing 2008, 2008:784292  doi:10.1155/2008/784292


The electronic version of this article is the complete one and can be found online at: http://asp.eurasipjournals.com/content/2008/1/784292


Received: 27 September 2007
Revisions received: 23 January 2008
Accepted: 21 February 2008
Published: 12 March 2008

© 2008 The Author(s).

This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The combination of the famed kernel trick and affine projection algorithms (APAs) yields powerful nonlinear extensions, collectively named KAPA here. This paper is a follow-up study of the recently introduced kernel least-mean-square algorithm (KLMS). KAPA inherits the simplicity and online nature of KLMS while reducing its gradient noise, boosting performance. More interestingly, it provides a unifying model for several neural network techniques, including kernel least-mean-square algorithms, kernel adaline, sliding-window kernel recursive-least-squares (KRLS), and regularization networks. Therefore, many insights can be gained into the basic relations among them and the tradeoff between computational complexity and performance. Several simulations illustrate its wide applicability.
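To make the idea concrete, the following is a minimal sketch of the simplest stochastic-gradient KAPA variant, as it might be implemented from the abstract's description: each input becomes a kernel center, and at every step the coefficients of the K most recent samples are corrected by their current prediction errors (K = 1 reduces to KLMS). The function names, the Gaussian kernel choice, and the parameters `eta`, `K`, and `sigma` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    # Gaussian (RBF) kernel between two input vectors; an assumed choice.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def kapa(X, d, eta=0.2, K=5, sigma=0.5):
    """Illustrative KAPA-style online learner.

    X : (N, dim) array of inputs, d : (N,) desired outputs.
    Returns the sequence of online predictions made before each update.
    """
    centers, alpha = [], []          # kernel expansion: f(z) = sum_i alpha[i] * k(centers[i], z)
    y = np.zeros(len(d))
    for n in range(len(d)):
        f = lambda z: sum(a * gaussian_kernel(c, z, sigma)
                          for a, c in zip(alpha, centers))
        y[n] = f(X[n])               # prediction with the current expansion
        # Affine-projection step: errors on the K most recent input/desired pairs.
        lo = max(0, n - K + 1)
        errs = [d[i] - f(X[i]) for i in range(lo, n + 1)]
        # Grow the network: newest input becomes a center, coefficient eta * e_n.
        centers.append(X[n])
        alpha.append(eta * errs[-1])
        # Correct the coefficients of the previous (up to K-1) recent centers.
        for j, i in enumerate(range(lo, n)):
            alpha[i] += eta * errs[j]
    return y
```

Note that this naive sketch stores every input as a center, so cost grows with time; the paper's tradeoff between computational complexity and performance concerns exactly such choices.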
