Abstract: Regression analysis has been a major theoretical pillar of supervised machine learning, as it applies to a wide range of identification and classification problems. Two major approaches have been adopted in pursuit of robust regressors. The first category comprises a variety of regularization techniques whose principle lies in incorporating both an error term and a penalty term into the cost function; its best-known representative is the ridge regressor. Other prominent examples include the RBF approximation networks of Poggio and Girosi and the Least-Squares SVM introduced by Suykens and Vandewalle.
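As a minimal sketch of the regularization principle mentioned above, the snippet below fits a ridge regressor by minimizing a cost that combines a squared-error term with an L2 penalty term, J(w) = ||Xw - y||² + λ||w||², via its standard closed-form solution. The toy data and the penalty weight λ = 1.0 are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative toy data: 20 samples, 3 features (hypothetical, for demonstration)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=20)

lam = 1.0  # regularization strength (weight of the penalty term)

# Ridge cost: J(w) = ||Xw - y||^2 + lam * ||w||^2
# Closed-form minimizer: w = (X^T X + lam * I)^(-1) X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print(w_ridge)
```

For λ > 0 the penalty shrinks the coefficients toward zero relative to the ordinary least-squares solution, which is what makes the estimator robust to ill-conditioned or noisy design matrices.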