Kernel Methods for Regression

University essay from Linnéuniversitetet / Department of Mathematics (MA)

Abstract: Kernel methods are a well-studied approach to regression problems: they implicitly map the input variables into possibly infinite-dimensional feature spaces, which is useful when standard linear regression fails to capture non-linear relationships in the data. The choice between standard linear regression and kernel regression can thus be seen as a tradeoff between constraints on the number of features and on the number of training samples. Our results show that the Gaussian kernel consistently achieves the lowest mean squared error at the largest considered training size, while standard ridge regression exhibits a higher mean squared error but a lower fit time. We also prove algebraically that the solutions of standard ridge regression and kernel ridge regression are mathematically equivalent.
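The stated equivalence between standard ridge regression and kernel ridge regression (with a linear kernel) can also be checked numerically. The sketch below is illustrative only — the sizes, data, and regularization strength are assumptions, not taken from the essay — and relies on the identity (XᵀX + λI)⁻¹Xᵀ = Xᵀ(XXᵀ + λI)⁻¹:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5  # n training samples, d features (illustrative sizes)
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
lam = 1.0     # ridge regularization strength (assumed value)

# Primal ridge solution: w = (X^T X + lam * I_d)^{-1} X^T y
w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Dual solution (kernel ridge with linear kernel K = X X^T):
# alpha = (K + lam * I_n)^{-1} y
K = X @ X.T
alpha = np.linalg.solve(K + lam * np.eye(n), y)

# Predictions on new points coincide: X_new @ w == k(X_new, X) @ alpha
X_new = rng.standard_normal((10, d))
pred_primal = X_new @ w
pred_dual = (X_new @ X.T) @ alpha

print(np.allclose(pred_primal, pred_dual))  # True
```

With a non-linear kernel (e.g. Gaussian) only the dual form remains tractable, since the corresponding feature space is infinite-dimensional.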
