4th and 5th Lecture: Kernels, PCA, and kernel PCA

PCA: Principal Component Analysis is a linear technique for reducing the dimensionality of data. The main idea is to find the directions in the (high-dimensional) space along which the data vary the most and to ignore all other directions. We discuss two ways to derive PCA: as the linear projection that minimizes the squared reconstruction error, and as the one that maximizes the variance of the projected data.
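
As an illustration, here is a minimal PCA sketch in Python using only NumPy (the function name pca and the toy data are illustrative assumptions, not part of the lecture): center the data, diagonalize the covariance matrix, and project onto the leading eigenvectors.

import numpy as np

def pca(X, n_components):
    # Center the data so the principal directions pass through the mean.
    X_centered = X - X.mean(axis=0)
    # Covariance matrix of the features (samples are the rows of X).
    cov = np.cov(X_centered, rowvar=False)
    # eigh: eigen-decomposition for symmetric matrices, eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Sort directions by decreasing eigenvalue, i.e. by decreasing explained variance.
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order[:n_components]]
    # Project the centered data onto the top principal directions.
    return X_centered @ components

# Example: reduce 5-dimensional toy data to 2 dimensions.
X = np.random.randn(100, 5)
Z = pca(X, n_components=2)
print(Z.shape)  # (100, 2)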

Literature on PCA: Classical PCA is covered in many statistics books:

Kernels: very convenient similarity functions which automatically come with an embedding of the data into a high-dimensional space.
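
As a concrete example, the following sketch implements the Gaussian (RBF) kernel, one common choice; the function name rbf_kernel and the bandwidth parameter sigma are illustrative assumptions, not taken from the lecture.

import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise squared Euclidean distances between rows of X and rows of Y.
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2 * X @ Y.T
    )
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)); this equals an inner product
    # in an (infinite-dimensional) feature space, which is what makes the
    # kernel trick possible.
    return np.exp(-sq_dists / (2 * sigma**2))

# The kernel (Gram) matrix of a data set with itself is symmetric
# and positive semi-definite.
X = np.random.randn(10, 3)
K = rbf_kernel(X, X)
print(K.shape, np.allclose(K, K.T))  # (10, 10) True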

Kernel PCA: combines the kernel trick with PCA, i.e. performs linear PCA in the feature space induced by the kernel, which corresponds to a nonlinear dimensionality reduction in the original space.
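
A minimal kernel PCA sketch, assuming a Gaussian kernel as above (NumPy only; function and parameter names are illustrative): build the kernel matrix, center it, diagonalize it, and use the scaled eigenvectors as the low-dimensional coordinates.

import numpy as np

def kernel_pca(X, n_components, sigma=1.0):
    n = X.shape[0]
    # Gaussian kernel matrix of the data with itself.
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    # Center the kernel matrix, i.e. center the implicit feature vectors.
    one_n = np.ones((n, n)) / n
    K_c = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigen-decomposition of the centered Gram matrix.
    eigvals, eigvecs = np.linalg.eigh(K_c)
    order = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, order], eigvals[order]
    # Projections of the training points onto the feature-space principal
    # directions; the 1/sqrt(lambda) factor normalizes those directions.
    return K_c @ (alphas / np.sqrt(lambdas))

# Example: nonlinear 2-dimensional embedding of 5-dimensional toy data.
X = np.random.randn(100, 5)
Z = kernel_pca(X, n_components=2, sigma=2.0)
print(Z.shape)  # (100, 2)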

Literature on kernel PCA:


Demos: