### Abstract

Motivated by the statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝ^{d} that lie within a given distance of a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d-k+1), and then discuss the distribution of the eigenvalues of a data covariance matrix using our VC dimension bounds together with Vapnik's statistical learning theory. In the course of the upper bound proof, we give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
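To make the concept class concrete, here is a minimal sketch (not from the paper) of membership in one of its sets: a point of ℝ^{d} belongs to the set iff its Euclidean distance to a given k-dimensional affine subspace is at most some radius r. The function names, the example dimensions d=5, k=2, and the radius are illustrative assumptions; only the distance formula (orthogonal projection onto the subspace's direction space) and the (k+1)(d-k+1) expression come from standard linear algebra and the abstract's bound.

```python
import numpy as np

def dist_to_affine_subspace(x, p, B):
    """Distance from point x to the affine subspace {p + B @ t : t in R^k}.

    p : a point on the subspace, shape (d,)
    B : shape (d, k), columns span the subspace's direction space
        (assumed linearly independent).
    """
    v = x - p
    # Least-squares projection of v onto the column space of B;
    # the residual norm is the distance to the subspace.
    t, *_ = np.linalg.lstsq(B, v, rcond=None)
    return np.linalg.norm(v - B @ t)

def in_tube(x, p, B, r):
    """Membership in one set of the concept class studied in the paper:
    points within distance r of the affine subspace (p, B)."""
    return dist_to_affine_subspace(x, p, B) <= r

# Illustrative dimensions (hypothetical choice):
d, k = 5, 2
# The paper shows the VC dimension of this class is within a
# constant factor of (k+1)(d-k+1):
vc_order = (k + 1) * (d - k + 1)
```

For example, with d=3, k=2, p=0 and B spanning the xy-plane, the point (0, 0, 1) is at distance 1 from the subspace, so it lies in the tube exactly when r ≥ 1.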

| Original language | English |
|---|---|
| Pages (from-to) | 589-598 |
| Number of pages | 10 |
| Journal | Discrete and Computational Geometry |
| Volume | 44 |
| Issue number | 3 |
| DOIs | https://doi.org/10.1007/s00454-009-9236-5 |
| Publication status | Published - Jan 1 2010 |
| Externally published | Yes |

### All Science Journal Classification (ASJC) codes

- Theoretical Computer Science
- Geometry and Topology
- Discrete Mathematics and Combinatorics
- Computational Theory and Mathematics

## Cite this

Akama, Y., Irie, K., Kawamura, A., & Uwano, Y. (2010). VC Dimensions of Principal Component Analysis. *Discrete and Computational Geometry*, *44*(3), 589-598. https://doi.org/10.1007/s00454-009-9236-5