VC Dimensions of Principal Component Analysis

Yohji Akama, Kei Irie, Akitoshi Kawamura, Yasutaka Uwano

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

Motivated by a statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝ^d that are within a certain distance from a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d-k+1), and then discuss the distribution of eigenvalues of a data covariance matrix using our bounds on the VC dimension together with Vapnik's statistical learning theory. In the course of the upper bound proof, we give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
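As a reading aid (not taken from the paper itself), the concept class described in the abstract and the stated bound can be written out as follows; the symbols 𝒞_{k,d}, A, and r are notation introduced here, under the assumption that "within a certain distance" means a tube of some radius r ≥ 0 around an affine subspace:

\[
\mathcal{C}_{k,d} \;=\; \Bigl\{ \{\, x \in \mathbb{R}^{d} : \operatorname{dist}(x, A) \le r \,\} \;:\; A \subseteq \mathbb{R}^{d} \text{ a } k\text{-dimensional affine subspace},\ r \ge 0 \Bigr\},
\]
\[
\operatorname{VCdim}\bigl(\mathcal{C}_{k,d}\bigr) \;=\; \Theta\bigl((k+1)(d-k+1)\bigr).
\]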

Original language: English
Pages (from-to): 589-598
Number of pages: 10
Journal: Discrete and Computational Geometry
Volume: 44
Issue number: 3
DOI
Publication status: Published - 1 1 2010
Externally published: Yes


All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics
  • Computational Theory and Mathematics
