Improved KPCA algorithm based on numerical approximation
Authors: ZHAO Yingnan, WANG Shuiping, ZHENG Yu
    Abstract:

    Although kernel methods have been widely used for pattern recognition, they suffer from the problem that feature-extraction efficiency is inversely proportional to the size of the training sample set. To address this, we propose a novel improvement to Kernel Principal Component Analysis (KPCA) based on numerical approximation. The method rests on the assumption that the discriminant vector in the feature space can be approximately expressed as a linear combination of a set of constructed virtual sample vectors. We determine these virtual sample vectors one by one with a simple and computationally efficient iterative algorithm. When the vectors are sufficiently dissimilar to each other, the set can well replace the whole training sample set in expressing the discriminant vector in the feature space. Remarkably, the determined virtual sample vectors yield a substantial improvement to KPCA, allowing an efficient feature-extraction procedure. Moreover, the virtual sample vectors need only be initialized to random values. Experiments on two benchmark datasets show that our method achieves efficient feature extraction as well as good and stable classification accuracy.
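    The paper's iterative construction of virtual sample vectors is not reproduced here. As an illustrative sketch only, the code below shows standard KPCA on the full training set, then approximates the resulting projections through a small set of randomly initialized virtual vectors, in the spirit of the abstract. The least-squares fit of the expansion coefficients is a substitution for exposition, not the authors' algorithm, and all names (`rbf_kernel`, `kpca_fit`, the gamma value, the virtual-set size) are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise RBF (Gaussian) kernel between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_fit(X, n_components=2, gamma=0.5):
    # Standard KPCA: center the kernel matrix in feature space and
    # eigendecompose it; cost grows with the training-set size n.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(vals[idx])   # scale eigenvectors by 1/sqrt(eigenvalue)
    return alphas, Kc

# Full-set KPCA projections of the training data.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
alphas, Kc = kpca_fit(X, n_components=2)
Y = Kc @ alphas                      # (60, 2) projections via all 60 samples

# Reduced-set approximation: express the projections through m << n
# virtual vectors Z, initialized randomly as the abstract suggests.
Z = rng.normal(size=(10, 2))         # 10 virtual sample vectors
Kz = rbf_kernel(X, Z)                # only m kernel evaluations per sample
beta, *_ = np.linalg.lstsq(Kz, Y, rcond=None)
Y_hat = Kz @ beta
rel_err = np.linalg.norm(Y - Y_hat) / np.linalg.norm(Y)
```

New samples would then be projected with only `m` kernel evaluations (`rbf_kernel(x_new, Z) @ beta`) instead of `n`, which is the efficiency gain the abstract describes.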

Citation

ZHAO Yingnan, WANG Shuiping, ZHENG Yu. Improved KPCA algorithm based on numerical approximation[J]. Journal of Nanjing University of Information Science & Technology, 2012, 4(4): 362-365

History
  • Received: July 20, 2011
