Linear Discriminant Analysis

Fisher's linear discriminant analysis (FLDA) is one of the well-known methods for extracting the best features for multi-class discrimination. Recently, kernel discriminant analysis (KDA) has been successfully applied in many applications. KDA is a nonlinear extension of FLDA that constructs a nonlinear discriminant mapping by using kernel functions. Otsu derived the optimum nonlinear discriminant analysis (ONDA) by assuming the underlying class probabilities, following Bayesian decision theory. In this paper, we propose to construct an approximation of the optimum nonlinear discriminant mapping based on Otsu's theory of nonlinear discriminant analysis. We use the k-nearest-neighbor (k-NN) method to estimate the Bayesian posterior probabilities. In experiments, we evaluate the classification performance of the proposed nonlinear discriminant analysis for several modified k-NN methods.
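The core ingredient described above, estimating Bayesian posterior probabilities with k-NN, can be sketched as follows: the posterior P(class | x) is approximated by the fraction of the k nearest training samples that belong to each class. This is a minimal NumPy illustration of the standard k-NN posterior estimate, not the authors' implementation; the function name, the toy data, and the Euclidean distance choice are assumptions for the example, and the "modified k-NN" variants evaluated in the paper are not reproduced here.

```python
import numpy as np

def knn_posterior(X_train, y_train, x, k=3):
    """Estimate P(class | x) as the fraction of the k nearest
    training points (Euclidean distance) belonging to each class."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    classes = np.unique(y_train)
    counts = np.array([(y_train[nearest] == c).sum() for c in classes])
    return counts / k                             # empirical posterior estimate

# Toy 2-class data: class 0 clustered near the origin, class 1 near (5, 5).
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 4.9], [4.9, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1])

# A query point near the origin: all 3 nearest neighbors are class 0.
p = knn_posterior(X, y, np.array([0.05, 0.05]), k=3)
```

In the proposed method, such posterior estimates stand in for the unknown Bayesian posteriors that ONDA assumes, so that the optimum nonlinear discriminant mapping can be approximated from training data alone.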