An Algorithm to perform Covariance-Adjusted Support Vector Classification in Non-Euclidean Spaces

arXiv:2504.04371v3 Announce Type: replace-cross
Abstract: Traditional Support Vector Machine (SVM) classification is carried out by finding the max-margin classifier for the training data that divides the margin space into two equal sub-spaces. This study demonstrates the limitations of performing Support Vector Classification in non-Euclidean spaces by establishing that the underlying principle of max-margin classification and the Karush-Kuhn-Tucker (KKT) boundary conditions are optimal only in Euclidean vector spaces. The study establishes a methodology for performing Support Vector Classification in non-Euclidean spaces by incorporating data covariance into the optimization problem via the Cholesky decomposition of the respective class covariance structures. It also demonstrates that in non-Euclidean spaces KKT modelling is sub-optimal, as the maximum margin is a function of the intra-class data covariances, and the resulting classifier separates the margin space in the ratio of the respective class population covariance matrices. The study proposes an algorithm to iteratively estimate the population covariance-adjusted SVM classifier in non-Euclidean space from the sample covariance matrices of the training data. The effectiveness of this approach is demonstrated by applying the classifier to multiple datasets and comparing its performance with traditional SVM kernels and whitening algorithms. The Cholesky-SVM model shows marked improvement in accuracy, precision, F1 score, and ROC performance compared to linear and other kernel SVMs.
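The abstract does not spell out the paper's iterative algorithm, but the core idea it describes, adjusting for class covariance through a Cholesky factor before max-margin classification, can be sketched as follows. Everything here is an illustrative assumption rather than the authors' exact method: the synthetic data, the pooled-covariance estimate (the paper iterates per-class covariances), and the simple subgradient solver used in place of a full KKT-based SVM solver.

```python
import numpy as np
from numpy.linalg import cholesky, solve

rng = np.random.default_rng(0)

# Two Gaussian classes sharing a strongly non-spherical covariance,
# i.e. data where the Euclidean max-margin assumption is poor.
cov = np.array([[4.0, 1.9], [1.9, 1.0]])
A = cholesky(cov)
X0 = rng.standard_normal((100, 2)) @ A.T + np.array([-1.5, 0.0])
X1 = rng.standard_normal((100, 2)) @ A.T + np.array([+1.5, 0.0])
X = np.vstack([X0, X1])
y = np.hstack([-np.ones(100), np.ones(100)])

# Pooled sample covariance of the centred classes (one possible estimate;
# the paper's algorithm instead iterates per-class covariance estimates).
Xc = np.vstack([X0 - X0.mean(0), X1 - X1.mean(0)])
S = Xc.T @ Xc / (len(Xc) - 1)
L = cholesky(S)  # S = L L^T

# Covariance adjustment via the Cholesky factor: z = L^{-1} x whitens the
# data, so a standard Euclidean max-margin classifier applies afterwards.
Z = solve(L, X.T).T

def fit_linear_svm(Z, y, lam=0.01, epochs=200, lr=0.1):
    """Plain full-batch subgradient descent on the soft-margin SVM
    objective (a stand-in for a proper SVM solver)."""
    w = np.zeros(Z.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(epochs):
        mask = y * (Z @ w + b) < 1.0        # margin violators
        gw = lam * w - (y[mask, None] * Z[mask]).sum(0) / n
        gb = -y[mask].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = fit_linear_svm(Z, y)
acc = np.mean(np.sign(Z @ w + b) == y)
```

By construction the whitened, centred data has exactly the identity sample covariance (`L^{-1} S L^{-T} = I`), which is what restores the Euclidean setting in which the max-margin principle is optimal.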
