

Class Summary  

Cache  SVM Kernel Cache, implementing a least recently used (LRU) policy. 
Kernel  Abstract interface to model an SVM Kernel. 
ONECLASSQ  Inner class representing a Kernel matrix for SVM Distribution Estimation. 
Solver  Generalized SMO+SVMlight algorithm. Solves:
min 0.5(\alpha^T Q \alpha) + b^T \alpha
subject to:
y^T \alpha = \delta
y_i = +1 or -1
0 <= alpha_i <= Cp for y_i = +1
0 <= alpha_i <= Cn for y_i = -1
Given: Q, b, y, Cp, Cn, and an initial feasible point \alpha. Here l is the size of the vectors and matrices, and eps is the stopping criterion. The solution is put in \alpha and the objective value in obj. 
Solver.SolutionInfo  
SolverNU  Solver for nu-SVM classification and regression, with the additional constraint e^T \alpha = constant. 
SVCQ  Inner class representing a Kernel matrix for Support Vector classification. 
SVM  Construct and solve various formulations of the support vector machine (SVM) problem. 
SVM.decisionFunction  Inner class modeling the data for the SVM decision function, used for classifying points with respect to the hyperplane. 
SVMCategorizer  Simple, easy-to-use, and efficient software for SVM classification and regression, based on the LIBSVM implementation of Chih-Chung Chang and Chih-Jen Lin. 
SVMModel  SVMModel encodes a classification model, describing both the model parameters and the support vectors. 
SVMNode  SVMNode is used to model dimensions in vectors. 
SVMParameter  Constants and parameters used in the SVM package. 
SVMProblem  Class to model an SVM Problem, containing both the training vectors and the class (value) associated with each vector. 
SVRQ  Inner class representing a Kernel matrix for Support Vector regression. 
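The Q-matrix classes above (SVCQ, SVRQ, ONECLASSQ) are built on pairwise kernel evaluations such as the Gaussian RBF kernel K(x, z) = exp(-gamma * ||x - z||^2). A minimal illustrative sketch of such a kernel evaluation, with hypothetical class and method names that are not part of this package's API:

```java
// Illustrative sketch only: RbfKernelDemo is NOT a class in this package.
// It shows the RBF kernel computation that a Kernel implementation
// typically performs for each matrix entry.
public class RbfKernelDemo {

    // Squared Euclidean distance ||x - z||^2 between two dense vectors.
    static double squaredDistance(double[] x, double[] z) {
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            double d = x[i] - z[i];
            sum += d * d;
        }
        return sum;
    }

    // RBF (Gaussian) kernel value K(x, z) = exp(-gamma * ||x - z||^2).
    static double rbf(double[] x, double[] z, double gamma) {
        return Math.exp(-gamma * squaredDistance(x, z));
    }

    public static void main(String[] args) {
        double[] a = {1.0, 0.0};
        double[] b = {0.0, 1.0};
        // ||a - b||^2 = 2, so K(a, b) = exp(-0.5 * 2) = exp(-1)
        System.out.println(rbf(a, b, 0.5));
    }
}
```

Note that K(x, x) = 1 for any vector x, which is why the diagonal of an RBF kernel matrix is constant.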
Implementation of support vector machine classification and regression that
can be used to categorize text files using N-grams as features.
A support vector machine is a supervised learning algorithm developed over the past
decade by Vapnik and others (Vapnik, Statistical Learning Theory, 1998).
It operates by mapping the given training set into a possibly high-dimensional feature
space and attempting to locate in that space a hyperplane that separates the positive
from the negative examples.
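Once trained, the separating hyperplane is applied through a decision function of the form f(x) = sum_i alpha_i y_i K(s_i, x) + b over the support vectors s_i, with the sign of f(x) giving the predicted class (this is the role of the SVM.decisionFunction class above). A minimal Java sketch, using a linear kernel and hypothetical names that are not this package's API:

```java
// Illustrative sketch only: DecisionFunctionDemo is NOT a class in this
// package. It evaluates f(x) = sum_i coef[i] * K(sv[i], x) + b, where
// coef[i] = alpha_i * y_i, and classifies x by the sign of f(x).
public class DecisionFunctionDemo {

    // Linear kernel K(x, z) = x . z, kept simple for this sketch.
    static double dot(double[] x, double[] z) {
        double s = 0.0;
        for (int i = 0; i < x.length; i++) s += x[i] * z[i];
        return s;
    }

    // Decision value for input x given support vectors sv, their
    // coefficients coef (alpha_i * y_i), and the bias term b.
    static double decide(double[][] sv, double[] coef, double b, double[] x) {
        double f = b;
        for (int i = 0; i < sv.length; i++) f += coef[i] * dot(sv[i], x);
        return f;
    }

    public static void main(String[] args) {
        // Two toy support vectors on opposite sides of the origin.
        double[][] sv = {{1.0, 1.0}, {-1.0, -1.0}};
        double[] coef = {0.5, -0.5}; // alpha_i * y_i for each support vector
        double b = 0.0;
        double f = decide(sv, coef, b, new double[]{2.0, 0.0});
        // f = 0.5 * 2.0 + (-0.5) * (-2.0) = 2.0 > 0, so predict +1
        System.out.println(f > 0 ? "+1" : "-1");
    }
}
```

Only the support vectors (points with nonzero alpha_i) contribute to f(x), which is why SVMModel needs to store just those vectors rather than the whole training set.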

