Search Resource List
svm_v0.55beta
- The latest support vector machine toolbox; very convenient to have. The package's to-do list still includes proper documentation, support vector regression, and automated model selection, and it cites V.N. Vapnik, "The Nature of Statistical Learning Theory", Springer-Verlag.
SVM_SteveGunn (basic theory of support vector machines)
- The basic theory of support vector machines grew out of the two-class classification problem. Commonly used kernel functions include the polynomial, radial basis (RBF), and sigmoid kernels; for the same dataset, different kernel choices generally yield similar training results.
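The three kernels named above can be written down in a few lines. A minimal sketch (function names and parameter defaults are my own illustrative choices, not taken from the Gunn toolbox):

```python
import numpy as np

def polynomial_kernel(x, y, degree=3, coef0=1.0):
    # k(x, y) = (x . y + c)^d
    return (np.dot(x, y) + coef0) ** degree

def rbf_kernel(x, y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def sigmoid_kernel(x, y, alpha=0.1, coef0=0.0):
    # k(x, y) = tanh(alpha * x . y + c)
    return np.tanh(alpha * np.dot(x, y) + coef0)

x = np.array([1.0, 2.0])
print(rbf_kernel(x, x))  # identical points give 1.0
```

Note that the RBF kernel always evaluates to 1 for identical inputs, which is why the diagonal of an RBF Gram matrix is all ones.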
kwiener
- The following code implements a kernel Wiener filter algorithm in MATLAB. The algorithm depends on an eigenvalue decomposition, so only training sets of up to a few thousand samples are practical so far.
winerfilter
- The following code implements a kernel Wiener filter algorithm in MATLAB. The algorithm depends on an eigenvalue decomposition, so only training sets of up to a few thousand samples are practical so far.
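The eigendecomposition bottleneck mentioned in both Wiener-filter entries is easy to see: the Gram matrix is n-by-n, so the decomposition costs O(n^3) time and O(n^2) memory, which is why only a few thousand training samples are practical. A rough sketch (the RBF kernel choice and the sample sizes are my own assumptions, not the packages' code):

```python
import numpy as np

def gram_matrix(X, gamma=0.5):
    # Pairwise squared distances via ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))         # 200 samples is cheap; 10,000 would need
K = gram_matrix(X)                    # ~800 MB for K alone (float64)
eigvals, eigvecs = np.linalg.eigh(K)  # the O(n^3) step that limits training-set size
print(K.shape, eigvals.max() > 0)
```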
Gauss-SVM
- An SVM classifier based on the Gaussian kernel, trained with a second-order geometric method.
stprtool--SVM
- The stprtool SVM kernel; builds svm_mex.dll for use by other SVM training code.
titanium
- SVC Support Vector Classification. Usage: [nsv alpha bias] = svc(X,Y,ker,C). Parameters: X - training inputs; Y - training targets; ker - kernel function; C - upper bound (non-separable case). Returns: nsv - number of support vectors; alpha - Lagrange multipliers; bias - bias term.
LibSVM-analisis
- A detailed analysis of libsvm covering its structure parameters, kernel functions, SVM prediction, and training.
A-GA-based-feature-selection-and-parameters-optim
- Support Vector Machines, one of the newer techniques for pattern classification, have been widely used in many application areas. The kernel parameter settings for an SVM during training impact the classification accuracy. Feature selection is another factor that impacts classification accuracy.
use_libsvzm
- To use a precomputed kernel, you must include the sample serial number as the first column of the training and testing data (assume your kernel matrix is K and the number of instances is n).
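Concretely, the libsvm README describes the precomputed-kernel line format as `<label> 0:i 1:K(i,1) ... n:K(i,n)`, with serial numbers starting at 1. A small sketch that formats a kernel matrix this way (the helper name and the toy matrix are my own):

```python
import numpy as np

def precomputed_lines(K, labels):
    """Format an n x n kernel matrix K as libsvm precomputed-kernel lines:
    <label> 0:<serial> 1:K(i,1) ... n:K(i,n), serials starting at 1."""
    lines = []
    for i, (row, y) in enumerate(zip(K, labels), start=1):
        feats = " ".join(f"{j}:{v:g}" for j, v in enumerate(row, start=1))
        lines.append(f"{y} 0:{i} {feats}")
    return lines

K = np.array([[1.0, 0.5],
              [0.5, 1.0]])
for line in precomputed_lines(K, [1, -1]):
    print(line)
# first line: "1 0:1 1:1 2:0.5"
```

For test data, the rows are kernel values between each test sample and the n training samples, with the test sample's own serial number in column 0.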
ProjectPenalty
- A paper on a lossless dimensionality-reduction method that uses a projection penalty and kernel functions for classifier training and selection.
code
- A MATLAB program: Gaussian-kernel support vector regression; a prediction model is obtained by training on the samples.
fastsvm1
- A fast implementation of multi-class SVMs written by machine-learning expert Dale Schuurmans. The kernel function can be modified, and the optimal parameters are found by K-fold cross-validation; classification performance is very good.
matlab-SVM
- This package includes the experiment-requirements document, the report, training and test data, and MATLAB source code. For the given problem, SVMs are used for classification: hard-margin SVMs with linear and nonlinear kernels and soft-margin SVMs with linear and nonlinear kernels are each used to classify and to evaluate classification accuracy. (A MATLAB M-file program computes the discriminant function for these SVMs using the provided training set.)
373901518KPCA
- Kernel principal component analysis for feature extraction from data; it effectively reduces the dimensionality of the training samples.
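The KPCA procedure this entry describes can be sketched directly: build the Gram matrix, double-center it, eigendecompose, and project onto the leading components. A minimal version (the RBF kernel, gamma, and data sizes are my own illustrative choices):

```python
import numpy as np

def kpca(X, n_components=2, gamma=0.1):
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Double-center the kernel matrix: Kc = K - 1n K - K 1n + 1n K 1n
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)          # eigh returns ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]   # largest eigenvalues first
    # Scale eigenvectors so the projected coordinates are properly normalized
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas                       # training samples in the reduced space

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))
Z = kpca(X, n_components=2)
print(Z.shape)  # (50, 2)
```

Because the kernel matrix is centered, the projected coordinates have zero mean, mirroring the mean-centering step of ordinary PCA.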
matrbf
- In MATLAB, call an SVM with an RBF kernel and classify test data using a model trained on the training data (binary classification only).
KPCA fault-detection program (code optimized)
- Industrial process fault detection based on kernel principal component analysis (KPCA). The code has been optimized for high runtime efficiency, is commented in detail, and comes with training and test data.
KRR
- Kernel ridge regression algorithm. Input: a data set (training and test sets stored separately). The parameters are tuned by 4-fold cross-validation, and the final output is the classification accuracy.
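The procedure this entry describes, kernel ridge regression with the regularization strength tuned by 4-fold cross-validation, can be sketched as follows. The entry reports classification accuracy; for simplicity this sketch scores regression error on synthetic data, and the RBF kernel, candidate grid, and data are my own assumptions:

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def krr_fit(K, y, lam):
    # Closed-form kernel ridge solution: alpha = (K + lam*I)^{-1} y
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def cv_select_lambda(X, y, lambdas, n_folds=4, gamma=1.0):
    folds = np.array_split(np.arange(len(y)), n_folds)
    best, best_err = None, np.inf
    for lam in lambdas:
        err = 0.0
        for k in range(n_folds):                      # hold out each fold in turn
            te = folds[k]
            tr = np.concatenate([folds[j] for j in range(n_folds) if j != k])
            alpha = krr_fit(rbf(X[tr], X[tr], gamma), y[tr], lam)
            pred = rbf(X[te], X[tr], gamma) @ alpha
            err += np.mean((pred - y[te]) ** 2)
        if err < best_err:
            best, best_err = lam, err
    return best

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=80)
lam = cv_select_lambda(X, y, [1e-3, 1e-2, 1e-1, 1.0])
print("selected lambda:", lam)
```

For classification, the same loop applies with labels in {-1, +1}, `sign(pred)` as the decision, and fold accuracy in place of squared error.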