Resource Description
A conventional SVM model supports only a single output: given multiple input features, it returns one value. This code implements an SVM that maps multiple input features to multiple output features, i.e. a multi-input multi-output SVM (MSVR).
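As a point of contrast, the naive way to get multiple outputs is to train one independent single-output regressor per output column and stack the predictions; MSVR instead solves all outputs jointly. A minimal Python sketch of that per-output baseline (the scalar model here is a stand-in least-squares fit, not an SVM, and all function names are my own):

```python
def fit_1d(X, y):
    # Stand-in scalar model: ordinary least squares on the first feature,
    # y ~ w * x[0] + b, solved in closed form. A real setup would use an SVR here.
    n = len(X)
    xs = [row[0] for row in X]
    mx = sum(xs) / n
    my = sum(y) / n
    denom = sum((x - mx) ** 2 for x in xs) or 1.0
    w = sum((x - mx) * (t - my) for x, t in zip(xs, y)) / denom
    return (w, my - w * mx)

def predict_1d(model, X):
    w, b = model
    return [w * row[0] + b for row in X]

def fit_multi(X, Y):
    # Train one independent model per output column. This is the baseline
    # MSVR improves on: MSVR couples the outputs in a single optimization.
    n_out = len(Y[0])
    return [fit_1d(X, [row[k] for row in Y]) for k in range(n_out)]

def predict_multi(models, X):
    # Stack the per-output predictions column-wise into output vectors.
    cols = [predict_1d(m, X) for m in models]
    return [list(row) for row in zip(*cols)]

X = [[0.0], [1.0], [2.0], [3.0]]
Y = [[0.0, 1.0], [2.0, 3.0], [4.0, 5.0], [6.0, 7.0]]
models = fit_multi(X, Y)
print(predict_multi(models, [[4.0]]))  # -> [[8.0, 9.0]]
```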
Code Snippet and File Information
function RESULTS = assessment(Labels,PreLabels,par)
%
% function RESULTS = assessment(Labels,PreLabels,par)
%
% INPUTS:
%
% Labels : A vector containing the true (actual) labels for a given set of samples.
% PreLabels : A vector containing the estimated (predicted) labels for a given set of samples.
% par : 'class' or 'regress'
%
% OUTPUTS: (all contained in struct RESULTS)
%
% ConfusionMatrix: Confusion matrix of the classification process (true labels in columns, predictions in rows)
% Kappa : Estimated Cohen's Kappa coefficient
% OA : Overall Accuracy
% varKappa : Variance of the estimated Kappa coefficient
% Z : A basic Z-score for significance testing (considering that Kappa is normally distributed)
% CI : Confidence interval at 95% for the estimated Kappa coefficient
% Wilcoxon signed-rank test and McNemar's test of significant differences
%
% Gustavo Camps-Valls 2007(c)
% gcamps@uv.es
%
% Formulae in:
% Assessing the Accuracy of Remotely Sensed Data
% by Russell G. Congalton and Kass Green, CRC Press
%
switch lower(par)
case {'class'}
Etiquetas = union(Labels,PreLabels); % Class labels (usually 1,2,3,... but can work with text labels)
NumClases = length(Etiquetas); % Number of classes
% Compute confusion matrix
ConfusionMatrix = zeros(NumClases);
for i=1:NumClases
for j=1:NumClases
ConfusionMatrix(i,j) = length(find(PreLabels==Etiquetas(i) & Labels==Etiquetas(j)));
end;
end;
% Compute Overall Accuracy and Cohen's kappa statistic
n = sum(ConfusionMatrix(:)); % Total number of samples
PA = sum(diag(ConfusionMatrix));
OA = PA/n;
% Estimated Overall Cohen's Kappa (suboptimal implementation)
npj = sum(ConfusionMatrix,1);
nip = sum(ConfusionMatrix,2);
PE = npj*nip;
if (n*PA-PE) == 0 && (n^2-PE) == 0
% Solve indetermination
warning('0 divided by 0')
Kappa = 1;
else
Kappa = (n*PA-PE)/(n^2-PE);
end
% Cohen's Kappa Variance
theta1 = OA;
theta2 = PE/n^2;
theta3 = (nip'+npj) * diag(ConfusionMatrix) / n^2;
suma4 = 0;
for i=1:NumClases
for j=1:NumClases
suma4 = suma4 + ConfusionMatrix(i,j)*(nip(i) + npj(j))^2;
end;
end;
theta4 = suma4/n^3;
varKappa = ( theta1*(1-theta1)/(1-theta2)^2 + 2*(1-theta1)*(2*theta1*theta2-theta3)/(1-theta2)^3 + (1-theta1)^2*(theta4-4*theta2^2)/(1-theta2)^4 )/n;
Z = Kappa/sqrt(varKappa);
CI = [Kappa + 1.96*sqrt(varKappa) Kappa - 1.96*sqrt(varKappa)];
if NumClases==2
% Wilcoxon test at 95% confidence interval
[p1,h1] = signrank(Labels,PreLabels);
if h1==0
RESULTS.WilcoxonComment = 'The null hypothesis that both distributions come from the same median cannot be rejected at the 5% level.';
elseif h1==1
RESULTS.WilcoxonComment = 'Th
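For reference, the accuracy statistics computed in the snippet above (overall accuracy, Cohen's kappa, and its variance, following Congalton & Green) can be sketched in Python. The function name `assess` and the toy labels are my own; the formulas mirror the MATLAB code term by term:

```python
def assess(labels, pre_labels):
    # Map class labels to matrix indices (works for any hashable labels).
    classes = sorted(set(labels) | set(pre_labels))
    k = len(classes)
    idx = {c: i for i, c in enumerate(classes)}
    # Confusion matrix: true labels in columns, predictions in rows.
    cm = [[0] * k for _ in range(k)]
    for t, p in zip(labels, pre_labels):
        cm[idx[p]][idx[t]] += 1
    n = sum(map(sum, cm))                          # total samples
    pa = sum(cm[i][i] for i in range(k))           # agreement count (diagonal)
    oa = pa / n                                    # Overall Accuracy
    npj = [sum(cm[i][j] for i in range(k)) for j in range(k)]  # column sums
    nip = [sum(cm[i]) for i in range(k)]                        # row sums
    pe = sum(a * b for a, b in zip(npj, nip))      # chance-agreement term (n^2 * Pe)
    if n * pa - pe == 0 and n ** 2 - pe == 0:
        kappa = 1.0                                # resolve 0/0 indetermination
    else:
        kappa = (n * pa - pe) / (n ** 2 - pe)
    # Kappa variance via the delta-method terms theta1..theta4.
    t1 = oa
    t2 = pe / n ** 2
    t3 = sum((nip[i] + npj[i]) * cm[i][i] for i in range(k)) / n ** 2
    t4 = sum(cm[i][j] * (nip[i] + npj[j]) ** 2
             for i in range(k) for j in range(k)) / n ** 3
    var = (t1 * (1 - t1) / (1 - t2) ** 2
           + 2 * (1 - t1) * (2 * t1 * t2 - t3) / (1 - t2) ** 3
           + (1 - t1) ** 2 * (t4 - 4 * t2 ** 2) / (1 - t2) ** 4) / n
    return {"OA": oa, "Kappa": kappa, "varKappa": var, "ConfusionMatrix": cm}

r = assess([1, 1, 2, 2], [1, 1, 2, 1])
print(round(r["OA"], 3), round(r["Kappa"], 3))  # 0.75 0.5
```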
Attr        Size      Date       Time  Name
----------- --------- ---------- ----- ----
DIR                0  2016-02-16 07:39 msvr-2-1\
FILE            2182  2016-02-16 07:37 msvr-2-1\demoMSVR.m
FILE            4689  2010-09-15 09:18 msvr-2-1\assessment.m
FILE            1642  2010-09-15 09:18 msvr-2-1\kernelmatrix.m
FILE            3312  2016-02-16 07:39 msvr-2-1\msvr.m
FILE             198  2010-09-15 09:18 msvr-2-1\scale.m