SVM support vector machine: low-level code walkthrough
Baidu Pan video link:
Video link: http://pan.baidu.com/s/1hsknqE4
The recorded video explains the underlying principles of the algorithm and its low-level implementation, so that viewers can truly grasp the essence of the algorithm and go on to develop better ones. The motivation for recording this series is to spare readers from spending large amounts of time studying well-known algorithms: the videos aim to give a quick but thorough understanding of these common algorithms, leaving more time for research on more advanced (and more valuable) ones.
The detailed links are on the halcom.cn forum; contact QQ: 3283892722.
The forum is a platform for learning and exchange, and I will share what I learn with everyone there, one topic at a time.
You are welcome to record videos and submit them to me; I will set them up, and you can receive tips for sharing them on the forum.
Dedicated video player: http://halcom.cn/forum.php?mod=viewthread&tid=258&extra=page%3D1
The low-level SVM code is given below. Runtime environment: Win7 + 32-bit + MATLAB 2014a.
x:400x2
y:400x1
The main program is as follows:
% SVM support vector machine
clear all; clc;
data = csvread('LinearlySeprerableData.csv');
data(:,1:end-1) = zscore(data(:,1:end-1)); % normalize the features (z-score)
x = data(:,1:end-1);
y = data(:,end);
N = length(y);
% C trades off empirical risk against confidence risk: the larger C is,
% the larger the confidence risk and the smaller the empirical risk.
C = 0.5;            % penalty coefficient, concluded after cross-validation
tol = 10e-5;        % convergence tolerance
alpha = zeros(N,1); % initial alpha
% solve for the weights
tic
[alpha, weight] = svm_core_2(x, y, C);
toc
% Bias (threshold)
bias = mean(y - x*weight')
% Support vectors: samples with alpha > 0
Xsupport = x(alpha>0,:);
Ysupport = y(alpha>0,:);
disp('Number of support vectors : ')
disp(length(Ysupport))
%% Accuracy and F-measure
x = data(:,1:end-1);
y = data(:,end);
fx = sign(weight*x' + bias)'; % predicted labels via the sign function
[~, Accuracy, F_measure] = confusionMatrix(y, fx) % classification accuracy
%% Plotting the decision boundary
figure(1),
hold on
scatter(x(y==1,1), x(y==1,2), 'b')
scatter(x(y==-1,1), x(y==-1,2), 'r')
scatter(Xsupport(Ysupport==1,1), Xsupport(Ysupport==1,2), '.b')
scatter(Xsupport(Ysupport==-1,1), Xsupport(Ysupport==-1,2), '.r')
x1 = -2:0.01:2;
fn  = (-bias - weight(1)*x1)/weight(2);     % decision boundary w*x + b = 0
fn1 = (-1 - bias - weight(1)*x1)/weight(2); % margin w*x + b = -1
fn2 = ( 1 - bias - weight(1)*x1)/weight(2); % margin w*x + b = +1
plot(x1, fn, 'r', 'Linewidth', 2);
plot(x1, fn1, 'Linewidth', 1);
plot(x1, fn2, 'Linewidth', 1);
axis([-2 2 -2 2])
xlabel('Positive class: blue, Negative class: red')
hold off
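The confusionMatrix helper called above is not listed in the post. A minimal sketch, assuming it simply counts agreement between the -1/+1 true labels y and predictions fx and returns the signature [~, Accuracy, F_measure] used in the main program, could look like this:

function [CM, Accuracy, F_measure] = confusionMatrix(y, fx)
% y  : true labels, Mx1 (-1 and 1)
% fx : predicted labels, Mx1 (-1 and 1)
TP = sum(y==1  & fx==1);   % true positives
TN = sum(y==-1 & fx==-1);  % true negatives
FP = sum(y==-1 & fx==1);   % false positives
FN = sum(y==1  & fx==-1);  % false negatives
CM = [TP FN; FP TN];       % 2x2 confusion matrix
Accuracy  = (TP + TN)/length(y);
Precision = TP/(TP + FP);
Recall    = TP/(TP + FN);
F_measure = 2*Precision*Recall/(Precision + Recall);

This is only a sketch consistent with the call site; the author's actual implementation may differ.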
The core function is as follows:
function [alpha, weight] = svm_core_2(x, y, C)
% Inputs:
%   x: input samples, MxN
%   y: output labels, Mx1 (binary classification: -1 and 1)
%   C: penalty coefficient
tol = 10e-5;        % convergence tolerance
N = size(y,1);      % number of samples
alpha = zeros(N,1); % initial alpha
weight = zeros(1, size(x,2)); % initial weight
while(1)
    weight = (alpha.*y)'*x;
    for i = 1:N
        Ei = sum(alpha.*y.*kernel_fun(x, x(i,:))) - y(i); % prediction error on sample i
        for j = [1:i-1, i+1:N]
            Ej = sum(alpha.*y.*kernel_fun(x, x(j,:))) - y(j); % prediction error on sample j

            % eta = 2*x(j,:)*x(i,:)' - x(i,:)*x(i,:)' - x(j,:)*x(j,:)';
            eta = 2*kernel_fun(x(j,:), x(i,:)) - kernel_fun(x(i,:), x(i,:)) - kernel_fun(x(j,:), x(j,:));
            if eta >= 0
                continue % skip the pair to avoid dividing by a zero (or non-negative) eta
            end

            alpha(j) = alpha(j) - y(j)*(Ei - Ej)/eta;
            % clip alpha(j) to the box constraint [0, C]
            if alpha(j) > C
                alpha(j) = C;
            end
            if alpha(j) < 0
                alpha(j) = 0;
            end
        end
    end

    % Weights
    weight_new = (alpha.*y)'*x;
    if norm(weight_new - weight, 2) < tol
        break;
    end
    weight = weight_new;
end
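The kernel_fun helper is also not listed in the post. Because the outer loop recovers an explicit weight vector via weight = (alpha.*y)'*x, a linear kernel is implied (that recovery is only valid for the linear kernel). Under that assumption, a minimal sketch:

function K = kernel_fun(X, xi)
% Linear kernel.
% X  : MxN matrix of samples (or a single 1xN sample)
% xi : a single 1xN sample
% K  : Mx1 vector of inner products X(m,:)*xi' (a scalar when X is 1xN)
K = X*xi';

With this definition, kernel_fun(x, x(i,:)) returns the Mx1 vector used in the Ei/Ej computations, and kernel_fun(x(i,:), x(i,:)) returns the scalar used in eta.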
References:
【1】SVR regression (linear, polynomial, RBF) prediction models
【2】The SMO algorithm for SVM
【3】Python low-level SVR code: http://pan.baidu.com/s/1kU773lt
【4】MATLAB low-level SVR code: http://pan.baidu.com/s/1pLG3dBd
【5】MATLAB low-level SVC and SVR code: http://pan.baidu.com/s/1dFEL6vB