Logistic Regression

Reference: Softmax regression

Take handwritten digit images as a working example. X is a 500x400 matrix: 500 images, each 20x20 pixels, flattened to 400 features per row. y is a 500x1 vector of labels in 1:10 (label 10 stands for the digit 0). The main script:
clc, clear, close all
load('Xy.mat')
% X is a 500x400 matrix: 500 images, each 20x20 pixels (400 features)
% y is a 500x1 vector of labels in 1:10
lambda = 0.1;      % regularization parameter, suppresses overfitting
num_labels = 10;   % number of classes; must be set before calling oneVsAll
[all_theta] = oneVsAll(X, y, num_labels, lambda);

% Prediction (an inline expansion of pred = predictOneVsAll(all_theta, X))
m = size(X, 1);    % m = 500
pred = zeros(m, 1);
% Add a column of ones (the bias term) to the X data matrix
X = [ones(m, 1) X];
% For each sample, pick the class whose hypothesis scores highest
[~, pred] = max(X * all_theta', [], 2);
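As a quick sanity check of the fit, the training-set accuracy can be computed from pred (a minimal sketch; it assumes the label vector y is still in the workspace):

% Fraction of training samples whose predicted class matches its label
fprintf('Training accuracy: %.2f%%\n', mean(pred == y) * 100);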
Optimizing theta for each class (one-vs-all):
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
% One-vs-all multi-class logistic regression.
% When training for digit 1, samples of digit 1 form the positive class (1)
% and all other samples form the negative class (0); likewise for
% digits 2, 3, ... When training for digit 10 (10 stands for the digit 0),
% digit 10 is the positive class and everything else the negative class.

m = size(X, 1);  % m = 500, number of samples
n = size(X, 2);  % n = 400, number of features

% Initialize the parameter matrix, 10x401
all_theta = zeros(num_labels, n + 1);

% Add a column of ones (the bias term)
X = [ones(m, 1), X];

% Set initial theta
initial_theta = zeros(n + 1, 1);

% Set options for the minimizer: gradient supplied, at most 50 iterations
options = optimset('GradObj', 'on', 'MaxIter', 50);

% Run fmincg once per class; each call returns the theta that minimizes
% the regularized cost J for that class
for iter = 1:num_labels
    all_theta(iter,:) = fmincg(@(t)(lrCostFunction(t, X, (y == iter), lambda)), ...
                               initial_theta, options);
end
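Note that fmincg is not a built-in MATLAB function; it is the conjugate-gradient minimizer distributed with Andrew Ng's machine learning course materials, so it must be on the MATLAB path. If it is unavailable, the built-in fminunc accepts the same call pattern. A sketch under that assumption, reusing the options set above (fminunc is typically slower on this problem but returns a comparable theta):

% Inside the same for-loop, as a drop-in replacement for fmincg
all_theta(iter,:) = fminunc(@(t)(lrCostFunction(t, X, (y == iter), lambda)), ...
                            initial_theta, options);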

The cost function minimized above is the regularized logistic regression loss.
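In equation form (reconstructed from the code below; h_theta is the sigmoid hypothesis and the bias term theta_0 is not penalized):

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)}) - \big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big)\Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)\,x_j^{(i)} + \frac{\lambda}{m}\theta_j \quad (j \ge 1;\ \text{the } \lambda \text{ term is dropped for } j = 0)

The MATLAB implementation: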
function [J, grad] = lrCostFunction(theta, X, y, lambda)
% Compute cost and gradient for logistic regression with regularization

m = length(y);  % number of training examples

% Hypothesis
h = sigmoid(X*theta);

% Regularized cost; the bias term theta(1) is excluded from the penalty
A = theta(2:end);
J = ((-y)'*log(h) - (1-y)'*log(1-h))/m + lambda/(2*m)*sum(A.^2);

% Gradient: regularize every component, then undo it for the bias term
grad = (X'*(h - y))/m + lambda*(1/m)*theta;
grad(1) = grad(1) - lambda*(1/m)*theta(1);

grad = grad(:);

end
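The cost function above calls a sigmoid helper that the post does not list. A minimal sketch of the standard element-wise definition it relies on:

function g = sigmoid(z)
% Logistic (sigmoid) function, applied element-wise
g = 1 ./ (1 + exp(-z));
end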
References:
[1] Softmax regression




