1. Sigmoid Function

In logistic regression, the hypothesis is defined as:

h_θ(x) = g(θᵀx)

where g is the sigmoid function, defined as:

g(z) = 1 / (1 + e^(−z))

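To make the hypothesis concrete, here is a small sanity check you can run at the Octave/MATLAB prompt once the sigmoid function below is implemented (the values of theta and x are made up for illustration and are not part of the exercise):

theta = [-1; 0.5; 0.5];          % illustrative parameter vector (intercept first)
x     = [1; 2; 3];               % one example, with a leading 1 for the intercept term
h     = sigmoid(theta' * x);     % theta'*x = 1.5, so h = 1/(1+exp(-1.5)) ≈ 0.8176

The output h is interpreted as the estimated probability that y = 1 for this example.
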
2. Cost Function and Gradient

The cost function in logistic regression is:

J(θ) = (1/m) Σ_{i=1}^{m} [ −y^(i) log(h_θ(x^(i))) − (1 − y^(i)) log(1 − h_θ(x^(i))) ]

The gradient of the cost is a vector of the same length as θ, whose jth element (for j = 0, 1, ..., n) is defined as follows:

∂J(θ)/∂θ_j = (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_j^(i)

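A quick consequence of this formula that is handy for debugging: when θ is the zero vector, h_θ(x^(i)) = g(0) = 0.5 for every example, so each term of the sum equals −log(0.5) and the cost is J(θ) = log(2) ≈ 0.693 regardless of the data. This is exactly the "Cost at initial theta (zeros)" value that ex2.m prints below, so it is an easy first check of a costFunction.m implementation.
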
3. Regularized Cost Function and Gradient

Recall that the regularized cost function in logistic regression is:

J(θ) = (1/m) Σ_{i=1}^{m} [ −y^(i) log(h_θ(x^(i))) − (1 − y^(i)) log(1 − h_θ(x^(i))) ] + (λ/(2m)) Σ_{j=1}^{n} θ_j²

Note that the bias parameter θ_0 is not included in the penalty term. Accordingly, the gradient of the cost function is a vector whose jth element is defined as follows.

For j = 0:

∂J(θ)/∂θ_0 = (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_0^(i)

For j ≥ 1:

∂J(θ)/∂θ_j = (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_j^(i) + (λ/m) θ_j

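In a vectorized implementation, the two cases can be handled without a loop: compute the unregularized gradient for all j, then add (λ/m)·θ with the first element of θ zeroed out so the bias term receives no penalty. A minimal sketch of the idea (the variable names are illustrative; costFunctionReg.m below uses the same trick by overwriting theta(1) directly):

hx           = sigmoid(X * theta);                       % m x 1 predictions
theta_reg    = theta;
theta_reg(1) = 0;                                        % exclude the bias term from the penalty
grad         = (1/m) * X' * (hx - y) + (lambda/m) * theta_reg;
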
Here are the code files:

ex2data1.txt

34.62365962451697,78.0246928153624,0
30.28671076822607,43.89499752400101,0
35.84740876993872,72.90219802708364,0
60.18259938620976,86.30855209546826,1
79.0327360507101,75.3443764369103,1
45.08327747668339,56.3163717815305,0
61.10666453684766,96.51142588489624,1
75.02474556738889,46.55401354116538,1
76.09878670226257,87.42056971926803,1
84.43281996120035,43.53339331072109,1
95.86155507093572,38.22527805795094,0
75.01365838958247,30.60326323428011,0
82.30705337399482,76.48196330235604,1
69.36458875970939,97.71869196188608,1
39.53833914367223,76.03681085115882,0
53.9710521485623,89.20735013750205,1
69.07014406283025,52.74046973016765,1
67.94685547711617,46.67857410673128,0
70.66150955499435,92.92713789364831,1
76.97878372747498,47.57596364975532,1
67.37202754570876,42.83843832029179,0
89.67677575072079,65.79936592745237,1
50.534788289883,48.85581152764205,0
34.21206097786789,44.20952859866288,0
77.9240914545704,68.9723599933059,1
62.27101367004632,69.95445795447587,1
80.1901807509566,44.82162893218353,1
93.114388797442,38.80067033713209,0
61.83020602312595,50.25610789244621,0
38.78580379679423,64.99568095539578,0
61.379289447425,72.80788731317097,1
85.40451939411645,57.05198397627122,1
52.10797973193984,63.12762376881715,0
52.04540476831827,69.43286012045222,1
40.23689373545111,71.16774802184875,0
54.63510555424817,52.21388588061123,0
33.91550010906887,98.86943574220611,0
64.17698887494485,80.90806058670817,1
74.78925295941542,41.57341522824434,0
34.1836400264419,75.2377203360134,0
83.90239366249155,56.30804621605327,1
51.54772026906181,46.85629026349976,0
94.44336776917852,65.56892160559052,1
82.36875375713919,40.61825515970618,0
51.04775177128865,45.82270145776001,0
62.22267576120188,52.06099194836679,0
77.19303492601364,70.45820000180959,1
97.77159928000232,86.7278223300282,1
62.07306379667647,96.76882412413983,1
91.56497449807442,88.69629254546599,1
79.94481794066932,74.16311935043758,1
99.2725269292572,60.99903099844988,1
90.54671411399852,43.39060180650027,1
34.52451385320009,60.39634245837173,0
50.2864961189907,49.80453881323059,0
49.58667721632031,59.80895099453265,0
97.64563396007767,68.86157272420604,1
32.57720016809309,95.59854761387875,0
74.24869136721598,69.82457122657193,1
71.79646205863379,78.45356224515052,1
75.3956114656803,85.75993667331619,1
35.28611281526193,47.02051394723416,0
56.25381749711624,39.26147251058019,0
30.05882244669796,49.59297386723685,0
44.66826172480893,66.45008614558913,0
66.56089447242954,41.09209807936973,0
40.45755098375164,97.53518548909936,1
49.07256321908844,51.88321182073966,0
80.27957401466998,92.11606081344084,1
66.74671856944039,60.99139402740988,1
32.72283304060323,43.30717306430063,0
64.0393204150601,78.03168802018232,1
72.34649422579923,96.22759296761404,1
60.45788573918959,73.09499809758037,1
58.84095621726802,75.85844831279042,1
99.82785779692128,72.36925193383885,1
47.26426910848174,88.47586499559782,1
50.45815980285988,75.80985952982456,1
60.45555629271532,42.50840943572217,0
82.22666157785568,42.71987853716458,0
88.9138964166533,69.80378889835472,1
94.83450672430196,45.69430680250754,1
67.31925746917527,66.58935317747915,1
57.23870631569862,59.51428198012956,1
80.36675600171273,90.96014789746954,1
68.46852178591112,85.59430710452014,1
42.0754545384731,78.84478600148043,0
75.47770200533905,90.42453899753964,1
78.63542434898018,96.64742716885644,1
52.34800398794107,60.76950525602592,0
94.09433112516793,77.15910509073893,1
90.44855097096364,87.50879176484702,1
55.48216114069585,35.57070347228866,0
74.49269241843041,84.84513684930135,1
89.84580670720979,45.35828361091658,1
83.48916274498238,48.38028579728175,1
42.2617008099817,87.10385094025457,1
99.31500880510394,68.77540947206617,1
55.34001756003703,64.9319380069486,1
74.77589300092767,89.52981289513276,1

ex2.m

%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.
data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand the
%  problem we are working with.
fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')
% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in costFunction.m

% Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to X
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 3: Optimizing using fminunc =============
%  In this part, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('theta: \n');
fprintf(' %f \n', theta);

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')
% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you will want to use the model to predict
%  the outcomes on unseen data. In this part, you will use the logistic
%  regression model to predict the probability that a student with a score
%  of 45 on exam 1 and a score of 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

% Predict probability for a student with score 45 on exam 1
% and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
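
Note that ex2.m also calls plotData.m and plotDecisionBoundary.m; those two plotting helpers ship with the exercise starter code and are not reproduced in this post, so the script assumes they are on the Octave/MATLAB path together with ex2data1.txt and the files below.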

sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute the sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1./(1+exp(-z));

% =============================================================

end
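
A few quick checks from the Octave/MATLAB prompt (suggested sanity checks, not part of the assignment script): sigmoid(0) should be exactly 0.5, large positive inputs should approach 1, large negative inputs should approach 0, and because ./ and exp are element-wise the function also works on vectors and matrices:

sigmoid(0)              % ans = 0.5
sigmoid([-100 0 100])   % ans ≈ [0 0.5 1]
sigmoid(zeros(2, 3))    % 2x3 matrix of 0.5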

costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

hx = sigmoid(X*theta);                        % hypothesis values, m x 1
J = -1/m * (y'*log(hx) + (1-y)'*log(1-hx));   % vectorized cross-entropy cost
grad = 1/m * X'*(hx-y);                       % vectorized gradient, (n+1) x 1

% =============================================================

end
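
Because the gradient is easy to get subtly wrong, it can be compared against a centered finite-difference approximation on a tiny synthetic problem. This is an optional check, not part of the exercise, and all the variable names below are made up:

% Tiny synthetic problem: 5 examples, intercept plus 2 random features
X_chk = [ones(5, 1) randn(5, 2)];
y_chk = [0; 1; 1; 0; 1];
t     = randn(3, 1);

[~, g] = costFunction(t, X_chk, y_chk);   % analytic gradient

eps_chk = 1e-4;
num_g = zeros(size(t));
for j = 1:numel(t)
    e = zeros(size(t)); e(j) = eps_chk;
    num_g(j) = (costFunction(t + e, X_chk, y_chk) - costFunction(t - e, X_chk, y_chk)) / (2 * eps_chk);
end

disp([g num_g])   % the two columns should agree to several decimal places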

predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

p = sigmoid(X*theta) >= 0.5;   % logical vector: 1 where the predicted probability is at least 0.5

% =========================================================================

end
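
Once fminunc has produced theta in ex2.m, predict works both on the full training matrix and on a single new example, as long as the intercept column of ones is included (a usage sketch; the training-accuracy line mirrors what ex2.m already does):

p_train = predict(theta, X);                                    % X here already contains the column of ones
fprintf('Train Accuracy: %f\n', mean(double(p_train == y)) * 100);
p_new = predict(theta, [1 45 85]);                              % 1 = predicted admitted, 0 = predicted not admitted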

costFunctionReg.m

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

hx = sigmoid(X*theta);                             % hypothesis values, m x 1
reg = lambda/(2*m) * sum(theta(2:end).^2);         % penalty term; theta(1) (the bias) is not regularized
J = -1/m * (y'*log(hx) + (1-y)'*log(1-hx)) + reg;
theta(1) = 0;                                      % zero out the bias term so it is excluded from the gradient penalty
grad = 1/m * X'*(hx-y) + lambda/m * theta;

% =============================================================

end
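
As with costFunction.m, this can be exercised on a small made-up problem. One property worth verifying is that the gradient of the bias term is identical with and without regularization, since θ_0 is excluded from the penalty (all data below is illustrative, not from the exercise):

X_chk = [ones(4, 1) [1 2; -1 0.5; 3 -2; 0 1]];
y_chk = [1; 0; 1; 0];
t_chk = [0.1; -0.2; 0.3];

[J0, g0] = costFunctionReg(t_chk, X_chk, y_chk, 0);   % lambda = 0 reduces to the unregularized cost
[J1, g1] = costFunctionReg(t_chk, X_chk, y_chk, 1);   % lambda = 1

% g0(1) and g1(1) should be equal, and J1 > J0 because of the added penalty term.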
