CheeseZH: Stanford University: Machine Learning Ex2: Logistic Regression
1. Sigmoid Function
In Logistic Regression, the hypothesis is defined as:

    h_θ(x) = g(θᵀx)

where the function g is the sigmoid function. The sigmoid function is defined as:

    g(z) = 1 / (1 + e^(-z))
2. Cost Function and Gradient
The cost function in logistic regression is:

    J(θ) = (1/m) Σ_{i=1..m} [ -y^(i) log(h_θ(x^(i))) - (1 - y^(i)) log(1 - h_θ(x^(i))) ]

where the sum runs over the m training examples. The gradient of the cost is a vector of the same length as θ, where the jth element (for j = 0, 1, ..., n) is defined as follows:

    ∂J(θ)/∂θ_j = (1/m) Σ_{i=1..m} ( h_θ(x^(i)) - y^(i) ) x_j^(i)
3. Regularized Cost Function and Gradient
Recall that the regularized cost function in logistic regression is:

    J(θ) = (1/m) Σ_{i=1..m} [ -y^(i) log(h_θ(x^(i))) - (1 - y^(i)) log(1 - h_θ(x^(i))) ] + (λ/(2m)) Σ_{j=1..n} θ_j²

Note that the bias parameter θ_0 is not regularized. The gradient of the cost function is a vector where the jth element is defined as follows:

for j = 0:

    ∂J(θ)/∂θ_j = (1/m) Σ_{i=1..m} ( h_θ(x^(i)) - y^(i) ) x_j^(i)

for j >= 1:

    ∂J(θ)/∂θ_j = (1/m) Σ_{i=1..m} ( h_θ(x^(i)) - y^(i) ) x_j^(i) + (λ/m) θ_j
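These formulas map directly onto vectorized MATLAB/Octave expressions, which is what the files below implement. As a quick bridge (this sketch is mine, not part of the exercise hand-out), with X the m x (n+1) design matrix whose first column is all ones, y the m x 1 label vector, and theta the (n+1) x 1 parameter vector:

hx   = 1 ./ (1 + exp(-X*theta));                  % h_θ(x) for all m examples at once
J    = -(1/m) * (y'*log(hx) + (1-y)'*log(1-hx));  % unregularized cost
grad = (1/m) * X' * (hx - y);                     % unregularized gradient

t0       = theta;  t0(1) = 0;                     % exclude θ_0 from the penalty
J_reg    = J + (lambda/(2*m)) * sum(t0.^2);       % regularized cost
grad_reg = grad + (lambda/m) * t0;                % regularized gradient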
Here are the code files:
ex2data1.txt
34.62365962451697,78.0246928153624,0
30.28671076822607,43.89499752400101,0
35.84740876993872,72.90219802708364,0
60.18259938620976,86.30855209546826,1
79.0327360507101,75.3443764369103,1
45.08327747668339,56.3163717815305,0
61.10666453684766,96.51142588489624,1
75.02474556738889,46.55401354116538,1
76.09878670226257,87.42056971926803,1
84.43281996120035,43.53339331072109,1
95.86155507093572,38.22527805795094,0
75.01365838958247,30.60326323428011,0
82.30705337399482,76.48196330235604,1
69.36458875970939,97.71869196188608,1
39.53833914367223,76.03681085115882,0
53.9710521485623,89.20735013750205,1
69.07014406283025,52.74046973016765,1
67.94685547711617,46.67857410673128,0
70.66150955499435,92.92713789364831,1
76.97878372747498,47.57596364975532,1
67.37202754570876,42.83843832029179,0
89.67677575072079,65.79936592745237,1
50.534788289883,48.85581152764205,0
34.21206097786789,44.20952859866288,0
77.9240914545704,68.9723599933059,1
62.27101367004632,69.95445795447587,1
80.1901807509566,44.82162893218353,1
93.114388797442,38.80067033713209,0
61.83020602312595,50.25610789244621,0
38.78580379679423,64.99568095539578,0
61.379289447425,72.80788731317097,1
85.40451939411645,57.05198397627122,1
52.10797973193984,63.12762376881715,0
52.04540476831827,69.43286012045222,1
40.23689373545111,71.16774802184875,0
54.63510555424817,52.21388588061123,0
33.91550010906887,98.86943574220611,0
64.17698887494485,80.90806058670817,1
74.78925295941542,41.57341522824434,0
34.1836400264419,75.2377203360134,0
83.90239366249155,56.30804621605327,1
51.54772026906181,46.85629026349976,0
94.44336776917852,65.56892160559052,1
82.36875375713919,40.61825515970618,0
51.04775177128865,45.82270145776001,0
62.22267576120188,52.06099194836679,0
77.19303492601364,70.45820000180959,1
97.77159928000232,86.7278223300282,1
62.07306379667647,96.76882412413983,1
91.56497449807442,88.69629254546599,1
79.94481794066932,74.16311935043758,1
99.2725269292572,60.99903099844988,1
90.54671411399852,43.39060180650027,1
34.52451385320009,60.39634245837173,0
50.2864961189907,49.80453881323059,0
49.58667721632031,59.80895099453265,0
97.64563396007767,68.86157272420604,1
32.57720016809309,95.59854761387875,0
74.24869136721598,69.82457122657193,1
71.79646205863379,78.45356224515052,1
75.3956114656803,85.75993667331619,1
35.28611281526193,47.02051394723416,0
56.25381749711624,39.26147251058019,0
30.05882244669796,49.59297386723685,0
44.66826172480893,66.45008614558913,0
66.56089447242954,41.09209807936973,0
40.45755098375164,97.53518548909936,1
49.07256321908844,51.88321182073966,0
80.27957401466998,92.11606081344084,1
66.74671856944039,60.99139402740988,1
32.72283304060323,43.30717306430063,0
64.0393204150601,78.03168802018232,1
72.34649422579923,96.22759296761404,1
60.45788573918959,73.09499809758037,1
58.84095621726802,75.85844831279042,1
99.82785779692128,72.36925193383885,1
47.26426910848174,88.47586499559782,1
50.45815980285988,75.80985952982456,1
60.45555629271532,42.50840943572217,0
82.22666157785568,42.71987853716458,0
88.9138964166533,69.80378889835472,1
94.83450672430196,45.69430680250754,1
67.31925746917527,66.58935317747915,1
57.23870631569862,59.51428198012956,1
80.36675600171273,90.96014789746954,1
68.46852178591112,85.59430710452014,1
42.0754545384731,78.84478600148043,0
75.47770200533905,90.42453899753964,1
78.63542434898018,96.64742716885644,1
52.34800398794107,60.76950525602592,0
94.09433112516793,77.15910509073893,1
90.44855097096364,87.50879176484702,1
55.48216114069585,35.57070347228866,0
74.49269241843041,84.84513684930135,1
89.84580670720979,45.35828361091658,1
83.48916274498238,48.38028579728175,1
42.2617008099817,87.10385094025457,1
99.31500880510394,68.77540947206617,1
55.34001756003703,64.9319380069486,1
74.77589300092767,89.52981289513276,1
ex2.m
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
% Instructions
% ------------
%
% This file contains code that helps you get started on the logistic
% regression exercise. You will need to complete the following functions
%  in this exercise:
%
% sigmoid.m
% costFunction.m
% predict.m
% costFunctionReg.m
%
% For this exercise, you will not need to change any code in this file,
% or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

%  Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('theta: \n');
fprintf(' %f \n', theta);

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'll likely want to use them to predict
%  the outcomes on unseen data. In this part, you will use the logistic
%  regression model to predict the probability that a student with score 45
%  on exam 1 and score 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

%  Predict probability for a student with score 45 on exam 1
%  and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
sigmoid.m
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1./(1+exp(-z));

% =============================================================

end
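A couple of quick checks on sigmoid (illustrative values of my own, not part of the exercise): it should return 0.5 at z = 0, saturate towards 0 and 1 for large negative/positive z, and operate element-wise on vectors and matrices thanks to the ./ and exp calls:

sigmoid(0)            % ans = 0.5
sigmoid([-10 0 10])   % ans is roughly [0.0000 0.5000 1.0000]
sigmoid([1 2; 3 4])   % element-wise on a 2x2 matrix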
costFunction.m
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
% J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
% parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

hx = sigmoid(X*theta); % m x 1
J = -1/m*(y'*log(hx)+((1-y)'*log(1-hx)));
grad = 1/m*X'*(hx-y);

% =============================================================

end
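A useful sanity check on costFunction (my own check, not part of ex2.m): with theta set to zeros, h_θ(x) = 0.5 for every example, so the cost must equal -log(0.5) ≈ 0.693 regardless of the data:

data = load('ex2data1.txt');
X = [ones(size(data,1), 1) data(:, 1:2)];   % add the intercept column
y = data(:, 3);
[J0, grad0] = costFunction(zeros(3, 1), X, y);
fprintf('Cost at zeros: %f (expected about 0.693)\n', J0);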
predict.m
function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
% p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

p = sigmoid(X*theta) >= 0.5;

% =========================================================================

end
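One small design note on the threshold: since the sigmoid is monotonic and g(0) = 0.5, comparing sigmoid(X*theta) against 0.5 is mathematically the same as comparing X*theta against 0, so either form could be used here. For example (a check of my own, not exercise code):

p1 = sigmoid(X*theta) >= 0.5;   % as implemented above
p2 = (X*theta) >= 0;            % equivalent decision rule
isequal(p1, p2)                 % expected to print 1 (true)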
costFunctionReg.m
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

hx = sigmoid(X*theta);
reg = lambda/(2*m)*sum(theta(2:end).^2);   % do not regularize theta(1)
J = -1/m*(y'*log(hx)+(1-y)'*log(1-hx)) + reg;
theta(1) = 0;                              % so the bias term gets no penalty in the gradient
grad = 1/m*X'*(hx-y)+lambda/m*theta;

% =============================================================

end
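costFunctionReg is driven by ex2_reg.m, which is not reproduced in this post. As a rough sketch of how it would be called — the file name ex2data2.txt, the mapFeature helper, and lambda = 1 follow the exercise hand-out, so treat them as assumptions here:

data = load('ex2data2.txt');
X = mapFeature(data(:, 1), data(:, 2));   % degree-6 polynomial features, intercept included
y = data(:, 3);

initial_theta = zeros(size(X, 2), 1);
lambda = 1;

options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);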