Regularized logistic regression: mapFeature (adding more features) and costFunctionReg

Excerpt from ex2_reg.m

%% =========== Part 1: Regularized Logistic Regression ============
% In this part, you are given a dataset with data points that are not
% linearly separable. However, you would still like to use logistic
% regression to classify the data points.
%
% To do so, you introduce more features to use -- in particular, you add
% polynomial features to your data matrix (similar to polynomial
% regression).
%

% Add Polynomial Features

% Note that mapFeature also adds a column of ones for us, so the intercept
% term is handled
X = mapFeature(X(:,1), X(:,2));   % call mapFeature(X1, X2), defined in mapFeature.m below

% Map the two original features x1 and x2 into 28 polynomial features of degree up to 6. This allows a more complex decision boundary to be drawn, but it may also lead to overfitting (depending on the value of λ).

% After the call, X is a 118*28 matrix (118 examples, 28 features, including the leading column of ones); see the quick check after mapFeature.m below.

% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);    %initial_theta: 28*1

% Set regularization parameter lambda to 1  
lambda = 1;    % λ = 1. With λ = 0 there is no regularization and the model overfits; with λ = 100 there is too much regularization and the model underfits.

% Compute and display initial cost and gradient for regularized logistic
% regression
[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);   % call costFunctionReg(theta, X, y, lambda), defined in costFunctionReg.m below

fprintf('Cost at initial theta (zeros): %f\n', cost);    % the cost at the initial theta (all zeros)

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
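
The excerpt above only evaluates the cost at the initial θ. Later in ex2_reg.m the same costFunctionReg is passed to fminunc to actually learn θ; a minimal sketch of that call (the option values shown here are assumptions and may differ from the original script):

options = optimset('GradObj', 'on', 'MaxIter', 400);   % let fminunc use the gradient returned by costFunctionReg
[theta, J] = fminunc(@(t) costFunctionReg(t, X, y, lambda), initial_theta, options);
fprintf('Cost at theta found by fminunc: %f\n', J);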

mapFeature.m

function out = mapFeature(X1, X2)
% MAPFEATURE Feature mapping function to polynomial features
%
% MAPFEATURE(X1, X2) maps the two input features
% to quadratic features used in the regularization exercise.
%
% Returns a new feature array with more features, consisting of
% X1, X2, X1.^2, X2.^2, X1*X2, X1*X2.^2, etc..
%
% Inputs X1, X2 must be the same size
%

degree = 6;       %map the features into all polynomial terms of x1 and x2 up to the sixth power

out = ones(size(X1(:,1)));                       % first column: the bias (intercept) term
for i = 1:degree
    for j = 0:i
        out(:, end+1) = (X1.^(i-j)).*(X2.^j);    % append the term x1^(i-j) * x2^j
    end
end

end
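
A quick illustrative check (not part of the original exercise files): the 28 columns come from one bias column plus (i+1) terms for each degree i = 1..6, and calling mapFeature on a single point returns one such 28-element row.

degree = 6;
1 + sum((1:degree) + 1)        % = 28 columns in total
row = mapFeature(0.05, 0.7);   % map one arbitrary point (x1, x2) = (0.05, 0.7)
size(row)                      % ans = [1 28]
row(1:3)                       % ans = [1 0.05 0.7], i.e. the bias term, x1 and x2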

costFunctionReg.m

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta

J = 1/m*(-1*y'*log(sigmoid(X*theta)) - (ones(1,m)-y')*log(ones(m,1)-sigmoid(X*theta))) ...
    + lambda/(2*m) * (theta(2:end,:))' * theta(2:end,:);   % note that the parameter θ0 (theta(1)) is not regularized

% The regularized cost function:
%   J(θ) = (1/m) * Σ [ -y*log(h(x)) - (1-y)*log(1-h(x)) ] + (λ/(2m)) * Σ_{j=1..n} θ_j^2

grad = 1/m * (X' * (sigmoid(X*theta) - y)) + (lambda/m)*theta;   % regularized gradient: (1/m)*X'*(h - y) + (λ/m)*θ
grad(1) = 1/m * (X(:,1))' * (sigmoid(X*theta) - y);              % overwrite the first entry: θ0 (theta(1)) is not regularized

% Note that you should not regularize the parameter θ0.

% =============================================================

end
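
A simple sanity check (illustrative, assuming X, y and lambda are set up as in the ex2_reg.m excerpt above): with θ = 0, sigmoid(X*θ) = 0.5 for every example and the regularization term vanishes, so the cost must be -log(0.5) ≈ 0.693 regardless of the data.

[cost, grad] = costFunctionReg(zeros(size(X, 2), 1), X, y, lambda);
fprintf('Cost at theta = zeros: %f (expected about 0.693)\n', cost);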
