% ex1data2.txt -- column 1: size of house (feet^2), column 2: number of bedrooms, column 3: price of house
% (47 rows of training data; the numeric values are not reproduced here)
% Exercise 1: Linear regression with multiple variables

%% Initialization
%% ================ Part 1: Feature Normalization ================
%% Clear and Close Figures
clear; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);   % size (feet^2) and number of bedrooms
y = data(:, 3);     % price
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');
fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');
[X, mu, sigma] = featureNormalize(X);
% Implementation of featureNormalize(X)
function [X_norm, mu, sigma] = featureNormalize(X)
X_norm = X;                   % X is the matrix to normalize
mu = zeros(1, size(X, 2));    % 1 x (number of features) row of zeros
sigma = zeros(1, size(X, 2)); % same shape as mu

% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%       std(X, 0, 1) takes the standard deviation down each column;
%       std(X, 0, 2) takes it across each row.

mu = mean(X, 1); % mean of each column, i.e. of each feature over all samples
sigma = std(X);  % defaults to std(X, 0, 1), the per-column standard deviation
% fprintf('Debug....\n'); disp(sigma);

i = 1;
len = size(X, 2); % number of features (columns)
while i <= len,
    % Normalize each column: (column values - column mean) / (column std)
    X_norm(:,i) = (X(:,i) - mu(1,i)) / sigma(1,i);
    i = i + 1;
end
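
For reference, the per-feature while loop above can be collapsed into a single vectorized statement. This is a minimal sketch assuming implicit broadcasting (available in Octave and in MATLAB R2016b and later); older MATLAB versions would use bsxfun instead:

% Vectorized equivalent of the normalization loop (sketch; assumes the
% row vectors mu and sigma broadcast over the rows of X):
X_norm = (X - mu) ./ sigma;
% Pre-R2016b MATLAB equivalent:
% X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);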
% Add intercept term to X
X = [ones(m, 1) X];

%% ================ Part 2: Gradient Descent ================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.03;    % learning rate - try e.g. 0.01, 0.03, 0.1, 0.3, ...
num_iters = 400; % number of iterations (400 in the standard starter code)

% Init Theta and Run Gradient Descent
theta = zeros(3, 1); % 3x1 vector of zeros
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
% Implementation of gradientDescentMulti()
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)

% Initialize some useful values
m = length(y);               % number of training examples
feature_number = size(X, 2); % number of features (including the intercept column)
J_history = zeros(num_iters, 1);
temp = zeros(feature_number, 1);

for iter = 1:num_iters
    predictions = X * theta;
    sqrError = (predictions - y); % prediction errors, h(x) - y
    for i = 1:feature_number      % compute updates for all theta(i) simultaneously
        temp(i) = theta(i) - (alpha / m) * sum(sqrError .* X(:,i));
    end
    for j = 1:feature_number      % then apply them in one step
        theta(j) = temp(j);
    end
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    % ============================================================
    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
    % disp(J_history(iter));
end
end
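
gradientDescentMulti calls computeCostMulti, which the post never lists. A minimal vectorized sketch of the standard squared-error cost it is expected to compute, J(theta) = 1/(2m) * sum((X*theta - y).^2), would be:

% computeCostMulti is not shown in the post; this is a minimal sketch of
% the usual squared-error cost for multivariate linear regression.
function J = computeCostMulti(X, y, theta)
m = length(y);                  % number of training examples
err = X * theta - y;            % prediction errors
J = (err' * err) / (2 * m);     % vectorized form of 1/(2m) * sum(err.^2)
end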
% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2); % '-b': blue line, width 2
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');
Tip:
To compare how different learning rates affect convergence, it's helpful
to plot J for several learning rates on the same figure. In Octave/MATLAB,
this can be done by performing gradient descent multiple times with a
'hold on' command between plots. Concretely, if you've tried three
different values of alpha (you should probably try more values than this)
and stored the costs in J1, J2 and J3, you can use the following commands
to plot them on the same figure:

plot(1:50, J1(1:50), 'b');
hold on;
plot(1:50, J2(1:50), 'r');
plot(1:50, J3(1:50), 'k');

The final arguments 'b', 'r', and 'k' specify different colors for the plots.
% An implementation of the Tip above: add this block to compare
% different learning rates
figure;
plot(1:100, J_history(1:100), '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Compare learning rates
hold on;
alpha = 0.03;
theta = zeros(3, 1);
[theta, J_history1] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history1(1:100), 'r', 'LineWidth', 2);

hold on;
alpha = 0.1;
theta = zeros(3, 1);
[theta, J_history2] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history2(1:100), 'g', 'LineWidth', 2);
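
Going the other way, a learning rate that is too large makes the cost grow instead of shrink, and plotting it on the same axes makes that easy to see. A hedged example (the alpha value below is illustrative, not from the exercise; exact behavior depends on the data scaling):

% Illustrative only: with a deliberately large alpha, J_history typically
% diverges rather than converging.
hold on;
alpha = 1.3;  % deliberately too large (illustrative value)
theta = zeros(3, 1);
[theta, J_history3] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history3(1:100), 'k', 'LineWidth', 2);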
% Predict the price of a new example with the gradient descent result
% (apply the same feature normalization used in training, then a matrix
% multiplication gives the multi-feature prediction)
price = [1, ([1650 3] - mu) ./ sigma] * theta;
% ============================================================
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);
fprintf('Program paused. Press enter to continue.\n');
pause;
%% ================ Part 3: Normal Equations ================
% Predict the new value with the normal equation instead
fprintf('Solving with normal equations...\n');

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);
% Implementation of normalEqn
function [theta] = normalEqn(X, y)

theta = zeros(size(X, 2), 1);

% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
theta = pinv(X' * X) * X' * y;

end
% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
price = [1, 1650, 3] * theta; % predict with the normal-equation theta (no normalization needed here)
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
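
The two solvers should agree on the prediction even though their theta vectors live on different feature scales. A minimal sanity-check sketch, assuming the gradient-descent parameters were stashed in a variable before Part 3 overwrote theta (theta_gd is a hypothetical name; mu and sigma are still in the workspace from featureNormalize):

% Sanity check (sketch): both methods should give nearly the same price.
% theta_gd is a hypothetical copy of the gradient-descent theta saved
% before Part 3 ran.
price_gd = [1, ([1650 3] - mu) ./ sigma] * theta_gd; % normalized features
price_ne = [1, 1650, 3] * theta;                     % raw features
fprintf('gradient descent: $%.2f, normal equation: $%.2f\n', price_gd, price_ne);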
