Linear Regression with Multiple Variables: a worked example solved with gradient descent (gradientDescentMulti) and with the normal equation
% Column 1: size of house (ft^2); column 2: number of bedrooms; column 3: price of house
% (the raw rows of ex1data2.txt were listed here in the original post; the values did not survive extraction and are omitted)
% Exercise 1: Linear regression with multiple variables

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X, mu, sigma] = featureNormalize(X);
% Implementation of featureNormalize(X)
function [X_norm, mu, sigma] = featureNormalize(X)
X_norm = X;                   % X is the matrix to be normalized
mu = zeros(1, size(X, 2));    % 1 x (number of features) zero row vector
sigma = zeros(1, size(X, 2)); % same shape as mu

% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%       std(X, 0, 1) takes the standard deviation of each column,
%       std(X, 0, 2) of each row.
mu = mean(X, 1); % mean of each column, i.e., of one feature over all examples
sigma = std(X);  % std defaults to columns, same as std(X, 0, 1)
%fprintf('Debug....\n'); disp(sigma);
i = 1;
len = size(X, 2); % number of features (columns)
while i <= len
    % Normalize each column: (every entry of the column - column mean) / (column std)
    X_norm(:,i) = (X(:,i) - mu(1,i)) / sigma(1,i);
    i = i + 1;
end
end
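For reference, the loop above can be collapsed into a few vectorized lines. This is a minimal sketch under the same column-wise convention (featureNormalizeVec is a hypothetical name, not part of the exercise); bsxfun keeps it working on Octave/MATLAB versions without implicit broadcasting:

% Vectorized variant of featureNormalize (hypothetical helper, same behavior).
function [X_norm, mu, sigma] = featureNormalizeVec(X)
mu = mean(X, 1);      % 1 x n row vector: mean of each column (feature)
sigma = std(X, 0, 1); % 1 x n row vector: std of each column
% Subtract each column's mean and divide by its std; bsxfun broadcasts
% the 1 x n vectors across all m rows.
X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);
end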
% Add intercept term to X
X = [ones(m, 1) X];
%% ================ Part 2: Gradient Descent ================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%
fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.03;        % learning rate - try 0.01, 0.03, 0.1, 0.3, ...
num_iters = 400;     % number of iterations (400 in the course starter code)
% Init Theta and Run Gradient Descent
theta = zeros(3, 1); % 3x1 zero vector
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
% Implementation of gradientDescentMulti()
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
% Initialize some useful values
m = length(y);               % number of training examples
feature_number = size(X, 2); % number of features (including the intercept column)
J_history = zeros(num_iters, 1);
temp = zeros(feature_number, 1);

for iter = 1:num_iters
    predictions = X * theta;
    sqrError = (predictions - y);    % m x 1 residual vector (predictions minus targets)
    for i = 1:feature_number         % Simultaneously update theta(i)
        temp(i) = theta(i) - (alpha / m) * sum(sqrError .* X(:,i));
    end
    for j = 1:feature_number
        theta(j) = temp(j);
    end
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    % ============================================================
    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
    % disp(J_history(iter));
end
end
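gradientDescentMulti calls computeCostMulti, whose implementation is not listed in this post. So the listing is self-contained, here is a sketch of the standard vectorized squared-error cost used by the course exercise:

% computeCostMulti: vectorized squared-error cost (not listed in this
% post; this is the standard form used by the course exercise).
function J = computeCostMulti(X, y, theta)
m = length(y);              % number of training examples
err = X * theta - y;        % m x 1 vector of residuals
J = (err' * err) / (2 * m); % J = 1/(2m) * sum of squared residuals
end

Note also that the two inner for-loops over feature_number are equivalent to the single vectorized update theta = theta - (alpha / m) * X' * (X * theta - y);, which updates every parameter simultaneously without the temp vector.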
% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2); % '-b' plots a blue line, width 2
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');
Tip: To compare how different learning rates affect convergence, it is
helpful to plot J for several learning rates on the same figure. In
Octave/MATLAB, this can be done by performing gradient descent multiple
times with a 'hold on' command between plots. Concretely, if you have
tried three different values of alpha (you should probably try more
values than this) and stored the costs in J1, J2 and J3, you can use the
following commands to plot them on the same figure:

plot(1:50, J1(1:50), 'b');
hold on;
plot(1:50, J2(1:50), 'r');
plot(1:50, J3(1:50), 'k');

The final arguments 'b', 'r', and 'k' specify different colors for the plots.
% Implementation of the Tip above: add this code to compare different learning rates
figure;
plot(1:100, J_history(1:100), '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Compare learning rate
hold on;
alpha = 0.03;
theta = zeros(3, 1);
[theta, J_history1] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history1(1:100), 'r', 'LineWidth', 2);

hold on;
alpha = 0.1;
theta = zeros(3, 1);
[theta, J_history2] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history2(1:100), 'g', 'LineWidth', 2);
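The repeated alpha/theta/plot blocks above can also be folded into a single loop. The following is a sketch of that refactor, assuming num_iters is at least 100 so the 1:100 slice is valid; the alpha values are the ones the tip suggests trying:

% Sketch: compare several learning rates in one loop (refactor of the block above)
alphas = [0.01 0.03 0.1];
colors = {'-b', '-r', '-g'};
figure; hold on;
for k = 1:length(alphas)
    [~, J_k] = gradientDescentMulti(X, y, zeros(3, 1), alphas(k), num_iters);
    plot(1:100, J_k(1:100), colors{k}, 'LineWidth', 2);
end
xlabel('Number of iterations');
ylabel('Cost J');
legend('alpha = 0.01', 'alpha = 0.03', 'alpha = 0.1');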
% Predict the price of a 1650 sq-ft, 3 br house using the gradient-descent
% result. The same feature normalization computed on the training set must
% be applied to the new example before multiplying by theta.
price = [1, ([1650 3] - mu) ./ sigma] * theta; % matrix product of the normalized feature row and theta
% ============================================================
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;
%% ================ Part 3: Normal Equations ================
% Predict the same value with the closed-form normal equation instead
fprintf('Solving with normal equations...\n');

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);
% Implementation of normalEqn
function [theta] = normalEqn(X, y)
theta = zeros(size(X, 2), 1);
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
theta = pinv(X' * X) * X' * y;
end
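A note on the design choice: pinv computes the Moore-Penrose pseudo-inverse, so it stays well-defined even when X' * X is singular (for example, when features are linearly dependent). When X' * X is invertible, solving the system with the backslash operator is a common, usually faster alternative; a minimal sketch:

% Alternative solve, assuming X' * X is well-conditioned: the backslash
% operator solves the linear system directly instead of forming an inverse.
theta_alt = (X' * X) \ (X' * y);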
% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house.
% No normalization was applied in this branch, so the raw feature values are used.
price = [1, 1650, 3] * theta; % predict the price with the normal-equation parameters
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
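As a final sanity check, the two methods should produce nearly identical predictions. A sketch, with the caveat that price_gd is a hypothetical variable: the script above reuses price, so the gradient-descent prediction would need to be saved under a separate name first.

% Hypothetical check: price_gd would hold the gradient-descent prediction
% saved earlier. A large gap usually means alpha or num_iters needs tuning.
fprintf('Difference between the two predictions: $%f\n', abs(price - price_gd));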
.创建/etc/yum.repos.d/MariaDB.repo文件,这里用到了刚刚发布正式版的10. [mariadb] name = MariaDB baseurl = http://yum.ma ...