% ex1data2.txt — column 1: size of the house (ft^2), column 2: number of bedrooms, column 3: price of the house
% (47 rows of comma-separated training examples; the numeric values were lost in the original listing and are omitted here)
% Exercise 1: Linear regression with multiple variables

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X, mu, sigma] = featureNormalize(X);
% featureNormalize(X): implementation
function [X_norm, mu, sigma] = featureNormalize(X)
X_norm = X;                   % X is the matrix to be normalized
mu = zeros(1, size(X, 2));    % 1 x n zero vector (n = number of features)
sigma = zeros(1, size(X, 2)); % same as above

% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.

% std: standard deviation. std(X, 0, 1) works column-wise, std(X, 0, 2) row-wise.

mu = mean(X, 1);  % mean of each column, i.e. of all samples of one feature
sigma = std(X);   % defaults to column-wise standard deviation, same as std(X, 0, 1)
%fprintf('Debug....\n'); disp(sigma);
i = 1;
len = size(X, 2); % number of features (columns)
while i <= len,
  % Normalize each column: (all rows of the column - column mean) / (column standard deviation)
  X_norm(:,i) = (X(:,i) - mu(1,i)) / sigma(1,i);
  i = i + 1;
end
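
The while loop above can also be replaced by a fully vectorized version. Below is an alternative sketch (not the code used in this post; the name featureNormalizeVec is made up for illustration). It assumes an Octave/MATLAB version with implicit broadcasting; for older versions, bsxfun can be used instead, as shown in the comment.

% Vectorized feature normalization (alternative sketch)
function [X_norm, mu, sigma] = featureNormalizeVec(X)
  mu = mean(X, 1);            % 1 x n row vector of column means
  sigma = std(X, 0, 1);       % 1 x n row vector of column standard deviations
  X_norm = (X - mu) ./ sigma; % broadcast: subtract mean and divide by std, column by column
  % Older versions without broadcasting:
  % X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);
end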
% Add intercept term to X
X = [ones(m, 1) X];

%% ================ Part 2: Gradient Descent ================

% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.03;    % learning rate - try e.g. 0.01, 0.03, 0.1, 0.3, ...
num_iters = 400; % number of iterations (400 is the value used in the course starter code)

% Init Theta and Run Gradient Descent
theta = zeros(3, 1); % 3x1 zero vector
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
% gradientDescentMulti(): implementation
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)

% Initialize some useful values
m = length(y);               % number of training examples
feature_number = size(X, 2); % number of parameters (features including the intercept column)

J_history = zeros(num_iters, 1);
temp = zeros(feature_number, 1);

for iter = 1:num_iters
  predictions = X * theta;
  sqrError = (predictions - y);   % prediction error h(x) - y
  for i = 1:feature_number        % simultaneously update theta(i)
    temp(i) = theta(i) - (alpha / m) * sum(sqrError .* X(:,i));
  end

  for j = 1:feature_number
    theta(j) = temp(j);
  end

  % Instructions: Perform a single gradient step on the parameter vector
  %               theta.
  %
  % Hint: While debugging, it can be useful to print out the values
  %       of the cost function (computeCostMulti) and gradient here.
  %

  % ============================================================

  % Save the cost J in every iteration
  J_history(iter) = computeCostMulti(X, y, theta);
  % disp(J_history(iter));

end

end
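
gradientDescentMulti calls computeCostMulti, which the post references but does not list. A minimal sketch, assuming the usual squared-error cost from the course exercise:

% computeCostMulti(): minimal sketch of the cost function used above
function J = computeCostMulti(X, y, theta)
  m = length(y);                    % number of training examples
  errors = X * theta - y;           % prediction errors
  J = (errors' * errors) / (2 * m); % J = 1/(2m) * sum of squared errors
end

As a side note, the two inner loops in gradientDescentMulti can be replaced by a single vectorized update, theta = theta - (alpha / m) * X' * (X * theta - y), which performs the same simultaneous update.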
% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2); % '-b': draw a blue line, line width 2
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');
Tip: To compare how different learning rates affect convergence, it's helpful to plot J for several learning rates on the same figure. In Octave/MATLAB, this can be done by performing gradient descent multiple times with a 'hold on' command between plots. Concretely, if you've tried three different values of alpha (you should probably try more values than this) and stored the costs in J1, J2 and J3, you can use the following commands to plot them on the same figure:

plot(1:50, J1(1:50), 'b');
hold on;
plot(1:50, J2(1:50), 'r');
plot(1:50, J3(1:50), 'k');

The final arguments 'b', 'r', and 'k' specify different colors for the plots.
% Implementation of the Tip above: add this block to compare different learning rates
figure;
plot(1:100, J_history(1:100), '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Compare learning rates
hold on;
alpha = 0.03;
theta = zeros(3, 1);
[theta, J_history1] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history1(1:100), 'r', 'LineWidth', 2);

hold on;
alpha = 0.1;
theta = zeros(3, 1);
[theta, J_history2] = gradientDescentMulti(X, y, theta, alpha, num_iters);
plot(1:100, J_history2(1:100), 'g', 'LineWidth', 2);
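
The block above repeats the same three plotting steps by hand. A more compact alternative sketch (not from the original post) loops over several learning rates; it assumes the X, y and num_iters already defined in the script, and the alpha values and colors are only illustrative:

% Alternative sketch: loop over several learning rates and plot each cost curve
alphas = [0.01 0.03 0.1 0.3];
colors = {'b', 'r', 'g', 'k'};
figure; hold on;
for k = 1:numel(alphas)
  theta = zeros(3, 1);
  [~, J_k] = gradientDescentMulti(X, y, theta, alphas(k), num_iters);
  plot(1:50, J_k(1:50), colors{k}, 'LineWidth', 2); % first 50 iterations are enough to compare convergence
end
xlabel('Number of iterations');
ylabel('Cost J');
legend('alpha = 0.01', 'alpha = 0.03', 'alpha = 0.1', 'alpha = 0.3');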
% Use the gradient descent result to predict a new value
% (the new example must be normalized with the same mu and sigma before prediction)
price = [1, ([1650 3] - mu) ./ sigma] * theta; % matrix multiplication: predict the price from multiple features

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;
%% ================ Part 3: Normal Equations ================
% Predict the new value with the normal equation
fprintf('Solving with normal equations...\n');

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);
% normalEqn(): implementation of the normal equation
function [theta] = normalEqn(X, y)

theta = zeros(size(X, 2), 1);

% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.

theta = pinv(X' * X) * X' * y;

end
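
The closed-form solution is theta = (X'X)^(-1) X'y; pinv is used rather than inv so the computation still yields a result when X'X is singular or ill-conditioned. A common one-line alternative (not in the original post), equivalent whenever X'X is invertible, solves the normal equations directly with the backslash operator:

theta = (X' * X) \ (X' * y); % solves (X'X) * theta = X'y without forming an explicit inverse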
% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house

price = 0;
price = [1, 1650, 3] * theta; % predict the new value with the normal equation (no feature normalization needed here)

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
