Multiple Features

Linear regression with multiple variables is also known as "multivariate linear regression".

We now introduce notation for equations where we can have any number of input variables.

The multivariable form of the hypothesis function accommodating these multiple features is as follows:

h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n

In order to develop intuition about this function, we can think about θ0 as the basic price of a house, θ1 as the price per square meter, θ2 as the price per floor, etc. x1 will be the number of square meters in the house, x2 the number of floors, etc.

Using the definition of matrix multiplication, our multivariable hypothesis function can be concisely represented as:

h_\theta(x) = \begin{bmatrix} \theta_0 & \theta_1 & \cdots & \theta_n \end{bmatrix} \begin{bmatrix} x_0 \\ x_1 \\ \vdots \\ x_n \end{bmatrix} = \theta^T x

This is a vectorization of our hypothesis function for one training example; see the lessons on vectorization to learn more.
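
As a concrete illustration, here is a minimal NumPy sketch of the vectorized hypothesis for one training example (the variable names and the parameter values are illustrative, not from the course materials):

```python
import numpy as np

# theta has n+1 elements (theta_0 ... theta_n); x gets a leading
# x_0 = 1 so the two vectors match element-wise.
theta = np.array([50.0, 0.25, 10.0])  # hypothetical parameters
x = np.array([1.0, 120.0, 2.0])       # x_0 = 1, square meters, floors

h = theta @ x  # h_theta(x) = theta^T x
print(h)       # 50 + 0.25*120 + 10*2 = 100.0
```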

Remark: Note that for convenience reasons in this course we assume x_0^{(i)} = 1 for (i ∈ 1, …, m). This allows us to do matrix operations with theta and x, making the two vectors 'θ' and x^{(i)} match each other element-wise (that is, have the same number of elements: n+1).
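
For instance, prepending the x_0 = 1 column to a raw feature matrix might look like the following sketch (the data values are illustrative):

```python
import numpy as np

# m = 3 training examples, n = 2 raw features (square meters, floors)
X_raw = np.array([[120.0, 2.0],
                  [80.0, 1.0],
                  [200.0, 3.0]])

# Prepend a column of ones so x_0 = 1 for every example;
# X now has n+1 columns, matching the length of theta.
X = np.column_stack([np.ones(len(X_raw)), X_raw])
```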

Gradient Descent For Multiple Variables

The gradient descent equation itself is generally the same form; we just have to repeat it for our 'n' features:

repeat until convergence: {

\theta_j := \theta_j - \alpha \dfrac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} \qquad \text{for } j := 0, 1, \ldots, n

}
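
A minimal vectorized sketch of this update rule in NumPy, assuming a design matrix X whose first column is all ones (the function name gradient_descent is illustrative):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear regression.

    X     : (m, n+1) design matrix, first column all ones
    y     : (m,) target values
    theta : (n+1,) initial parameters
    alpha : learning rate
    """
    m = len(y)
    for _ in range(num_iters):
        # Simultaneous update of every theta_j.
        gradient = (X.T @ (X @ theta - y)) / m
        theta = theta - alpha * gradient
    return theta
```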

Gradient Descent in Practice I - Feature Scaling

Note: [6:20 - the average size of a house should be 1000, but 100 is accidentally written instead.]

We can speed up gradient descent by having each of our input values in roughly the same range. This is because θ will descend quickly on small ranges and slowly on large ranges, and so will oscillate inefficiently down to the optimum when the variables are very uneven.

The way to prevent this is to modify the ranges of our input variables so that they are all roughly the same. Ideally:

-1 \le x_{(i)} \le 1

or

-0.5 \le x_{(i)} \le 0.5

These aren't exact requirements; we are only trying to speed things up. The goal is to get all input variables into roughly one of these ranges, give or take a few.

Two techniques to help with this are feature scaling and mean normalization. Feature scaling involves dividing the input values by the range (i.e. the maximum value minus the minimum value) of the input variable, resulting in a new range of just 1. Mean normalization involves subtracting the average value for an input variable from the values for that input variable, resulting in a new average value for the input variable of just zero. To implement both of these techniques, adjust your input values as shown in this formula:

x_i := \dfrac{x_i - \mu_i}{s_i}

Where \mu_i is the average of all the values for feature (i) and s_i is the range of values (max - min), or s_i is the standard deviation.

Note that dividing by the range, or dividing by the standard deviation, give different results. The quizzes in this course use range - the programming exercises use standard deviation.

For example, if x_i represents housing prices with a range of 100 to 2000 and a mean value of 1000, then x_i := \dfrac{price - 1000}{1900}.
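
As a sketch, mean normalization with either the range or the standard deviation as the scaling factor might look like this (the helper name normalize_features and the use_std flag are illustrative):

```python
import numpy as np

def normalize_features(X, use_std=False):
    """Mean-normalize each column of X.

    Divides by the range (max - min) by default, matching the
    quizzes, or by the standard deviation if use_std is True,
    matching the programming exercises.
    """
    mu = X.mean(axis=0)
    s = X.std(axis=0) if use_std else X.max(axis=0) - X.min(axis=0)
    return (X - mu) / s, mu, s

# Housing prices with range 100..2000 and mean 1000.
prices = np.array([[100.0], [500.0], [1400.0], [2000.0]])
scaled, mu, s = normalize_features(prices)
print(scaled.ravel())  # first value: (100 - 1000) / 1900, about -0.47
```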

Gradient Descent in Practice II - Learning Rate

Note: [5:20 - the x-axis label in the right graph should be θ rather than "No. of iterations".]

Debugging gradient descent. Make a plot with number of iterations on the x-axis. Now plot the cost function J(θ) over the number of iterations of gradient descent. If J(θ) ever increases, then you probably need to decrease α.

Automatic convergence test. Declare convergence if J(θ) decreases by less than E in one iteration, where E is some small value such as 10^{-3}. However, in practice it's difficult to choose this threshold value.

It has been proven that if learning rate α is sufficiently small, then J(θ) will decrease on every iteration.
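
Putting the debugging plot and the automatic convergence test together, a minimal sketch might look like the following (it records the cost at every iteration and stops once the decrease falls below a threshold; all names are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

def cost(X, y, theta):
    """Squared-error cost J(theta) = (1/2m) * sum of squared residuals."""
    m = len(y)
    residual = X @ theta - y
    return (residual @ residual) / (2 * m)

def gradient_descent_with_history(X, y, theta, alpha, num_iters, eps=1e-3):
    m = len(y)
    history = [cost(X, y, theta)]
    for _ in range(num_iters):
        theta = theta - alpha * (X.T @ (X @ theta - y)) / m
        history.append(cost(X, y, theta))
        # Automatic convergence test: stop once the decrease is below eps.
        if history[-2] - history[-1] < eps:
            break
    return theta, history

# Plot J(theta) against the iteration number; it should decrease on
# every iteration if alpha is small enough:
# theta, history = gradient_descent_with_history(X, y, theta0, 0.01, 500)
# plt.plot(history); plt.xlabel("No. of iterations"); plt.ylabel("J(theta)")
# plt.show()
```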

To summarize:

- If α is too small: slow convergence.

- If α is too large: may not decrease on every iteration and thus may not converge.

Features and Polynomial Regression

We can improve our features and the form of our hypothesis function in a couple of different ways.

We can combine multiple features into one. For example, we can combine x1 and x2 into a new feature x3 by taking x1⋅x2.

Polynomial Regression

Our hypothesis function need not be linear (a straight line) if that does not fit the data well.

We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form). For example, if our hypothesis function is h_\theta(x) = \theta_0 + \theta_1 x_1, then we can create additional features based on x_1 to get the quadratic function h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2 or the cubic function h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2 + \theta_3 x_1^3. In the cubic version, we have created new features x_2 and x_3 where x_2 = x_1^2 and x_3 = x_1^3. To make it a square root function, we could do: h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 \sqrt{x_1}.

One important thing to keep in mind: if you choose your features this way, then feature scaling becomes very important. For example, if x_1 has range 1 - 1000, then the range of x_1^2 becomes 1 - 1000000 and that of x_1^3 becomes 1 - 1000000000.
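
A minimal sketch of building and scaling polynomial features before running gradient descent (the helper name add_polynomial_features is illustrative):

```python
import numpy as np

def add_polynomial_features(x, degree):
    """Stack x, x^2, ..., x^degree as columns of a feature matrix."""
    return np.column_stack([x ** d for d in range(1, degree + 1)])

x1 = np.array([1.0, 10.0, 100.0, 1000.0])  # raw feature, range 1 - 1000
X = add_polynomial_features(x1, degree=3)  # columns span wildly different ranges

# Feature scaling is essential here: without it, the x^3 column
# (range up to 10^9) dominates the gradient descent updates.
mu, s = X.mean(axis=0), X.std(axis=0)
X_scaled = (X - mu) / s
```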
