Machine Learning Notes 1 - Linear Regression with Multiple Variables (Week 2)
1. Multiple Features
Note: x0 is defined to equal 1 (the intercept term)
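With multiple features the hypothesis is h(x) = theta0*x0 + theta1*x1 + ... + thetan*xn = theta^T x, with x0 = 1. A minimal sketch (the theta values and the single training example below are made up for illustration):

```python
import numpy as np

# Hypothetical parameters and one example with n = 2 features.
theta = np.array([80.0, 0.1, 50.0])   # theta0, theta1, theta2
x = np.array([1.0, 2000.0, 3.0])      # x0 = 1 (intercept), size, #bedrooms

# h_theta(x) = theta^T x, computed as a dot product
h = theta @ x
print(h)  # 80 + 0.1*2000 + 50*3 = 430.0
```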
2. Feature Scaling
Idea: make sure features are on a similar scale, roughly in the range -1 <= xi <= 1
For example:
x1 = size (0-2000 feet^2), scaled by the range (max - min) or by the standard deviation
x2 = number of bedrooms(1-5)
The contour plot of J over theta1 and theta2 is a very skewed elliptical shape, and
if you run gradient descent on it, the iterates may take a long time
to find the global minimum.
Cure:
x1 = size (0-2000 feet^2) / 2000
x2 = number of bedrooms(1-5)/5
so the contour plot gives a much more direct path to the minimum
Mean normalization:
Replace xi with xi - mu_i (the mean of feature i) so the features have zero mean (do not apply this to x0)
Eg:
X1 = (size-1000)/2000
X2= (#bedrooms-2)/5
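Mean normalization plus range scaling can be sketched as below; the training data here is made up for illustration:

```python
import numpy as np

# Hypothetical training set: columns are size (feet^2) and #bedrooms.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])

mu = X.mean(axis=0)                   # per-feature mean
rng = X.max(axis=0) - X.min(axis=0)   # max - min (the std would also work)

# Subtract the mean, then divide by the range: zero-mean, roughly [-1, 1].
X_norm = (X - mu) / rng
print(X_norm.mean(axis=0))  # ~[0, 0]
```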
3. Learning Rate
We can plot J(theta) versus the number of iterations; J(theta) should
decrease after every iteration, and the plot also shows whether gradient descent has converged.
If gradient descent is not working (J(theta) increases or oscillates), it usually means
you should use a smaller value of alpha (the learning rate).
To choose alpha, try values spaced about 3x apart:
..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1...
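The convergence check above can be sketched as follows: run batch gradient descent, record J(theta) every iteration, and confirm it never increases (if it does, lower alpha). The data and alpha below are made-up illustrative values:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=200):
    """Batch gradient descent; returns theta and the J(theta) history."""
    m, n = X.shape
    theta = np.zeros(n)
    J_hist = []
    for _ in range(iters):
        err = X @ theta - y                  # h_theta(x) - y for all examples
        J_hist.append((err @ err) / (2 * m)) # J(theta) before the update
        theta -= (alpha / m) * (X.T @ err)   # simultaneous update of all thetas
    return theta, J_hist

# Hypothetical scaled data; first column is the x0 = 1 intercept.
X = np.array([[1.0, -0.5], [1.0, 0.0], [1.0, 0.5], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])          # exactly y = 2 + 2*x1

theta, J_hist = gradient_descent(X, y)
# J(theta) should decrease after every iteration and approach 0 here.
print(J_hist[0], J_hist[-1])
```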
4. Features
You can try to define new features, for example:
area = frontage * depth
Polynomial regression:
we can set x1 = size, x2 = (size)^2, x3 = (size)^3 (remember to apply feature scaling),
and the model becomes linear regression in the new features
5. Normal Equations
Idea: solve for theta analytically: theta = (X^T X)^(-1) X^T y,
where X is the m x (n+1) design matrix and y is an m-dimensional vector;
n: number of features, m: number of training examples.
Feature scaling is not necessary for the normal equation.
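The closed-form solution can be sketched as below (the tiny dataset is made up; using the pseudo-inverse `np.linalg.pinv` also gives a sensible answer when X^T X is not invertible):

```python
import numpy as np

# Hypothetical data: X already includes the x0 = 1 intercept column.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])   # exactly y = 2*x1

# theta = (X^T X)^(-1) X^T y; pinv handles a singular X^T X gracefully.
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)  # ~[0, 2]
```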
Gradient descent
1. choose alpha
2. need many iterations
3. works well even when the number of features n is large.
Normal equation:
1. no need to choose alpha or to iterate
2. needs to compute the matrix inverse (X^T X)^(-1), which costs roughly O(n^3)
3. slow for large n (e.g., n = 10^6)
Note: if X^T X is not invertible, it usually means that:
1. you have redundant features (linearly dependent, e.g., one feature is a multiple of another)
2. there are too many features (m <= n); delete some features, or use regularization