What is an eigenvector of a covariance matrix?
Loosely speaking, the eigenvectors of a covariance matrix are the directions along which the data varies the most. (More precisely, the first eigenvector is the direction in which the data varies the most; the second eigenvector is the direction of greatest variance among those that are orthogonal (perpendicular) to the first eigenvector; the third eigenvector is the direction of greatest variance among those orthogonal to the first two; and so on.)
Here is an example in 2 dimensions [1]:
Each data sample is a 2-dimensional point with coordinates x, y. The eigenvectors of the covariance matrix of these data samples are the vectors u and v; u, the longer arrow, is the first eigenvector and v, the shorter arrow, is the second. (The eigenvalues correspond to the lengths of the arrows.) As you can see, the first eigenvector points (from the mean of the data) in the direction in which the data varies the most, and the second eigenvector is orthogonal (perpendicular) to the first.
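To reproduce this picture numerically, here is a minimal sketch (assuming NumPy and made-up correlated Gaussian toy data; the variable names mirror the figure):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D Gaussian samples (toy data; rows are samples).
data = rng.multivariate_normal(mean=[0, 0],
                               cov=[[3.0, 1.5],
                                    [1.5, 1.0]],
                               size=1000)

# Covariance matrix of the (x, y) coordinates.
A = np.cov(data, rowvar=False)

# eigh handles symmetric matrices; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)

u = eigvecs[:, -1]  # first eigenvector: direction of greatest variance
v = eigvecs[:, -2]  # second eigenvector: orthogonal to u

print("eigenvalues:", eigvals[::-1])
print("u (longer arrow):", u)
print("v (shorter arrow):", v)
```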
It's a little trickier to visualize in 3 dimensions, but here's an attempt [2]:
In this case, imagine that all of the data points lie within the ellipsoid. $v_1$, the direction in which the data varies the most, is the first eigenvector ($\lambda_1$ is the corresponding eigenvalue). $v_2$ is the direction in which the data varies the most among those directions that are orthogonal to $v_1$. And $v_3$ is the direction of greatest variance among those directions orthogonal to both $v_1$ and $v_2$ (in three dimensions there is only one such direction).
[1] Image taken from Duncan Gillies's lecture on Principal Component Analysis.
[2] Image taken from "Fiber Crossing in Human Brain Depicted with Diffusion Tensor MR Imaging".

Suppose we have random variables with covariance matrix $A$, and identify each linear combination of them with its vector of coefficients. It turns out that the covariance of two such vectors $x$ and $y$ can be written as $\mathrm{Cov}(x, y) = x^T A y$. In particular, $\mathrm{Var}(x) = x^T A x$. This means that covariance is a bilinear form.
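As a quick numeric sanity check of this identity (a minimal sketch of my own; the data and the coefficient vectors are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples of an underlying random vector (rows = observations, 3 variables).
X = rng.multivariate_normal([0, 0, 0],
                            [[2.0, 0.6, 0.3],
                             [0.6, 1.0, 0.2],
                             [0.3, 0.2, 0.5]],
                            size=100_000)
A = np.cov(X, rowvar=False)

# Coefficient vectors defining two linear combinations of the variables.
x = np.array([1.0, -2.0, 0.5])
y = np.array([0.0, 1.0, 1.0])

# Sample covariance of the two scalar combinations ...
lhs = np.cov(X @ x, X @ y)[0, 1]
# ... versus the bilinear form x^T A y.
rhs = x @ A @ y

print(lhs, rhs)  # agree up to sampling noise
```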
Now, since $A$ is a real symmetric matrix, there is an orthonormal basis for $\mathbb{R}^n$ consisting of eigenvectors of $A$. Orthonormal here means that each vector has norm 1 and the vectors are pairwise orthogonal; because they are eigenvectors, they are also orthogonal with respect to $A$, that is, $v_1^T A v_2 = \lambda_2 v_1^T v_2 = 0$, or $\mathrm{Cov}(v_1, v_2) = 0$.
Next, suppose $v$ is a unit eigenvector of $A$ with eigenvalue $\lambda$. Then $\mathrm{Var}(v) = v^T A v = \lambda v^T v = \lambda \|v\|^2 = \lambda$.
There are a couple of interesting conclusions we can draw from this. First, since the eigenvectors form a basis $\{v_1, \ldots, v_n\}$, every linear combination of the original random variables can be represented as a linear combination of the uncorrelated random variables $v_i$. Second, every unit vector's variance is a weighted average of the eigenvalues: writing a unit vector as $x = \sum_i c_i v_i$ with $\sum_i c_i^2 = 1$, we get $\mathrm{Var}(x) = x^T A x = \sum_i \lambda_i c_i^2$. This means that the leading eigenvector is the direction of greatest variance, the next eigenvector has the greatest variance in the orthogonal subspace, and so on.
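Here is a short self-contained check of these facts (my own sketch with a made-up covariance matrix $A$; nothing here comes from the original answer):

```python
import numpy as np

rng = np.random.default_rng(2)

A = np.array([[2.0, 0.6, 0.3],
              [0.6, 1.0, 0.2],
              [0.3, 0.2, 0.5]])   # a covariance matrix
eigvals, eigvecs = np.linalg.eigh(A)
v1, v2 = eigvecs[:, -1], eigvecs[:, -2]

# The eigenvector combinations are uncorrelated: v1^T A v2 = 0 ...
print(v1 @ A @ v2)                        # ~0
# ... and each one's variance is its eigenvalue: v^T A v = lambda.
print(v1 @ A @ v1, eigvals[-1])

# Any unit vector's variance is a weighted average of the eigenvalues.
x = rng.standard_normal(3)
x /= np.linalg.norm(x)
c = eigvecs.T @ x                         # coordinates of x in the eigenbasis
print(x @ A @ x, np.sum(c**2 * eigvals))  # equal; the weights c**2 sum to 1
```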
So, to sum up, the eigenvectors are uncorrelated linear combinations of the original set of random variables.
The primary application of this is Principal Component Analysis (PCA). If you have n features, you can find the eigenvectors of the covariance matrix of the features. This allows you to represent the data with uncorrelated features. Moreover, the eigenvalues tell you the amount of variance in each new feature, allowing you to choose the subset of features that retains the most information about your data.
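As a sketch of that workflow (assumptions: plain NumPy on toy data; library PCA implementations such as scikit-learn's do the same thing under the hood):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: 500 samples, 5 correlated features.
B = rng.standard_normal((5, 5))
data = rng.standard_normal((500, 5)) @ B

# Center the data, then eigendecompose the covariance matrix of the features.
centered = data - data.mean(axis=0)
A = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(A)

# Sort descending and keep the k directions with the most variance.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = 2
projected = centered @ eigvecs[:, :k]   # new, uncorrelated features

explained = eigvals[:k].sum() / eigvals.sum()
print(f"top {k} components retain {explained:.1%} of the variance")
```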
Now, if the direction of largest variance is axis-aligned (the covariances are zero), then the eigenvalues simply correspond to the variances of the data along the axes.
It becomes a little more complicated if the covariance matrix is not diagonal, so that the covariances are not zero. In this case, the principal components (the directions of largest variance) do not coincide with the axes, and the data is rotated. The eigenvalues still correspond to the spread of the data along the directions of largest variance, whereas the diagonal entries of the covariance matrix still define the spread of the data along the axes.
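A small sketch contrasting the two cases (mine, not from the linked article): for a diagonal covariance matrix the eigenvalues are just the per-axis variances, while for rotated data the eigenvalues follow the principal directions rather than the axes:

```python
import numpy as np

# Axis-aligned case: diagonal covariance, eigenvalues = per-axis variances.
D = np.diag([4.0, 1.0])
print(np.linalg.eigvalsh(D))   # [1. 4.]

# Rotated case: same spread, but the principal directions are rotated 30 degrees.
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
C = R @ D @ R.T                # covariance is no longer diagonal
print(np.linalg.eigvalsh(C))   # still [1. 4.]: spread along the principal
                               # directions is unchanged by the rotation
print(np.diag(C))              # [3.25 1.75]: axis variances now differ
                               # from the eigenvalues
```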
An in-depth discussion of how the covariance matrix can be interpreted from a geometric point of view (and the source of the above images) can be found at: A geometric interpretation of the covariance matrix.

The eigenvectors give the linear combinations of the original variables that are mutually uncorrelated.