6. Eigenvalues and Eigenvectors
Keys:
- What are Eigenvalues and Eigenvectors?
- How to find Eigenvalues and Eigenvectors?
- Applications of Eigenvalues and Eigenvectors:
- Difference equation \(u_{k+1}=Au_k\)
- Solution of \(\frac{du}{dt}=Au\)
- Markov Matrices
- Projections and Fourier Series
- Special Matrix
- Symmetric Matrices
- Positive Definite Matrix
- Similar Matrices
- Jordan Theorem
6.1 Introduction to Eigenvalues and Eigenvectors
keys:
- If X lies along the same direction as AX, that is \(AX = \lambda X\), then \(\lambda\) is an eigenvalue and X is an eigenvector.
- If \(AX=\lambda X\), then \(A^2X=\lambda^2 X\), \(A^{-1}X=\lambda^{-1} X\), and \((A+cI)X=(\lambda + c) X\): all with the same eigenvector X.
- If \(AX=\lambda X\) with \(X \neq 0\), then \((A-\lambda I)X=0\), so \(A-\lambda I\) is singular and \(det(A-\lambda I)=0\) gives the eigenvalues; each eigenvalue then gives its eigenvectors.
- Check : \(\lambda_1 + \lambda_2 + \cdots + \lambda_n = a_{11} + a_{22} + \cdots + a_{nn}\) (the trace) and \(\lambda_1 \lambda_2 \cdots \lambda_n = \det A\).
- Projection matrices : \(\lambda = 1 \ and \ 0\); reflection matrices : \(\lambda = 1 \ and \ -1\); rotation matrices : \(\lambda = e^{i \theta} \ and \ e^{-i \theta}\).
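A quick numerical check of the last key, as a sketch (the particular projection, reflection, and rotation matrices below are illustrative choices, not from the notes; numpy assumed):

```python
import numpy as np

P = np.array([[0.5, 0.5], [0.5, 0.5]])             # projection onto the line y = x
R = np.array([[0.0, 1.0], [1.0, 0.0]])             # reflection across y = x
theta = np.pi / 3
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])  # rotation by theta

print(np.linalg.eigvals(P))    # 1 and 0 (order may vary)
print(np.linalg.eigvals(R))    # 1 and -1
print(np.linalg.eigvals(Rot))  # cos(theta) +/- i sin(theta) = e^{+/- i theta}
```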
The Equation for the Eigenvalues and Eigenvectors
- Compute the determinant of \(A-\lambda I\).
- Find the roots of that characteristic polynomial by solving \(det(A-\lambda I) = 0\); the roots are the eigenvalues.
- For each eigenvalue \(\lambda\), solve \((A-\lambda I)X = 0\) to find an eigenvector X.
example:
\[A = \left[ \begin{matrix} 0&1 \\ 1&0 \end{matrix} \right] \\
\Downarrow \\
solve \ \ the \ \ characteristic \ \ equation \\
det (A-\lambda I) = \left | \begin{matrix} -\lambda&1 \\ 1&-\lambda \end{matrix} \right| = \lambda^2 - 1 = 0 \\
\lambda_1 = 1 \ , \ x_1 = \left[ \begin{matrix} 1 \\ 1 \end{matrix} \right] \\
\lambda_2 = -1 \ , \ x_2 = \left[ \begin{matrix} 1 \\ -1 \end{matrix} \right] \\
check: \lambda_1 + \lambda_2 = a_{11} + a_{22} = 0,\ \ \lambda_1 \lambda_2 = det A = -1
\\
\]
\[B = \left[ \begin{matrix} 3&1 \\ 1&3 \end{matrix} \right] = A + 3I \\
\Downarrow \\
solve \ \ the \ \ characteristic \ \ equation \\
det (B-\lambda I) = \left | \begin{matrix} 3-\lambda&1 \\ 1&3-\lambda \end{matrix} \right| = \lambda^2 - 6\lambda + 8 = 0 \\
\lambda_1 = 4 \ , \ x_1 = \left[ \begin{matrix} 1 \\ 1 \end{matrix} \right] \\
\lambda_2 = 2 \ , \ x_2 = \left[ \begin{matrix} 1 \\ -1 \end{matrix} \right] \\
check: \lambda_1 + \lambda_2 = a_{11} + a_{22} = 6,\ \ \lambda_1 \lambda_2 = det B = 8
\\
\]
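The two examples above are easy to reproduce numerically. A minimal numpy sketch (numpy assumed; eigenvalue order may differ from the hand computation):

```python
import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[3.0, 1.0], [1.0, 3.0]])

for M in (A, B):
    vals, vecs = np.linalg.eig(M)      # columns of vecs are eigenvectors
    print(vals)                        # A: 1, -1    B: 4, 2
    # check: sum of eigenvalues = trace, product = determinant
    assert np.isclose(vals.sum(), np.trace(M))
    assert np.isclose(vals.prod(), np.linalg.det(M))
    # check M x = lambda x for each eigenpair
    for lam, x in zip(vals, vecs.T):
        assert np.allclose(M @ x, lam * x)
```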
If \(AX=\lambda X\), then \((A+nI)X = \lambda X + nIX = (\lambda + n)X\). If B has the same eigenvectors as A, then \((A+B)X=(\lambda_{A} + \lambda_{B})X\).
Diagonalizing a Matrix
Eigenvectors of A for n different \(\lambda's\) are independent. Then we can diagonalize A.
The columns of X are those eigenvectors \(x_1,\ldots,x_n\).
So:
\[AX = A \left[ \begin{matrix} x_1&x_2&\cdots&x_n\end{matrix} \right] \\
= \left[ \begin{matrix} \lambda_1x_1&\lambda_2x_2&\cdots&\lambda_nx_n\end{matrix} \right] \\
= \left[ \begin{matrix} x_1&x_2&\cdots&x_n\end{matrix} \right] \left[ \begin{matrix} \lambda_1&& \\
&\ddots&\\
&&\lambda_n
\end{matrix} \right] \\
=X\Lambda \\
\Downarrow \\
AX=X\Lambda \\
X^{-1}AX=\Lambda \ or \ A=X\Lambda X^{-1} \\
\Downarrow \\
A^k =(X\Lambda X^{-1})(X\Lambda X^{-1})\cdots (X\Lambda X^{-1}) = X\Lambda^k X^{-1}
\]
example:
\[A = \left[ \begin{matrix} 1&5 \\ 0&6 \end{matrix} \right] = X\Lambda X^{-1} =
\left[ \begin{matrix} 1&1 \\ 0&1 \end{matrix} \right]
\left[ \begin{matrix} 1&0 \\ 0&6 \end{matrix} \right]
\left[ \begin{matrix} 1&-1 \\ 0&1 \end{matrix} \right]
\\
\left[ \begin{matrix} 1&5 \\ 0&6 \end{matrix} \right]^k =
\left[ \begin{matrix} 1&1 \\ 0&1 \end{matrix} \right]
\left[ \begin{matrix} 1&0 \\ 0&6 \end{matrix} \right]^k
\left[ \begin{matrix} 1&-1 \\ 0&1 \end{matrix} \right] =
\left[ \begin{matrix} 1&1 \\ 0&1 \end{matrix} \right]
\left[ \begin{matrix} 1^k&0 \\ 0&6^k \end{matrix} \right]
\left[ \begin{matrix} 1&-1 \\ 0&1 \end{matrix} \right] =
\left[ \begin{matrix} 1&6^k-1 \\ 0&6^k \end{matrix} \right]
\]
When all \(|\lambda_i| < 1\), \(A^k \rightarrow 0\) as \(k \rightarrow \infty\).
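A short sketch of the same diagonalization and fast powers in numpy, reusing the matrix from the example above (numpy assumed):

```python
import numpy as np

A = np.array([[1.0, 5.0], [0.0, 6.0]])
lam, X = np.linalg.eig(A)              # eigenvalues and eigenvector matrix
Lam = np.diag(lam)

assert np.allclose(A, X @ Lam @ np.linalg.inv(X))   # A = X Lambda X^{-1}

k = 5
Ak_fast = X @ np.diag(lam**k) @ np.linalg.inv(X)    # X Lambda^k X^{-1}
assert np.allclose(Ak_fast, np.linalg.matrix_power(A, k))
print(Ak_fast)   # [[1, 6^5 - 1], [0, 6^5]] = [[1, 7775], [0, 7776]]
```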
6.2 Applications of Eigenvalues and Eigenvectors
Difference equation \(u_{k+1} = Au_k\)
Matrix Powers \(A^k\) : \(u_{k}=A^ku_0 = (X \Lambda X^{-1})(X \Lambda X^{-1})\cdots(X \Lambda X^{-1})u_0=X \Lambda^k X^{-1}u_0\)
step1 : write \(u_0\) as a combination of the eigenvectors
\[u_0 = c_1x_1 + c_2x_2 + \cdots + c_nx_n =
\left[ \begin{matrix} x_1&x_2&\cdots&x_n \end{matrix}\right]
\left[ \begin{matrix} c_1\\c_2\\\vdots\\c_n \end{matrix}\right] = Xc \\
\Downarrow \\
c = X^{-1}u_0
\]
step2~3 : multiply each eigenvector by \((\lambda_i)^k\) and recombine
\[u_k = A^ku_0 = X \Lambda^k X^{-1}u_0 = X \Lambda^k c =
\left[ \begin{matrix} x_1&x_2&\cdots&x_n \end{matrix}\right]
\left[ \begin{matrix}
(\lambda_1)^k&&& \\
&(\lambda_2)^k&& \\
&&\ddots& \\
&&&(\lambda_n)^k\end{matrix} \right]
\left[ \begin{matrix} c_1\\c_2\\\vdots\\c_n \end{matrix}\right] \\
\Downarrow \\
u_k = c_1(\lambda_1)^kx_1 + c_2(\lambda_2)^kx_2 + \cdots + c_n(\lambda_n)^kx_n
\]
It solves \(u_{k+1} = Au_k\)
example:
Fibonacci Numbers: 0,1,1,2,3,5,8,13...
\(F_{k+2}=F_{k+1}+F_{k}\)
Let \(u_k = \left[ \begin{matrix} F_{k+1}\\F_k \end{matrix}\right]\)
\[F_{k+2} = F_{k+1} + F_{k} \\
F_{k+1} = F_{k+1} \\
\Downarrow \\
u_{k+1}= \left[ \begin{matrix} 1&1\\1&0 \end{matrix} \right]u_{k} \\
\Downarrow \\
A=\left[ \begin{matrix} 1&1\\1&0 \end{matrix} \right] \\
det(A-\lambda I) = \lambda^2 - \lambda - 1 = 0 \\
\Downarrow \\
\lambda_1 = \frac{1+\sqrt{5}}{2} \approx 1.618, \ \
x_1=\left[ \begin{matrix} \lambda_1\\1\end{matrix}\right] \\
\lambda_2 = \frac{1-\sqrt{5}}{2} \approx -0.618, \ \
x_2=\left[ \begin{matrix} \lambda_2\\1\end{matrix}\right] \\
and \\
u_0 = \left[ \begin{matrix} F_1\\F_0 \end{matrix}\right] = \left[ \begin{matrix} 1\\0 \end{matrix}\right] =
c_1x_1 + c_2x_2
\rightarrow
c_1 = \frac{1}{\lambda_1 - \lambda_2}, \ c_2 = \frac{1}{\lambda_2 - \lambda_1} \\
\Downarrow \\
u_k = c_1(\lambda_1)^kx_1 + c_2(\lambda_2)^kx_2\\
u_{100} = \frac{(\lambda_1)^{100}x_1-(\lambda_2)^{100}x_2}{\lambda_1 - \lambda_2}
\]
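The same computation in numpy, as a sketch (numpy assumed; k = 30 is used instead of 100 just to keep the printed number small):

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0]])
lam, X = np.linalg.eig(A)              # eigenvalues ~ 1.618 and -0.618
u0 = np.array([1.0, 0.0])              # u_0 = [F_1, F_0]
c = np.linalg.solve(X, u0)             # expand u_0 = X c

k = 30
uk = X @ (lam**k * c)                  # u_k = X Lambda^k c = [F_{k+1}, F_k]
print(round(uk[1]))                    # F_30 = 832040

i = int(np.argmax(np.abs(lam)))        # index of the dominant eigenvalue
print(c[i] * lam[i]**k * X[1, i])      # ~ 832040: the |lambda| < 1 term has died out
```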
Solution of du/dt = Au
key : \(e^{At}\)
Taylor Series : \(e^x = 1 + x + \frac{1}{2}x^2+\cdots+\frac{1}{n!}x^n + \cdots\)
S is the eigenvector matrix of A (its columns are the eigenvectors).
\[A = S\Lambda S^{-1} \\
I = SS^{-1} \\
\Downarrow \\
e^{At} = SS^{-1} + S\Lambda S^{-1}t + \frac{1}{2}(S\Lambda S^{-1}t)^2+\cdots+\frac{1}{n!}(S\Lambda S^{-1}t)^n + \cdots \\
=S (I+ \Lambda t + \frac{1}{2}(\Lambda t)^2+\cdots+\frac{1}{n!}(\Lambda t)^n + \cdots)S^{-1} \\
\Downarrow \\
\Lambda = \left[ \begin{matrix}
\lambda_1&&& \\
&\lambda_2&& \\
&&\ddots& \\
&&&\lambda_n\end{matrix} \right] \\
e^{\Lambda t} = \left[ \begin{matrix}
e^{\lambda_1t}&&& \\
&e^{\lambda_2t}&& \\
&&\ddots& \\
&&&e^{\lambda_nt}\end{matrix} \right] \\
\Downarrow \\
e^{At}=Se^{\Lambda t}S^{-1}
\]
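A quick check of \(e^{At}=Se^{\Lambda t}S^{-1}\) against a reference matrix exponential, as a sketch (numpy and scipy assumed; the matrix is the one used in the example further below):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 2.0], [1.0, -2.0]])
t = 0.5

lam, S = np.linalg.eig(A)
eAt_eig = S @ np.diag(np.exp(lam * t)) @ np.linalg.inv(S)   # S e^{Lambda t} S^{-1}
eAt_ref = expm(A * t)                                       # reference computation

assert np.allclose(eAt_eig, eAt_ref)
print(eAt_eig)
```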
Solve Steps:
- Find the eigenvalues and eigenvectors of A by solving \(det(A-\lambda I)=0\).
- Write u(0) as a combination \(c_1x_1 + c_2x_2 + \cdots + c_nx_n\) of the eigenvectors of A.
- Multiply each eigenvector \(x_i\) by its growth factor \(e^{\lambda_i t}\).
- The solution is the combination of those pure solutions \(e^{\lambda t}x\):
\[\frac{du}{dt} = Au \\
u(t) = c_1e^{\lambda_1 t}x_1 + c_2e^{\lambda_2 t}x_2 + \cdots + c_ne^{\lambda_n t}x_n
\]
example:
\[\frac{du_1}{dt} = -u_1 + 2u_2 \\
\frac{du_2}{dt} = u_1 - 2u_2 \\
\Downarrow step1 \\
u' = Au = \left[ \begin{matrix} -1&2 \\ 1&-2 \end{matrix} \right] u \\
\lambda_1 = 0, x_1 = \left[ \begin{matrix} 2\\1 \end{matrix}\right] \\
\lambda_2 = -3, x_2 = \left[ \begin{matrix} -1\\1 \end{matrix}\right] \\
\Downarrow step2 \\
u(0) = \left[ \begin{matrix} 1\\0 \end{matrix} \right] =
c_1x_1 + c_2x_2 \\
c_1 = 1/3, c_2 = -1/3 \\
\Downarrow step3 \\
u(t) = c_1e^{\lambda_1 t}x_1 + c_2e^{\lambda_2 t}x_2
= \frac{1}{3} \left[ \begin{matrix} 2 \\ 1 \end{matrix} \right] -
\frac{1}{3} e^{-3t}\left[ \begin{matrix} -1 \\ 1 \end{matrix} \right] \\
\Downarrow steady \ \ state\\
u(\infty) = \frac{1}{3} \left[ \begin{matrix} 2 \\ 1 \end{matrix} \right]
\]
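The same example as a numpy sketch (numpy assumed; note numpy normalizes the eigenvectors, so the coefficients c differ from 1/3 and -1/3, but u(t) is unchanged):

```python
import numpy as np

A = np.array([[-1.0, 2.0], [1.0, -2.0]])
lam, X = np.linalg.eig(A)          # eigenvalues 0 and -3 (order may vary)
u0 = np.array([1.0, 0.0])
c = np.linalg.solve(X, u0)         # expand u(0) in the eigenvector basis

def u(t):
    # u(t) = sum_i c_i e^{lambda_i t} x_i
    return X @ (np.exp(lam * t) * c)

print(u(0.0))      # [1, 0]
print(u(10.0))     # ~ [2/3, 1/3]: the lambda = -3 term has decayed (steady state)
```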
State:
- Stability : \(u(t) \rightarrow 0\) when every \(e^{\lambda t} \rightarrow 0\), i.e. every eigenvalue has real part \(< 0\).
- Steady state : \(\lambda_1 = 0\) and all other eigenvalues have real part \(< 0\).
- Blow up if any eigenvalue has real part \(> 0\).
Markov Matrices
keys:
- All entries \(\geq 0\).
- Every column adds to 1.
- \(\lambda = 1\) is always an eigenvalue.
- All other eigenvalues have \(|\lambda_i| \leq 1\) (strictly \(< 1\) when the entries are positive).
- \(u_k = A^{k}u_0 = c_1\lambda_1^{k}x_1 + c_2\lambda_2^{k}x_2 + \cdots + c_n\lambda_n^{k}x_n \rightarrow c_1x_1 \ \ (steady \ \ state)\)
example: people movement model
\(u_{k+1} = Au_{k}\), where A is a Markov matrix.
\[\left [ \begin{matrix} u_{col} \\ u_{mass} \end{matrix}\right]_{t=k+1} =
\left [ \begin{matrix} 0.9&0.2 \\ 0.1&0.8 \end{matrix}\right]
\left [ \begin{matrix} u_{col} \\ u_{mass} \end{matrix}\right]_{t=k} \\
\Downarrow \\
\lambda_1 = 1, x_1=\left [ \begin{matrix} 2 \\ 1 \end{matrix}\right] \\
\lambda_2 = 0.7, x_2=\left [ \begin{matrix} -1 \\ 1 \end{matrix}\right] \\
\]
if \(\left [ \begin{matrix} u_{col} \\ u_{mass} \end{matrix}\right]_{0} = \left [ \begin{matrix} 0 \\ 1000 \end{matrix}\right]\), then \(c_1=1000/3, \ c_2=2000/3\), and
\(u_k = c_1\lambda_1^{k}x_1+c_2\lambda_2^{k}x_2 = \frac{1000}{3}1^{k}\left [ \begin{matrix} 2 \\ 1 \end{matrix}\right] + \frac{2000}{3}0.7^{k}\left [ \begin{matrix} -1 \\ 1 \end{matrix}\right] \rightarrow \frac{1000}{3}\left [ \begin{matrix} 2 \\ 1 \end{matrix}\right]\) (steady state)
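Iterating the model confirms the steady state numerically, as a sketch (numpy assumed):

```python
import numpy as np

A = np.array([[0.9, 0.2], [0.1, 0.8]])
u = np.array([0.0, 1000.0])

for k in range(50):            # 0.7^50 is negligible, so u is essentially the limit
    u = A @ u
print(u)                       # ~ [666.67, 333.33] = (1000/3) * [2, 1]

# The steady state is the lambda = 1 eigenvector, scaled to the total population:
vals, vecs = np.linalg.eig(A)
x1 = vecs[:, np.argmax(vals.real)]
print(1000 * x1 / x1.sum())    # same limit
```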
Projections and Fourier Series
Projections with orthonormal basis:
\[V = x_1q_1 + x_2q_2 + \cdots + x_nq_n =
\left [ \begin{matrix} q_1&q_2&\cdots&q_n \end{matrix}\right]
\left [ \begin{matrix} x_1\\x_2\\\vdots\\x_n \end{matrix}\right]
=QX \\
\Downarrow \\
Q^{-1}V = Q^{-1}QX \\
\Downarrow \\
Q^{T}V = X \ , \ \ i.e. \ \ x_i = q_i^{T}V
\]
Fourier series:
\(f(x) = a_0 + a_1\cos x + b_1\sin x + a_2\cos 2x + b_2\sin 2x + \cdots\)
\((1,\cos x,\sin x,\cos 2x,\sin 2x,\ldots)\) is an orthogonal basis for such periodic f(x).
check: \(f(x) = f(x+ 2\pi)\)
The inner product is \(f^Tg = \int_{0}^{2\pi}f(x)g(x)dx\), and it equals 0 for any two different basis functions f(x), g(x) taken from \(1,\cos x,\sin x,\cos 2x,\sin 2x,\ldots\)
example:
\(\int_{0}^{2\pi}f(x)\cos x\,dx= \int_{0}^{2\pi}(a_0\cos x + a_1(\cos x)^2 + b_1\cos x\sin x + \cdots)dx= a_1\int_{0}^{2\pi} (\cos x)^2 dx = a_1\pi\)
\(a_1 = \frac{1}{\pi}\int_{0}^{2\pi}f(x)\cos x\,dx\)
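A small numerical check of this orthogonality argument, as a sketch (the coefficients below are made up purely for illustration; numpy assumed):

```python
import numpy as np

a0, a1, b1, a2 = 0.3, 1.7, -0.4, 0.9
f = lambda x: a0 + a1*np.cos(x) + b1*np.sin(x) + a2*np.cos(2*x)

# approximate (1/pi) * integral_0^{2pi} f(x) cos(x) dx on a uniform grid:
x = np.linspace(0.0, 2*np.pi, 200000, endpoint=False)
a1_recovered = (f(x) * np.cos(x)).mean() * 2.0   # mean ~ (1/2pi)*integral, so *2 gives a1
print(a1_recovered)                              # ~ 1.7: all other terms integrate to zero
```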
6.3 Special Matrix
6.3.1 Symmetric Matrices
keys:
- A symmetric matrix S has n real eigenvalues \(\lambda_i\) and n orthonormal eigenvectors \(q_1,q_2,...,q_n\).
- Every real symmetric S can be diagonalized: \(S=Q \Lambda Q^{-1} = Q \Lambda Q^{T} =\left[ \begin{matrix} q_1&q_2&\cdots&q_n \end{matrix}\right]
\left[ \begin{matrix}
\lambda_1&&& \\
&\lambda_2&& \\
&&\ddots& \\
&&&\lambda_n\end{matrix} \right]
\left[ \begin{matrix} q_1^{T}\\q_2^{T}\\\vdots\\q_n^{T} \end{matrix}\right]\)
- The number of positive eigenvalues of S equals the number of positive pivots.
- Antisymmetric matrices (\(A^T = -A\)) have purely imaginary \(\lambda's\) and orthonormal (complex) q's.
example:
\[S = \left[ \begin{matrix} 1&2 \\ 2&4 \end{matrix}\right],\ \
S-\lambda I = \left[ \begin{matrix} 1-\lambda&2 \\ 2&4-\lambda \end{matrix}\right]\\
\Downarrow\\
\lambda_1 = 0, x_1=\left[ \begin{matrix} 2 \\ -1 \end{matrix}\right] \\
\lambda_2 = 5, x_2=\left[ \begin{matrix} 1 \\ 2 \end{matrix}\right] \\
\Downarrow\\
Q^{-1}SQ = Q^{T}SQ = \frac{1}{\sqrt{5}}
\left[ \begin{matrix} 2&-1 \\ 1&2 \end{matrix}\right]
\left[ \begin{matrix} 1&2 \\ 2&4 \end{matrix}\right]
\frac{1}{\sqrt{5}}\left[ \begin{matrix} 2&1 \\ -1&2 \end{matrix}\right]
=\left[ \begin{matrix} 0&0 \\ 0&5 \end{matrix}\right] = \Lambda
\]
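For symmetric matrices numpy provides `eigh`, which returns real eigenvalues and an orthonormal Q directly, so \(S = Q\Lambda Q^{T}\) can be checked in a few lines (a sketch using the example matrix above; numpy assumed):

```python
import numpy as np

S = np.array([[1.0, 2.0], [2.0, 4.0]])
lam, Q = np.linalg.eigh(S)                 # ascending eigenvalues: [0, 5]

assert np.allclose(Q.T @ Q, np.eye(2))     # the eigenvectors are orthonormal
assert np.allclose(S, Q @ np.diag(lam) @ Q.T)
print(lam)                                 # [0. 5.]
```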
6.3.2 Positive Definite Matrix
keys:
- Symmetric S is positive definite \(\Leftrightarrow\) all eigenvalues > 0 \(\Leftrightarrow\) all pivots > 0 \(\Leftrightarrow\) all upper-left determinants > 0.
- Equivalently, the symmetric S is positive definite when \(x^TSx > 0\) for all vectors \(x\neq0\).
- If A has independent columns, then \(A^TA\) is a positive definite matrix.
proof: A is m by n
\[x^T(A^TA)x = (Ax)^T(Ax) = \|Ax\|^2 \geq 0 \\
if \ \ A \ \ has \ \ rank \ \ n, \ \ then \ \ Ax \neq 0 \ \ for \ \ x \neq 0 \\
\Downarrow \\
\|Ax\|^2 > 0
\]
so \(A^TA\) is a positive definite matrix.
Then \(A^TA\) is invertible, so the least-squares solution \(\widehat{x} = (A^TA)^{-1}A^Tb\) works fine.
example:
\[S = \left[ \begin{matrix} 2&-1&0 \\ -1&2&-1 \\ 0&-1&2 \end{matrix}\right] \\
pivots : 2,\ 3/2,\ 4/3 >0 \\
upper \ \ left \ \ determinants : 2,\ 3,\ 4 >0 \\
eigenvalues : 2-\sqrt{2},\ 2,\ 2+\sqrt{2} >0 \\
f = x^TSx = 2x_1^2 + 2x_2^2 + 2x_3^2-2x_1x_2-2x_2x_3 = x_1^2 + (x_1-x_2)^2 + (x_2-x_3)^2 + x_3^2 > 0
\]
so S is a positive definite matrix.
Minimum :
First derivatives : \(\frac{\partial f}{\partial x_1} = \frac{\partial f}{\partial x_2} = \frac{\partial f}{\partial x_3} =0\)
Second derivatives : the matrix of second derivatives \(\left[\frac{\partial^2 f}{\partial x_i \partial x_j}\right]\) is positive definite
Maximum :
First derivatives : \(\frac{\partial f}{\partial x_1} = \frac{\partial f}{\partial x_2} = \frac{\partial f}{\partial x_3} =0\)
Second derivatives : the matrix of second derivatives \(\left[\frac{\partial^2 f}{\partial x_i \partial x_j}\right]\) is negative definite
When \(f = x^TSx = 2x_1^2 + 2x_2^2 + 2x_3^2-2x_1x_2-2x_2x_3 = 1\),
\(x^TSx=1\) describes an ellipsoid in \(\mathbb{R}^3\). With \(S=Q\Lambda Q^{T}\), the columns of Q give the directions of the principal axes, and the eigenvalues in \(\Lambda\) set their lengths (half-length \(1/\sqrt{\lambda_i}\) along \(q_i\)).
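The three equivalent tests for the example matrix can be verified numerically, as a sketch (numpy assumed; the Cholesky factorization at the end succeeds exactly when the matrix is positive definite):

```python
import numpy as np

S = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

print(np.linalg.eigvalsh(S))                            # 2-sqrt(2), 2, 2+sqrt(2): all > 0
print([np.linalg.det(S[:k, :k]) for k in (1, 2, 3)])    # upper-left determinants 2, 3, 4

rng = np.random.default_rng(0)
x = rng.standard_normal(3)                              # a random nonzero vector
print(x @ S @ x > 0)                                    # True

np.linalg.cholesky(S)   # raises LinAlgError for a matrix that is not positive definite
```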
6.3.3 Similar Matrices
If \(B = M^{-1}AM\) for some invertible matrix M, then A and B are similar.
example: \(A = \left [ \begin{matrix} 2&1 \\ 1&2 \end{matrix}\right]\)
Special example: A is similar to \(\Lambda\): \(S^{-1}A S = \Lambda \ or \ A=S\Lambda S^{-1} \Rightarrow \Lambda = \left [ \begin{matrix} 3&0 \\ 0&1 \end{matrix}\right]\);
Another choice of M :
\[B = M^{-1}AM =\left [ \begin{matrix} 1&-4 \\ 0&1 \end{matrix}\right]
\left [ \begin{matrix} 2&1 \\ 1&2 \end{matrix}\right]
\left [ \begin{matrix} 1&4 \\ 0&1 \end{matrix}\right]
=
\left [ \begin{matrix} -2&-15 \\ 1&6 \end{matrix}\right]
\]
\(A,\Lambda,B\) all have the same \(\lambda's\) (trace 4 and determinant 3 in this example).
- A and \(\Lambda\) have the same eigenvalues and the same eigenvectors.
- A and B have the same eigenvalues and the same number of independent eigenvectors, but different eigenvectors: \(X_B=M^{-1}X_A\).
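A sketch reproducing the example and checking the two bullet points (numpy assumed):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
M = np.array([[1.0, 4.0], [0.0, 1.0]])
B = np.linalg.inv(M) @ A @ M
print(B)                                    # [[-2, -15], [1, 6]]

print(np.linalg.eigvals(A))                 # [3, 1]
print(np.linalg.eigvals(B))                 # [3, 1] as well (order may differ)

lam, XA = np.linalg.eig(A)
XB = np.linalg.inv(M) @ XA                  # eigenvectors of B, column by column
for l, x in zip(lam, XB.T):
    assert np.allclose(B @ x, l * x)
```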
6.3.4 Jordan Theorem
Every square A is similar to a Jordan matrix J, built from Jordan blocks \(J_i\):
\[J = \left[ \begin{matrix} J_1&&& \\ &J_2&& \\ &&\ddots& \\ &&&J_d \end{matrix}\right], \ \
J_i = \left[ \begin{matrix} \lambda_i&1&& \\ &\lambda_i&\ddots& \\ &&\ddots&1 \\ &&&\lambda_i \end{matrix}\right]
\]
The number of Jordan blocks d equals the number of independent eigenvectors.
Good case : \(J=\Lambda\) (d=n, A is diagonalizable).
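Jordan form needs exact arithmetic, so floating-point numpy is the wrong tool; a sketch with sympy (assuming sympy is available; the defective matrix below is an illustrative example, not from the notes):

```python
from sympy import Matrix

A = Matrix([[5, 1],
            [0, 5]])           # repeated eigenvalue 5, only one independent eigenvector
P, J = A.jordan_form()         # A = P J P^{-1}
print(J)                       # Matrix([[5, 1], [0, 5]]): a single 2x2 Jordan block
```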