Eigenvectors (Eigenvalues_and_eigenvectors#Graphs) and Linear Transformations
Summary:
1. Linear maps preserve the operations of addition and scalar multiplication.
2. An eigenvector keeps its direction (stays on the same line) after the linear transformation is applied.
https://en.wikipedia.org/wiki/Linear_map
Examples of linear transformation matrices
In two-dimensional space R² linear maps are described by 2 × 2 real matrices. These are some examples (each matrix written row by row):
- rotation
  - by 90 degrees counterclockwise: [[0, −1], [1, 0]]
  - by an angle θ counterclockwise: [[cos θ, −sin θ], [sin θ, cos θ]]
  - by an angle θ clockwise: [[cos θ, sin θ], [−sin θ, cos θ]]
- reflection
  - about the x axis: [[1, 0], [0, −1]]
  - about the y axis: [[−1, 0], [0, 1]]
  - about a line making an angle θ with the x axis: [[cos 2θ, sin 2θ], [sin 2θ, −cos 2θ]]
- scaling by 2 in all directions: [[2, 0], [0, 2]]
- horizontal shear mapping: [[1, m], [0, 1]]
- squeeze mapping: [[k, 0], [0, 1/k]]
- projection onto the y axis: [[0, 0], [0, 1]]
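These matrices can be checked by applying them to a sample vector; a minimal NumPy sketch (the angle θ = π/4, the shear factor m = 1, and the test vector are arbitrary choices for illustration):

```python
import numpy as np

theta = np.pi / 4  # arbitrary angle for the theta examples

maps = {
    "rotate 90 ccw":    np.array([[0.0, -1.0], [1.0, 0.0]]),
    "rotate theta ccw": np.array([[np.cos(theta), -np.sin(theta)],
                                  [np.sin(theta),  np.cos(theta)]]),
    "reflect x axis":   np.array([[1.0, 0.0], [0.0, -1.0]]),
    "scale by 2":       np.array([[2.0, 0.0], [0.0, 2.0]]),
    "shear (m=1)":      np.array([[1.0, 1.0], [0.0, 1.0]]),
    "project y axis":   np.array([[0.0, 0.0], [0.0, 1.0]]),
}

v = np.array([1.0, 2.0])
for name, A in maps.items():
    # each map acts on v by matrix multiplication
    print(f"{name:>16}: {A @ v}")
```

For example, rotating (1, 2) by 90° counterclockwise gives (−2, 1), and projecting it onto the y axis gives (0, 2).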
In mathematics, a linear map (also called a linear mapping, linear transformation or, in some contexts, linear function) is a mapping V → W between two modules (including vector spaces) that preserves (in the sense defined below) the operations of addition and scalar multiplication.
An important special case is when V = W, in which case the map is called a linear operator,[1] or an endomorphism of V. Sometimes the term linear function has the same meaning as linear map, while in analytic geometry it does not.
A linear map always maps linear subspaces onto linear subspaces (possibly of a lower dimension);[2] for instance it maps a plane through the origin to a plane, straight line or point. Linear maps can often be represented as matrices, and simple examples include rotation and reflection linear transformations.
In the language of abstract algebra, a linear map is a module homomorphism. In the language of category theory it is a morphism in the category of modules over a given ring.
Definition and first consequences
Let V and W be vector spaces over the same field K. A function f : V → W is said to be a linear map if for any two vectors u, v ∈ V and any scalar c ∈ K the following two conditions are satisfied:
- additivity / operation of addition: f(u + v) = f(u) + f(v)
- homogeneity of degree 1 / operation of scalar multiplication: f(cu) = c f(u)
Thus, a linear map is said to be operation preserving. In other words, it does not matter whether you apply the linear map before or after the operations of addition and scalar multiplication.
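The two conditions can be verified numerically for a map f(x) = Ax; a quick sketch in which the matrix, vectors, and scalar are arbitrary made-up values:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])  # an arbitrary linear map f(x) = A x
u = np.array([1.0, -1.0])
v = np.array([4.0, 2.0])
c = 2.5

f = lambda x: A @ x

additive = np.allclose(f(u + v), f(u) + f(v))   # f(u + v) == f(u) + f(v)
homogeneous = np.allclose(f(c * u), c * f(u))   # f(cu) == c f(u)
print(additive, homogeneous)  # → True True
```

Both conditions hold no matter which matrix, vectors, or scalar are chosen, because matrix multiplication distributes over addition and commutes with scalar multiplication.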
This is equivalent to requiring the same for any linear combination of vectors, i.e. that for any vectors v1, …, vn and scalars a1, …, an the following equality holds:[3][4]

f(a1 v1 + ⋯ + an vn) = a1 f(v1) + ⋯ + an f(vn)
Denoting the zero elements of the vector spaces V and W by 0_V and 0_W respectively, it follows that f(0_V) = 0_W. Let c = 0 and v = 0_V in the equation for homogeneity of degree 1:

f(0_V) = f(0 · 0_V) = 0 · f(0_V) = 0_W
Occasionally, V and W can be considered to be vector spaces over different fields. It is then necessary to specify which of these ground fields is being used in the definition of "linear". If V and W are considered as spaces over the field K as above, we talk about K-linear maps. For example, the conjugation of complex numbers is an R-linear map C → C, but it is not C-linear.

A linear map f : V → K with K viewed as a vector space over itself is called a linear functional.[5]
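The remark about complex conjugation can be checked directly: conjugation is additive and commutes with real scalars, but a complex scalar comes out conjugated. A small demonstration with arbitrary values:

```python
# Complex conjugation f(z) = z-bar, viewed as a map C -> C
f = lambda z: z.conjugate()

z, w = 1 + 2j, 3 - 1j

# Additive, and R-linear: real scalars pass straight through
assert f(z + w) == f(z) + f(w)
assert f(2.0 * z) == 2.0 * f(z)

# Not C-linear: f(cz) equals conj(c) * f(z), not c * f(z)
c = 1j
print(f(c * z) == c * f(z))              # → False
print(f(c * z) == c.conjugate() * f(z))  # → True
```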
These statements generalize without modification to any left-module over a ring, and to any right-module upon reversing the scalar multiplication.
https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Graphs
Given a linear transformation A, an eigenvector v of A (also translated in Chinese as 固有向量 or 本征向量) is a vector that, after the linear transformation [1] is applied, still lies on the same line as the original v, although its length or direction may change. That is,
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that does not change its direction when that linear transformation is applied to it. More formally, if T is a linear transformation from a vector space V over a field F into itself and v is a vector in V that is not the zero vector, then v is an eigenvector of T if T(v) is a scalar multiple of v. This condition can be written as the equation
- T(v) = λv,
where λ is a scalar in the field F, known as the eigenvalue, characteristic value, or characteristic root associated with the eigenvector v.
If the vector space V is finite-dimensional, then the linear transformation T can be represented as a square matrix A, and the vector v by a column vector, rendering the above mapping as a matrix multiplication on the left hand side and a scaling of the column vector on the right hand side in the equation
- Av = λv.
There is a correspondence between n by n square matrices and linear transformations from an n-dimensional vector space to itself. For this reason, it is equivalent to define eigenvalues and eigenvectors using either the language of matrices or the language of linear transformations.[1][2]
Geometrically an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction that is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.[3]
math.mit.edu/~gs/linearalgebra/ila0601.pdf
A^100 was found by using the eigenvalues of A, not by multiplying 100 matrices.
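The idea: if A = XΛX⁻¹, then A^100 = XΛ^100 X⁻¹, so only the scalar eigenvalues need to be raised to the 100th power. A sketch with a Markov-type matrix (the specific matrix is an illustrative assumption, not taken from the linked notes):

```python
import numpy as np

A = np.array([[0.8, 0.3], [0.2, 0.7]])  # illustrative Markov-type matrix

# Diagonalize: A = X diag(lam) X^{-1}
lam, X = np.linalg.eig(A)

# A^100 via eigenvalues: only the scalars are powered
A100 = X @ np.diag(lam ** 100) @ np.linalg.inv(X)

# Same answer as repeated multiplication, at a fraction of the cost
assert np.allclose(A100, np.linalg.matrix_power(A, 100))
print(A100)
```

Here the eigenvalues are 1 and 0.5; since 0.5^100 is essentially zero, A^100 is dominated by the λ = 1 eigenvector, which is exactly why powers of such matrices converge to a steady state.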
- Av = λv,

where λ is a scalar: the factor by which the eigenvector's length is scaled under the linear transformation, called the eigenvalue (本征值) associated with v. If the eigenvalue is positive, v keeps its direction after the transformation; if it is negative, the direction is reversed; if it is zero, the vector is collapsed to the origin. In every case, the result still lies on the same line.
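The three sign cases can be seen with hand-picked diagonal matrices, each of which has (1, 0) as an eigenvector:

```python
import numpy as np

v = np.array([1.0, 0.0])  # eigenvector along the x axis in every case below

stretch = np.array([[2.0, 0.0], [0.0, 1.0]])   # lambda = 2: same direction, longer
flip    = np.array([[-1.0, 0.0], [0.0, 1.0]])  # lambda = -1: direction reversed
crush   = np.array([[0.0, 0.0], [0.0, 1.0]])   # lambda = 0: collapsed to the origin

print(stretch @ v)  # → [2. 0.]
print(flip @ v)     # → [-1. 0.]
print(crush @ v)    # → [0. 0.]
```

In all three cases the image is a scalar multiple of v, so it stays on the x axis.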