{"title":"正交的胜利","authors":"G. Strang","doi":"10.23919/spa50552.2020.9241274","DOIUrl":null,"url":null,"abstract":"The equation $A x=0$ tells us that x is perpendicular to every row of $A-$ and therefore to the whole row space of A. This is fundamental, but singular vectors v in the row space do more. (1) the v’s are orthogonal (2) the vectors $u=A v$ are also orthogonal (in the column space of A). Those v’s and u’s are columns in the singular value decomposition $A V=U \\Sigma$. They are eigenvectors of $A^{T} A$ and $A A^{T}$, perfect for applications. We can list 10 reasons why orthogonal matrices like U and V are best for computation - and also for understanding. Fortunately the product of orthogonal matrices $V_{I} V_{2}$ is also an orthogonal matrix. As long as our measure of length is $\\|v\\|^{2}=v_{1}^{2}+\\ldots+v_{n}^{2}$, orthogonal vectors and orthogonal matrices will win.","PeriodicalId":157578,"journal":{"name":"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)","volume":"349 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Victory of Orthogonality\",\"authors\":\"G. Strang\",\"doi\":\"10.23919/spa50552.2020.9241274\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The equation $A x=0$ tells us that x is perpendicular to every row of $A-$ and therefore to the whole row space of A. This is fundamental, but singular vectors v in the row space do more. (1) the v’s are orthogonal (2) the vectors $u=A v$ are also orthogonal (in the column space of A). Those v’s and u’s are columns in the singular value decomposition $A V=U \\\\Sigma$. They are eigenvectors of $A^{T} A$ and $A A^{T}$, perfect for applications. We can list 10 reasons why orthogonal matrices like U and V are best for computation - and also for understanding. Fortunately the product of orthogonal matrices $V_{I} V_{2}$ is also an orthogonal matrix. As long as our measure of length is $\\\\|v\\\\|^{2}=v_{1}^{2}+\\\\ldots+v_{n}^{2}$, orthogonal vectors and orthogonal matrices will win.\",\"PeriodicalId\":157578,\"journal\":{\"name\":\"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)\",\"volume\":\"349 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.23919/spa50552.2020.9241274\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/spa50552.2020.9241274","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The equation $Ax = 0$ tells us that $x$ is perpendicular to every row of $A$, and therefore to the whole row space of $A$. This is fundamental, but singular vectors $v$ in the row space do more: (1) the $v$'s are orthogonal, and (2) the vectors $u = Av$ are also orthogonal (in the column space of $A$). Those $v$'s and $u$'s are columns in the singular value decomposition $AV = U\Sigma$. They are eigenvectors of $A^{T}A$ and $AA^{T}$, perfect for applications. We can list 10 reasons why orthogonal matrices like $U$ and $V$ are best for computation, and also for understanding. Fortunately, the product $V_{1}V_{2}$ of two orthogonal matrices is also an orthogonal matrix. As long as our measure of length is $\|v\|^{2} = v_{1}^{2} + \ldots + v_{n}^{2}$, orthogonal vectors and orthogonal matrices will win.
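A minimal numerical sketch of these claims, assuming NumPy; the random test matrix $A$ and the orthogonal factors $Q_1, Q_2$ are illustrative choices, not from the abstract. It checks that the right singular vectors are orthonormal, that the vectors $u = Av$ are orthogonal in the column space, that $AV = U\Sigma$, and that products of orthogonal matrices preserve both orthogonality and length.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))          # a generic 4x3 test matrix (illustrative)

# Reduced SVD: A = U Sigma V^T, with orthonormal columns in U and V
U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# (1) The v's are orthonormal: V^T V = I
assert np.allclose(V.T @ V, np.eye(3))

# (2) The vectors u_i = A v_i are orthogonal in the column space of A,
#     and column by column A V = U Sigma
AV = A @ V
assert np.allclose(AV, U * sigma)                  # A V = U Sigma
assert np.allclose(AV.T @ AV, np.diag(sigma**2))   # orthogonal columns

# The v's are eigenvectors of A^T A (with eigenvalues sigma_i^2)
assert np.allclose(A.T @ A @ V, V * sigma**2)

# The product of orthogonal matrices Q1 Q2 is orthogonal
Q1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Q2, _ = np.linalg.qr(rng.standard_normal((3, 3)))
assert np.allclose((Q1 @ Q2).T @ (Q1 @ Q2), np.eye(3))

# Orthogonal matrices preserve the length ||v||^2 = v_1^2 + ... + v_n^2
v = rng.standard_normal(3)
assert np.allclose(np.linalg.norm(Q1 @ v), np.linalg.norm(v))
```

The length-preservation check at the end is exactly why the Euclidean measure $\|v\|^{2} = v_{1}^{2} + \ldots + v_{n}^{2}$ matters: multiplication by an orthogonal matrix leaves that norm unchanged, which is what makes orthogonal matrices numerically safe in computation.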