{"title":"The Victory of Orthogonality","authors":"G. Strang","doi":"10.23919/spa50552.2020.9241274","DOIUrl":null,"url":null,"abstract":"The equation $A x=0$ tells us that x is perpendicular to every row of $A-$ and therefore to the whole row space of A. This is fundamental, but singular vectors v in the row space do more. (1) the v’s are orthogonal (2) the vectors $u=A v$ are also orthogonal (in the column space of A). Those v’s and u’s are columns in the singular value decomposition $A V=U \\Sigma$. They are eigenvectors of $A^{T} A$ and $A A^{T}$, perfect for applications. We can list 10 reasons why orthogonal matrices like U and V are best for computation - and also for understanding. Fortunately the product of orthogonal matrices $V_{I} V_{2}$ is also an orthogonal matrix. As long as our measure of length is $\\|v\\|^{2}=v_{1}^{2}+\\ldots+v_{n}^{2}$, orthogonal vectors and orthogonal matrices will win.","PeriodicalId":157578,"journal":{"name":"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)","volume":"349 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/spa50552.2020.9241274","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The equation $Ax = 0$ tells us that $x$ is perpendicular to every row of $A$, and therefore to the whole row space of $A$. This is fundamental, but singular vectors $v$ in the row space do more: (1) the $v$'s are orthogonal, and (2) the vectors $u = Av$ are also orthogonal (in the column space of $A$). Those $v$'s and $u$'s are the columns in the singular value decomposition $AV = U\Sigma$. They are eigenvectors of $A^{T}A$ and $AA^{T}$, perfect for applications. We can list 10 reasons why orthogonal matrices like $U$ and $V$ are best for computation, and also for understanding. Fortunately, the product $V_{1}V_{2}$ of two orthogonal matrices is also an orthogonal matrix. As long as our measure of length is $\|v\|^{2} = v_{1}^{2} + \cdots + v_{n}^{2}$, orthogonal vectors and orthogonal matrices will win.
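As a minimal numerical sketch (not part of the paper), the NumPy code below checks the abstract's two orthogonality claims for an arbitrary random matrix: the right singular vectors $v_i$ are orthonormal, the vectors $Av_i = \sigma_i u_i$ are mutually orthogonal, $AV = U\Sigma$ holds, and the $v_i$ are eigenvectors of $A^{T}A$ with eigenvalues $\sigma_i^{2}$. The matrix size and random seed are illustrative choices only.

```python
import numpy as np

# An arbitrary 3x4 matrix A for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

# Reduced SVD: A = U @ diag(s) @ Vt, with orthonormal columns in U and rows in Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T  # columns of V are the right singular vectors v_i

# (1) The v's are orthonormal: V^T V = I.
assert np.allclose(V.T @ V, np.eye(V.shape[1]))

# (2) The vectors A v_i = sigma_i u_i are orthogonal, and A V = U Sigma.
AV = A @ V
assert np.allclose(AV, U * s)                 # column i of U scaled by s[i]
assert np.allclose(AV.T @ AV, np.diag(s**2))  # off-diagonal entries are zero

# The v's are eigenvectors of A^T A with eigenvalues sigma_i^2.
assert np.allclose(A.T @ A @ V, V * s**2)

print("AV = U Sigma verified; singular values:", s)
```

The same identities hold for any real matrix: since $A^{T}A = V\Sigma^{2}V^{T}$, the columns of $V$ diagonalize $A^{T}A$, which is exactly why orthogonal $U$ and $V$ make the SVD so convenient in computation.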