{"title":"0.1 Introduction","authors":"K. Bergen, K. Chavez, A. Ioannidis, S. Schmit","doi":"10.1515/9783112402245-003","DOIUrl":null,"url":null,"abstract":"Consider a function F (w) that we seek to optimize, min w F (w), which is the sum of constituent functions, F (w) = ∑n i=1 fi(w). We will be assuming n is large, w ∈ Rd, and d fits in memory on a single machine. Now, we can calculate the gradient of F (w) as a simple sum of the gradients of the constituent fi(w) functions, ∇F (w) = ∑n i=1∇fi(w), which we can then compute in O(nd). For example, if we have a least squares objective, i.e. fi(w) = (x > i w−yi), then∇fi(w) = 2(wxi−yi)xi, which is just a re-weighting of the original vector fi(w).","PeriodicalId":188022,"journal":{"name":"Foreign Investment in the Sultanate of Oman","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foreign Investment in the Sultanate of Oman","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/9783112402245-003","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Consider a function $F(w)$ that we seek to optimize, $\min_w F(w)$, where $F$ is a sum of constituent functions, $F(w) = \sum_{i=1}^{n} f_i(w)$. We assume $n$ is large, $w \in \mathbb{R}^d$, and $d$ is small enough that $w$ fits in memory on a single machine. The gradient of $F(w)$ is simply the sum of the gradients of the constituent functions, $\nabla F(w) = \sum_{i=1}^{n} \nabla f_i(w)$, which we can compute in $O(nd)$ time. For example, with a least-squares objective, $f_i(w) = (x_i^\top w - y_i)^2$, we have $\nabla f_i(w) = 2(x_i^\top w - y_i)\, x_i$, which is just a re-weighting of the original vector $x_i$.
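To make the $O(nd)$ claim concrete, here is a minimal NumPy sketch of the full-batch gradient for the least-squares case. The function name `full_gradient` and the use of a dense data matrix `X` are illustrative assumptions, not part of the source; the two matrix-vector products are what realize the $O(nd)$ cost.

```python
import numpy as np

def full_gradient(w, X, y):
    """Full-batch gradient of F(w) = sum_i (x_i^T w - y_i)^2.

    X is the n-by-d data matrix whose rows are the x_i,
    y is the length-n target vector. Both products below
    touch every entry of X once, so the total cost is O(nd).
    """
    residuals = X @ w - y        # n residuals x_i^T w - y_i, in O(nd)
    return 2 * (X.T @ residuals)  # sum_i 2 (x_i^T w - y_i) x_i, in O(nd)

# Illustrative usage on random data (not from the source):
rng = np.random.default_rng(0)
n, d = 1000, 10
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
w = np.zeros(d)
g = full_gradient(w, X, y)  # one exact gradient evaluation
```

Equivalently, one could loop over the $n$ terms and accumulate $\nabla f_i(w)$ one at a time; the vectorized form above computes the same sum but lets the linear-algebra library do the work.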