{"title":"0.1 Introduction","authors":"K. Bergen, K. Chavez, A. Ioannidis, S. Schmit","doi":"10.1515/9783112402245-003","DOIUrl":"https://doi.org/10.1515/9783112402245-003","url":null,"abstract":"Consider a function F (w) that we seek to optimize, min w F (w), which is the sum of constituent functions, F (w) = ∑n i=1 fi(w). We will be assuming n is large, w ∈ Rd, and d fits in memory on a single machine. Now, we can calculate the gradient of F (w) as a simple sum of the gradients of the constituent fi(w) functions, ∇F (w) = ∑n i=1∇fi(w), which we can then compute in O(nd). For example, if we have a least squares objective, i.e. fi(w) = (x > i w−yi), then∇fi(w) = 2(wxi−yi)xi, which is just a re-weighting of the original vector fi(w).","PeriodicalId":188022,"journal":{"name":"Foreign Investment in the Sultanate of Oman","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130081426","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}