{"title":"Proceedings of a symposium on Compiler optimization","authors":"R. S. Northcote","doi":"10.1145/800028","DOIUrl":"https://doi.org/10.1145/800028","url":null,"abstract":"","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126643936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Compilation of optimized syntactic recognizers from Floyd-Evans productions","authors":"H. R. Haynes, L. Schutte","doi":"10.1145/800028.808483","DOIUrl":"https://doi.org/10.1145/800028.808483","url":null,"abstract":"Floyd-Evans productions are becoming increasingly popular as the metalanguage to be used in describing the syntactic analysis phase of programming language processors. Techniques for compiling optimized syntactic recognizers from Floyd-Evans productions are presented. Such recognizers promise to yield significant gains in recognition speed with no increase in storage requirements when compared to table-driven interpretive recognizers. The compiled recognizers can be described in terms of macros that are essentially machine-independent.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122715293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Global common subexpression elimination","authors":"J. Cocke","doi":"10.1145/800028.808480","DOIUrl":"https://doi.org/10.1145/800028.808480","url":null,"abstract":"When considering compiler optimization, there are two questions that immediately come to mind; one, why and to what extent is optimization necessary and two, to what extent is it possible. When considering the second question, one might immediately become discouraged since it is well known that the program equivalency problem is recursively unsolvable. It is, of course, clear from this that there will never be techniques for generating a completely optimum program. These unsolvability results, however, do not preclude the possibility of ad hoc techniques for program improvement or even a partial theory which produces a class of equivalent programs optimized in varying degrees. The reasons why optimization is required seem to me to fall in two major categories. The first I will call “local” and the second “global”.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134569581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effectiveness of an optimizing compiler for arithmetic expressions","authors":"James A. Painter","doi":"10.1145/800028.808487","DOIUrl":"https://doi.org/10.1145/800028.808487","url":null,"abstract":"This paper defines the notion of effectiveness of an optimizing compiler and presents a proof that a simple optimizing compiler is effective. An optimizing compiler typically consists of a basic compiler and a set of optimizations for special cases. The fundamental notion of effectiveness is that the basic compiler is correct, all of the optimization transformations preserve correctness, and produce essentially equivalent programs which have a smaller value relative to a specified weighting function.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131798182","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Local optimizations","authors":"J. T. Bagwell","doi":"10.1145/800028.808484","DOIUrl":"https://doi.org/10.1145/800028.808484","url":null,"abstract":"Some local optimizations (as opposed to global optimizations) are presented and a suitable one-pass compiler design for using them is shown. Optimizations shown are divided into machine dependent and independent classes with examples of each. There is some discussion of a subscript calculation technique which is an improvement over the usual technique, and a discussion of the best way to raise a quantity to a known small constant power by inline code. Various register allocation criteria are mentioned.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116722557","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A formal approach to code optimization","authors":"A. Aho, R. Sethi, J. Ullman","doi":"10.1145/800028.808486","DOIUrl":"https://doi.org/10.1145/800028.808486","url":null,"abstract":"We examine from a formal point of view some problems which arise in code optimization and present some of the results which can come from such an approach. Specifically, a set of transformations which characterize optimization algorithms for straight line code is presented. Then we present an algorithm which produces machine code for evaluating arithmetic expressions on machines with N ≥ 1 general purpose registers. We can prove that this algorithm produces optimal code when the cost criterion is the length of machine code generated.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127804426","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Control flow analysis","authors":"F. Allen","doi":"10.1145/800028.808479","DOIUrl":"https://doi.org/10.1145/800028.808479","url":null,"abstract":"Any static, global analysis of the expression and data relationships in a program requires a knowledge of the control flow of the program. Since one of the primary reasons for doing such a global analysis in a compiler is to produce optimized programs, control flow analysis has been embedded in many compilers and has been described in several papers. An early paper by Prosser [5] described the use of Boolean matrices (or, more particularly, connectivity matrices) in flow analysis. The use of “dominance” relationships in flow analysis was first introduced by Prosser and much expanded by Lowry and Medlock [6]. References [6,8,9] describe compilers which use various forms of control flow analysis for optimization. Some recent developments in the area are reported in [4] and in [7]. The underlying motivation in all the different types of control flow analysis is the need to codify the flow relationships in the program. The codification may be in connectivity matrices, in predecessor-successor tables, in dominance lists, etc. Whatever the form, the purpose is to facilitate determining what the flow relationships are; in other words to facilitate answering such questions as: is this an inner loop?, if an expression is removed from the loop where can it be correctly and profitably placed?, which variable definitions can affect this use? In this paper the basic control flow relationships are expressed in a directed graph. Various graph constructs are then found and shown to codify interesting global relationships.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134060908","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimization problems in extensible compilers","authors":"Bruce J. Bolas","doi":"10.1145/800028.808488","DOIUrl":"https://doi.org/10.1145/800028.808488","url":null,"abstract":"Some observations are made about the special problems of optimization in compilers for extensible languages. Different kinds of extension are defined, and their impact on the processes of compiler optimization are briefly discussed. The problems of achieving good optimization for such languages are related to the problems of formalizing the definition of programming languages in general.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128434898","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimization aspects of compiler-compilers","authors":"T. Cheatham, Thomas A. Standish","doi":"10.1145/800028.808482","DOIUrl":"https://doi.org/10.1145/800028.808482","url":null,"abstract":"A decade of experience with prototype versions of compiler-compilers, some of which have been successful and some of which have not been so successful, leads us to the conclusion that we can now engineer good compiler-compilers which can generate efficient compilers that generate efficient object code. This paper reviews the experience with compiler-compilers, documents reasons for believing that some compiler-compilers are good, and advocates the importation of compiler-compiler techniques by commercial firms and the production of well-engineered compiler-compilers as commercial products. Some unanswered questions about compiler-compiler techniques are explored in relation to the newly emerging discipline of software engineering.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"199 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116578319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design characteristics of the WATFOR compiler","authors":"D. Cowan, J. W. Graham","doi":"10.1145/800028.808481","DOIUrl":"https://doi.org/10.1145/800028.808481","url":null,"abstract":"About 1960, it became fashionable to introduce computer techniques into many of the courses being taught at the university level. These courses tended to be technically oriented (Engineering, Science, Mathematics), and the language most often used was one of the versions of FORTRAN. Students were introduced to computing by a brief course in FORTRAN, and then were expected to apply their newly-discovered knowledge to the solution of numerous problems related to some discipline. Introducing large numbers of students to computing in this manner created an entirely new type of demand for computer services. These new demands for computer services had to satisfy the following needs. (i) The programmers were not professionals; thus, the proportion of errors in a given number of written statements was higher than usual. (ii) The programs themselves were often quite short, usually 30 to 50 statements in length. (iii) The volume of submitted programs was very high, in the order of hundreds of thousands per day. (iv) The debugged program tended to be run in production only once, and was set aside as a completed assignment.","PeriodicalId":399752,"journal":{"name":"Proceedings of a symposium on Compiler optimization","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1970-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126987621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}