{"title":"Modular Construction of Shape-Numeric Analyzers","authors":"B. E. Chang, Xavier Rival","doi":"10.4204/EPTCS.129.11","DOIUrl":"https://doi.org/10.4204/EPTCS.129.11","url":null,"abstract":"The aim of static analysis is to infer invariants about programs that are precise enough to establish semantic properties, such as the absence of run-time errors. Broadly speaking, there are two major branches of static analysis for imperative programs. Pointer and shape analyses focus on inferring properties of pointers, dynamically-allocated memory, and recursive data structures, while numeric analyses seek to derive invariants on numeric values. Although simultaneous inference of shape-numeric invariants is often needed, this case is especially challenging and is not particularly well explored. Notably, simultaneous shape-numeric inference raises complex issues in the design of the static analyzer itself. In this paper, we study the construction of such shape-numeric, static analyzers. We set up an abstract interpretation framework that allows us to reason about simultaneous shape-numeric properties by combining shape and numeric abstractions into a modular, expressive abstract domain. Such a modular structure is highly desirable to make its formalization and implementation easier to do and get correct. To achieve this, we choose a concrete semantics that can be abstracted step-by-step, while preserving a high level of expressiveness. The structure of abstract operations (i.e., transfer, join, and comparison) follows the structure of this semantics. The advantage of this construction is to divide the analyzer in modules and functors that implement abstractions of distinct features.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124254092","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Higher-Order Process Modeling: Product-Lining, Variability Modeling and Beyond","authors":"Johannes Neubauer, B. Steffen, T. Margaria","doi":"10.4204/EPTCS.129.16","DOIUrl":"https://doi.org/10.4204/EPTCS.129.16","url":null,"abstract":"We present a graphical and dynamic framework for binding and execution of business) process models. It is tailored to integrate 1) ad hoc processes modeled graphically, 2) third party services discovered in the (Inter)net, and 3) (dynamically) synthesized process chains that solve situation-specific tasks, with the synthesis taking place not only at design time, but also at runtime. Key to our approach is the introduction of type-safe stacked second-order execution contexts that allow for higher-order process modeling. Tamed by our underlying strict service-oriented notion of abstraction, this approach is tailored also to be used by application experts with little technical knowledge: users can select, modify, construct and then pass (component) processes during process execution as if they were data. We illustrate the impact and essence of our framework along a concrete, realistic (business) process modeling scenario: the development of Springer's browser-based Online Conference Service (OCS). The most advanced feature of our new framework allows one to combine online synthesis with the integration of the synthesized process into the running application. This ability leads to a particularly flexible way of implementing self-adaption, and to a particularly concise and powerful way of achieving variability not only at design time, but also at runtime.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"07 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116495175","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Online partial evaluation of sheet-defined functions","authors":"P. Sestoft","doi":"10.4204/EPTCS.129.10","DOIUrl":"https://doi.org/10.4204/EPTCS.129.10","url":null,"abstract":"We present a spreadsheet implementation, extended with sheet-defined functions, that allows users to define functions using only standard spreadsheet concepts such as cells, formulas and references, requiring no new syntax. This implements an idea proposed by Peyton-Jones and others [13]. As the main contribution of this paper, we then show how to add an online partial evaluator for such sheet-defined functions. The result is a higher-order functional language that is dynamically typed, in keeping with spreadsheet traditions, and an interactive platform for function definition and function specialization. We describe an implementation of these ideas, present some performance data from microbenchmarks, and outline desirable improvements and extensions.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128054577","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Proof System with Names for Modal Mu-calculus","authors":"C. Stirling","doi":"10.4204/EPTCS.129.2","DOIUrl":"https://doi.org/10.4204/EPTCS.129.2","url":null,"abstract":"Fixpoints are an important ingredient in semantics, abstract interpretation and program logics. Their addition to a logic can add considerable expressive power. One general issue is how to define proof systems for such logics. Here we examine proof systems for modal logic with fixpoints. We present a tableau proof system for checking validity of formulas which uses names to keep track of unfoldings of fixpoint variables as devised by Jungteerapanich.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123146631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Verification of Imperative Programs by Constraint Logic Program Transformation","authors":"E. D. Angelis, F. Fioravanti, A. Pettorossi, M. Proietti","doi":"10.4204/EPTCS.129.12","DOIUrl":"https://doi.org/10.4204/EPTCS.129.12","url":null,"abstract":"We present a method for verifying partial correctness properties of imperative programs that manipulate integers and arrays by using techniques based on the transformation of constraint logic programs (CLP). We use CLP as a metalanguage for representing imperative programs, their executions, and their properties. First, we encode the correctne ss of an imperative program, say prog, as the negation of a predicate incorrect defined by a CLP program T . By construction, incorrect holds in the least model of T if and only if the execution of prog from an initial configuration eventually halts in an error configuration. Then, we apply to progra m T a sequence of transformations that preserve its least model semantics. These transformations are based on well-known transformation rules, such as unfolding and folding, guided by suitable transformation strategies, such as specialization and generalization. The objective of the transformations is to derive a new CLP program TransfT where the predicate incorrect is defined either by (i) the fact ‘ incorrect.’ (and in this case prog is not correct), or by (ii) the empty set of clauses (and in this case prog is correct). In the case where we derive a CLP program such that neither (i) nor (ii) holds, we iterate the transformation. Since the problem is undecidable, this process may not terminate. We show through examples that our method can be applied in a rather systematic way, and is amenable to automation by transferring to the field of program verification many techniques developed in the fiel d of program transformation.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121348981","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Swiss Pocket Knife for Computability","authors":"N. Jones","doi":"10.4204/EPTCS.129.1","DOIUrl":"https://doi.org/10.4204/EPTCS.129.1","url":null,"abstract":"This research is about operational- and complexity-oriented aspects of classical foundations of computability theory. The approach is to re-examine some classical theorems and constructions, but with new criteria for success that are natural from a programming language perspective. \u0000Three cornerstones of computability theory are the S-m-ntheorem; Turing's \"universal machine\"; and Kleene's second recursion theorem. In today's programming language parlance these are respectively partial evaluation, self-interpretation, and reflection. In retrospect it is fascinating that Kleene's 1938 proof is constructive; and in essence builds a self-reproducing program. \u0000Computability theory originated in the 1930s, long before the invention of computers and programs. Its emphasis was on delimiting the boundaries of computability. Some milestones include 1936 (Turing), 1938 (Kleene), 1967 (isomorphism of programming languages), 1985 (partial evaluation), 1989 (theory implementation), 1993 (efficient self-interpretation) and 2006 (term register machines). \u0000The \"Swiss pocket knife\" of the title is a programming language that allows efficient computer implementation of all three computability cornerstones, emphasising the third: Kleene's second recursion theorem. We describe experiments with a tree-based computational model aiming for both fast program generation and fast execution of the generated programs.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"77 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113962449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Simple Semantics and Static Analysis for Stack Inspection","authors":"A. Banerjee, D. Naumann","doi":"10.4204/EPTCS.129.17","DOIUrl":"https://doi.org/10.4204/EPTCS.129.17","url":null,"abstract":"The Java virtual machine and the .NET common language runtime feature an access control mechanism specified operationally in terms of run-time stack inspection. We give a denotational semantics in \"eager\" form, and show that it is equivalent to the \"lazy\" semantics using stack inspection. We give a static analysis of safety, i.e., the absence of security errors, that is simpler than previous proposals. We identify several program transformations that can be used to remove run-time checks. We give complete, detailed proofs for safety of the analysis and for the transformations, exploiting compositionality of the eager semantics.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114121991","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Immune System: the ultimate fractionated cyber-physical system","authors":"C. Talcott","doi":"10.4204/EPTCS.129.18","DOIUrl":"https://doi.org/10.4204/EPTCS.129.18","url":null,"abstract":"In this little vision paper we analyze the human immune system from a computer science point of view with the aim of understanding the architecture and features that allow robust, effective behavior to emerge from local sensing and actions. We then recall the notion of fractionated cyber-physical systems, and compare and contrast this to the immune system. We conclude with some challenges.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126732362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A short note on Simulation and Abstraction","authors":"C. Hankin","doi":"10.4204/EPTCS.129.20","DOIUrl":"https://doi.org/10.4204/EPTCS.129.20","url":null,"abstract":"This short note is written in celebration of David Schmidt's sixtieth birthday. He has now been active in the program analysis research community for over thirty years and we have enjoyed many interactions with him. His work on characterising simulations between Kripke structures using Galois connections was particularly influential in our own work on using probabilistic abstract interpretation to study Larsen and Skou's notion of probabilistic bisimulation. We briefly review this work and discuss some recent applications of these ideas in a variety of different application areas.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130046288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Model Checking in Bits and Pieces","authors":"Kedar S. Namjoshi","doi":"10.4204/EPTCS.129.25","DOIUrl":"https://doi.org/10.4204/EPTCS.129.25","url":null,"abstract":"Fully automated verification of concurrent programs is a difficult problem, primarily because of state explosion: the exponential growth of a program state space with the number of its concurrently active components. It is natural to apply a divide and conquer strategy to ameliorate state explosion, by analyzing only a single component at a time. We show that this strategy leads to the notion of a \"split\" invariant, an assertion which is globally inductive, while being structured as the conjunction of a number of local, per-component invariants. This formulation is closely connected to the classical Owicki-Gries method and to Rely-Guarantee reasoning. We show how the division of an invariant into a number of pieces with limited scope makes it possible to apply new, localized forms of symmetry and abstraction to drastically simplify its computation. Split invariance also has interesting connections to parametric verification. A quantified invariant for a parametric system is a split invariant for every instance. We show how it is possible, in some cases, to invert this connection, and to automatically generalize from a split invariant for a small instance of a system to a quantified invariant which holds for the entire family of instances.","PeriodicalId":411813,"journal":{"name":"Festschrift for Dave Schmidt","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130592935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}