{"title":"A Superposition-Based Calculus for Diagrammatic Reasoning","authors":"R. Echahed, M. Echenim, M. Mhalla, N. Peltier","doi":"10.1145/3479394.3479405","DOIUrl":"https://doi.org/10.1145/3479394.3479405","url":null,"abstract":"We introduce a class of rooted graphs which are expressive enough to encode various kinds of classical or quantum circuits. We then follow a set-theoretic approach to define rewrite systems over the considered graphs. Afterwards, we tackle the problem of equational reasoning with the graphs under study and we propose a new Superposition calculus to check the unsatisfiability of formulas consisting of equations or disequations over these graphs. We establish the soundness and refutational completeness of the calculus.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121490255","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deep Semantic Versioning for Evolution and Variability","authors":"Luís Carvalho, João Costa Seco","doi":"10.1145/3479394.3479416","DOIUrl":"https://doi.org/10.1145/3479394.3479416","url":null,"abstract":"The development cycles in the software industry are shrinking due to the increasing demands for shorter time to market and the incremental development style of agile methodologies. Pragmatic software engineering approaches rely on careful product management, a strong versioning discipline, and a feature development strategy to avoid that newly merged code disrupts existing systems. Versioning is critical when managing software product lines and ensuring that all their variants are kept in operation amidst all the performed changes. Such methodologies enforce functional correctness through strong efforts in regression testing, with the associated exponential growth in complexity. In this paper, we propose a language-based approach to software versioning. Unlike the traditional approach of mainstream VCS, where each evolution step is represented by a textual diff, we treat versions as first-class citizens. Each evolution step, merge operation, and version relationship, is represented as code in the program. We extend prior work, Versioned Featherweight Java, to support a full-fledged version control system. First, we introduce multi-branching and merge operations, which allow for more advanced workflows. We also present a slicing procedure implemented in a compile-time tool that extracts well-typed Featherweight Java code for any single version out of a versioned codebase. We present formal soundness results that ensure that the sliced code for any version is well-behaved and has the same behaviour as the multi-version source code. We believe that our developments effectively model relevant domains of software evolution, such as feature-oriented programming, software product lines, and continuous delivery scenarios. By lifting the versioning aspect, usually represented by text diffs, to the language level, we pave the way for tools that interact with software repositories (e.g. CI/CD, GitHub Actions) to have more insight regarding the evolution of the software semantics.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133462786","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Confluence in Non-Left-Linear Untyped Higher-Order Rewrite Theories","authors":"Gaspard Férey, J. Jouannaud","doi":"10.1145/3479394.3479403","DOIUrl":"https://doi.org/10.1145/3479394.3479403","url":null,"abstract":"We develop techniques based on van Oostrom’s decreasing diagrams that reduce confluence proofs to the checking of critical pairs for higher-order rewrite rules extending beta-reduction on pure lambda-terms. We show that confluence is preserved for a large subset of terms that contains all pure lambda terms. Our results are applied to famous Klop’s examples of non-confluent behaviours in presence of convergent rewrite rules and to fragments of various encodings, in a dependent type theory with rewrite rules, of the Calculus of Constructions with polymorphic universes.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"66 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133633377","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Trimming Data Sets: a Verified Algorithm for Robust Mean Estimation","authors":"Ieva Daukantas, A. Bruni, C. Schürmann","doi":"10.1145/3479394.3479412","DOIUrl":"https://doi.org/10.1145/3479394.3479412","url":null,"abstract":"The operation of trimming data sets is heavily used in AI systems. Trimming is useful to make AI systems more robust against adversarial or common perturbations. At the core of robust AI systems lies the concept that outliers in a data set occur with low probability, and therefore can be discarded with little loss of precision in the result. The statistical argument that formalizes this concept of robustness is based on an extension of the Chebyshev’s inequality first proposed by Tukey in 1960. In this paper we present a mechanized proof of robustness of the trimmed mean algorithm, which is a statistical method underlying many complex applications of deep learning. For this purpose we use the Coq proof assistant to formalize Tukey’s extension to Chebyshev’s inequality, which allows us to verify the robustness of the trimmed mean algorithm. Our contribution shows the viability of mechanized robustness arguments for algorithms that are at the foundation of complex AI systems.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122809951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated Trust Analysis of Copland Specifications for Layered Attestations✱","authors":"Paul D. Rowe, John D. Ramsdell, Ian D. Kretz","doi":"10.1145/3479394.3479418","DOIUrl":"https://doi.org/10.1145/3479394.3479418","url":null,"abstract":"In distributed systems, trust decisions are often based on remote attestations in which evidence is gathered about the integrity of subcomponents. Layered attestations leverage hierarchical dependencies among the subcomponents to bolster the trustworthiness of evidence. Copland is a declarative, domain-specific language for specifying complex layered attestations. How phrases are composed bears directly on the trustworthiness of the evidence they produce, and complex phrases become quite difficult to analyze by hand. We introduce an automated method for analyzing executions of attestations specified by Copland phrases in an adversarial setting. We develop a general theory of executions with adversarial corruption and repair events. Our approach is to enrich the Copland semantics according to this theory. Using the model finder Chase, we characterize all executions consistent with a set of initial assumptions. From this set of models, an analyst can discover all ways an active adversary can corrupt subcomponents without being detected by the attestation. These efforts afford trust policymakers the ability to compare attestations expressed as Copland phrases against trust policy in a way that encompasses both static and runtime concerns.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"48 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133578240","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"From Non-determinism to Goroutines: A Fair Implementation of Curry in Go","authors":"Jonas Böhm, M. Hanus, Finn Teegen","doi":"10.1145/3479394.3479411","DOIUrl":"https://doi.org/10.1145/3479394.3479411","url":null,"abstract":"The declarative programming language Curry amalgamates demand-driven evaluation from functional programming with non-determinism from logic programming. In contrast to Prolog, the search strategy for non-deterministic computations is not fixed so that complete or parallel strategies are reasonable for Curry. In particular, a desirable option is a fair strategy which frees the programmer from considering the influence of the search strategy to the success of a computation. In this paper we describe an implementation with this property. Based on recent developments on operational models for functional logic programming, we present a new implementation which transforms Curry programs in several transformation steps into Go programs. By exploiting lightweight threads in the form of goroutines, we obtain a complete and fair implementation which automatically uses multi-processing to speed up non-deterministic computations. This has the effect that, in some cases, non-deterministic algorithms are more efficiently evaluated than deterministic ones.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"7 7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132354658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Derived Reasonable Abstract Machine for Strong Call by Value","authors":"Małgorzata Biernacka, Witold Charatonik, T. Dráb","doi":"10.1145/3479394.3479401","DOIUrl":"https://doi.org/10.1145/3479394.3479401","url":null,"abstract":"We present an efficient implementation of the full-reducing call-by-value strategy for the pure λ-calculus in the form of an abstract machine. The presented machine has been systematically derived using Danvy et al.’s functional correspondence that connects higher-order interpreters with abstract-machine models by a well-established transformation technique. It improves on a previously presented machine by Biernacka et al. in terms of efficiency: the new machine simulates β-reduction with the overhead polynomial in the number of β-steps and in the size of the initial term. Thus, the machine makes a “reasonable” (in the sense of Accattoli et al.) implementation of Strong CbV. We prove correctness and reasonability of the machine. The latter property is shown using a form of amortized cost analysis à la Okasaki.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"251 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121242904","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Intersection types for a λ-calculus with global store","authors":"Ugo de'Liguoro, R. Treglia","doi":"10.1145/3479394.3479400","DOIUrl":"https://doi.org/10.1145/3479394.3479400","url":null,"abstract":"We study the semantics of an untyped λ-calculus equipped with operators representing read and write operations from and to a global store. We adopt the monadic approach to model side effects and treat read and write as algebraic operations over a monad. We introduce an operational semantics and a type assignment system of intersection types, and prove that types are invariant under reduction and expansion of term and state configurations, and characterize convergent terms via their typings.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"131 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124145800","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reflecting Stacked Continuations in a Fine-Grained Direct-Style Reduction Theory","authors":"Dariusz Biernacki, Mateusz Pyzik, Filip Sieczkowski","doi":"10.1145/3479394.3479399","DOIUrl":"https://doi.org/10.1145/3479394.3479399","url":null,"abstract":"The delimited-control operator shift0 has been formally shown to capture the operational semantics of deep handlers for algebraic effects. Its CPS translation generates λ-terms in which continuation composition is not expressed in terms of nested function calls, as is typical of other delimited-control operators, e.g. shift, but with function applications consuming a sequence of continuations one at a time, as if they formed a stack. We present a novel reduction theory for Moggi’s computational λ-calculus extended with shift0 and a control delimiter dollar, which models the capture of evaluation contexts in a fine-grained manner as an interaction between the let-expressions and the delimiter. We establish a connection between our reduction theory and the existing theories of shif0 and dollar. Moreover, we develop a CPS translation for our calculus along with a direct-style translation that together form a reflection, i.e. the translations preserve reductions and the direct-style translation is a right inverse of the CPS translation. This construction relies on the invariant that CPS root terms are in η-head-normal form. The results of this work could potentially be used for compiler optimisations and lead to a similar development for algebraic effects.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129080605","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Elastic Incrementalization for Datalog","authors":"David Zhao, Pavle Subotic, Mukund Raghothaman, Bernhard Scholz","doi":"10.1145/3479394.3479415","DOIUrl":"https://doi.org/10.1145/3479394.3479415","url":null,"abstract":"Various incremental evaluation strategies for Datalog have been developed that reuse computations for small input changes. These methods assume that incrementalization is always a better strategy than recomputation. However, in real-world applications such as static program analysis, recomputation can be cheaper than incrementalization for large updates. This work introduces an elastic incremental approach with two strategies that can be selected based on the impact of the input change. The first strategy is a Bootstrap strategy that recomputes the entire result for high-impact changes. The second is an Update strategy that performs an incremental update for low-impact changes. Our approach allows for a lightweight Bootstrap strategy suitable for high-impact changes, with the trade-off that Update may require more work for small changes. We demonstrate our approach using real-world applications and compare our elastic incremental approach to existing methods.","PeriodicalId":242361,"journal":{"name":"23rd International Symposium on Principles and Practice of Declarative Programming","volume":"134 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115828205","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}