Volume 18, Issue 2

2022


1. Beyond Notations: Hygienic Macro Expansion for Theorem Proving Languages

Sebastian Ullrich ; Leonardo de Moura.
In interactive theorem provers (ITPs), extensible syntax is not only crucial to lower the cognitive burden of manipulating complex mathematical objects, but plays a critical role in developing reusable abstractions in libraries. Most ITPs support such extensions in the form of restrictive "syntax sugar" substitutions and other ad hoc mechanisms, which are too rudimentary to support many desirable abstractions. As a result, libraries are littered with unnecessary redundancy. Tactic languages in these systems are plagued by a seemingly unrelated issue: accidental name capture, which often produces unexpected and counterintuitive behavior. We take ideas from the Scheme family of programming languages and solve these two problems simultaneously by proposing a novel hygienic macro system custom-built for ITPs. We further describe how our approach can be extended to cover type-directed macro expansion, resulting in a single, uniform system that offers multiple abstraction levels, ranging from the simplest syntax sugars to the elaboration of formerly baked-in syntax. We have implemented our new macro system and integrated it into the new version of the Lean theorem prover, Lean 4. Despite its expressivity, the macro system is simple enough that it can easily be integrated into other systems.

2. Uniform Interpolants in EUF: Algorithms using DAG-representations

Silvio Ghilardi ; Alessandro Gianola ; Deepak Kapur.
The concept of a quantifier-free uniform interpolant, computed from a given formula and a list of symbols to be eliminated, while well-known in the logic literature, long remained unknown to the formal methods and automated reasoning community. This concept is precisely defined. Two algorithms for computing quantifier-free uniform interpolants in the theory of equality over uninterpreted symbols (EUF), endowed with a list of symbols to be eliminated, are proposed. The first algorithm is non-deterministic and generates a uniform interpolant expressed as a disjunction of conjunctions of literals, whereas the second algorithm gives a compact representation of a uniform interpolant as a conjunction of Horn clauses. Both algorithms exploit efficient dedicated DAG representations of terms. Correctness and completeness proofs are supplied, using arguments combining rewrite techniques with model theory.
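
To illustrate the notion, here are two small EUF examples of the kind standard in the uniform-interpolation (cover) literature; they are illustrative and not necessarily the ones treated in the paper. Writing $\mathrm{UI}_e(\varphi)$ for a uniform interpolant of $\varphi$ with respect to the eliminated symbol $e$ (notation introduced here only for this illustration):

$$\mathrm{UI}_e\bigl(e = f(x)\,\wedge\, z = g(e)\bigr) \;=\; \bigl(z = g(f(x))\bigr),$$

$$\mathrm{UI}_e\bigl(f(e, x_1) = x_2 \,\wedge\, f(e, y_1) = y_2\bigr) \;=\; \bigl(x_1 = y_1 \rightarrow x_2 = y_2\bigr).$$

In the first case the eliminated symbol has an explicit definition, so substitution suffices and the result is a conjunction of literals; in the second, only a conditional consequence of congruence survives, i.e., a Horn clause.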

3. A Flexible Proof Format for SAT Solver-Elaborator Communication

Seulkee Baek ; Mario Carneiro ; Marijn J. H. Heule.
We introduce FRAT, a new proof format for unsatisfiable SAT problems, and its associated toolchain. Compared to DRAT, the FRAT format allows solvers to include more information in proofs to reduce the computational cost of subsequent elaboration to LRAT. The format is easy to parse forward and backward, and it is extensible to future proof methods. The provision of optional proof steps allows SAT solver developers to balance implementation effort against elaboration time, with little to no overhead on solver time. We benchmark our FRAT toolchain against a comparable DRAT toolchain and confirm a >84% median reduction in elaboration time and a >94% median decrease in peak memory usage.

4. Gluing resource proof-structures: inhabitation and inverting the Taylor expansion

Giulio Guerrieri ; Luc Pellissier ; Lorenzo Tortora de Falco.
A Multiplicative-Exponential Linear Logic (MELL) proof-structure can be expanded into a set of resource proof-structures: its Taylor expansion. We introduce a new criterion characterizing (and deciding in the finite case) those sets of resource proof-structures that are part of the Taylor expansion of some MELL proof-structure, through a rewriting system acting on both resource and MELL proof-structures. We also prove semi-decidability of the type inhabitation problem for cut-free MELL proof-structures.

5. Probabilistic Rewriting and Asymptotic Behaviour: on Termination and Unique Normal Forms

Claudia Faggian.
While a mature body of work supports the study of rewriting systems, abstract tools for Probabilistic Rewriting are still limited. In this paper we study the question of uniqueness of the result (unique limit distribution), and develop a set of proof techniques to analyze and compare reduction strategies. The goal is to have tools to support the operational analysis of probabilistic calculi (such as probabilistic lambda-calculi) where evaluation allows for different reduction choices (hence different reduction paths).

6. A Near-Optimal Parallel Algorithm for Joining Binary Relations

Bas Ketsman ; Dan Suciu ; Yufei Tao.
We present a constant-round algorithm in the massively parallel computation (MPC) model for evaluating a natural join where every input relation has two attributes. Our algorithm achieves a load of $\tilde{O}(m/p^{1/\rho})$ where $m$ is the total size of the input relations, $p$ is the number of machines, $\rho$ is the join's fractional edge covering number, and $\tilde{O}(\cdot)$ hides a polylogarithmic factor. The load matches a known lower bound up to a polylogarithmic factor. At the core of the proposed algorithm is a new theorem (which we name the "isolated cartesian product theorem") that provides fresh insight into the problem's mathematical structure. Our result implies that the subgraph enumeration problem, where the goal is to report all the occurrences of a constant-sized subgraph pattern, can be settled optimally (up to a polylogarithmic factor) in the MPC model.
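
As a concrete instance of the load bound, consider the triangle join $Q = R(A,B) \bowtie S(B,C) \bowtie T(C,A)$, a standard example that is not necessarily the one emphasised in the paper. Assigning weight $1/2$ to each of the three relations covers every attribute, so the fractional edge covering number is $\rho = 3/2$ and the bound specialises to

$$\text{load} \;=\; \tilde{O}\!\left(\frac{m}{p^{1/\rho}}\right) \;=\; \tilde{O}\!\left(\frac{m}{p^{2/3}}\right).$$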

7. Enumerating Answers to First-Order Queries over Databases of Low Degree

Arnaud Durand ; Nicole Schweikardt ; Luc Segoufin.
A class of relational databases has low degree if for all $\delta>0$, all but finitely many databases in the class have degree at most $n^{\delta}$, where $n$ is the size of the database. Typical examples are databases of bounded degree or of degree bounded by $\log n$. It is known that over a class of databases having low degree, first-order Boolean queries can be checked in pseudo-linear time, i.e., for all $\epsilon>0$, in time bounded by $n^{1+\epsilon}$. We generalize this result by considering query evaluation. We show that counting the number of answers to a query can be done in pseudo-linear time and that, after a pseudo-linear time preprocessing, we can test in constant time whether a given tuple is a solution to a query or enumerate the answers to a query with constant delay.
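
A quick check of the second typical example (added here for illustration): degree bounded by $\log n$ indeed qualifies as low degree, since for every fixed $\delta > 0$,

$$\log n \;\le\; n^{\delta} \quad \text{for all sufficiently large } n,$$

so all but finitely many databases in such a class have degree at most $n^{\delta}$.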

8. Circular Proofs as Session-Typed Processes: A Local Validity Condition

Farzaneh Derakhshan ; Frank Pfenning.
Proof theory provides a foundation for studying and reasoning about programming languages, most directly based on the well-known Curry-Howard isomorphism between intuitionistic logic and the typed lambda-calculus. More recently, a correspondence between intuitionistic linear logic and the session-typed pi-calculus has been discovered. In this paper, we establish an extension of the latter correspondence for a fragment of substructural logic with least and greatest fixed points. We describe the computational interpretation of the resulting infinitary proof system as session-typed processes, and provide an effectively decidable local criterion to recognize mutually recursive processes corresponding to valid circular proofs as introduced by Fortier and Santocanale. We show that our algorithm imposes a stricter requirement than Fortier and Santocanale's guard condition, but is local and compositional and therefore more suitable as the basis for a programming language.

9. Determinisability of register and timed automata

Lorenzo Clemente ; Sławomir Lasota ; Radosław Piórkowski.
The deterministic membership problem for timed automata asks whether the timed language given by a nondeterministic timed automaton can be recognised by a deterministic timed automaton. An analogous problem can be stated in the setting of register automata. We draw the complete decidability/complexity landscape of the deterministic membership problem, in the setting of both register and timed automata. For register automata, we prove that the deterministic membership problem is decidable when the input automaton is a nondeterministic one-register automaton (possibly with epsilon transitions) and the number of registers of the output deterministic register automaton is fixed. This is optimal: We show that in all the other cases the problem is undecidable, i.e., when either (1) the input nondeterministic automaton has two registers or more (even without epsilon transitions), or (2) it uses guessing, or (3) the number of registers of the output deterministic automaton is not fixed. The landscape for timed automata follows a similar pattern. We show that the problem is decidable when the input automaton is a one-clock nondeterministic timed automaton without epsilon transitions and the number of clocks of the output deterministic timed automaton is fixed. Again, this is optimal: We show that the problem in all the other cases is undecidable, i.e., when either (1) the input nondeterministic timed automaton has two clocks or more, or (2) it uses epsilon transitions, or (3) the […]
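
A classical example, going back to Alur and Dill and given here only for illustration (it is not taken from the paper), of a "no" instance of the deterministic membership problem is the timed language over the single letter $a$ consisting of the words in which some two $a$-events are exactly one time unit apart:

$$L \;=\; \bigl\{ (a,t_1)(a,t_2)\cdots \;\bigm|\; \exists\, i<j:\ t_j - t_i = 1 \bigr\}.$$

A one-clock nondeterministic timed automaton recognises $L$ by guessing the first event of the pair, resetting its clock, and accepting upon reading an $a$ at clock value exactly $1$; no deterministic timed automaton recognises $L$, since it would have to track unboundedly many pending events.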

10. Formalizing the Face Lattice of Polyhedra

Xavier Allamigeon ; Ricardo D. Katz ; Pierre-Yves Strub.
Faces play a central role in the combinatorial and computational aspects of polyhedra. In this paper, we present the first formalization of faces of polyhedra in the proof assistant Coq. This builds on the formalization of a library providing the basic constructions and operations over polyhedra, including projections, convex hulls and images under linear maps. Moreover, we design a special mechanism which automatically introduces an appropriate representation of a polyhedron or a face, depending on the context of the proof. We demonstrate the usability of this approach by establishing some of the most important combinatorial properties of faces, namely that they constitute a family of graded atomistic and coatomistic lattices closed under interval sublattices. We also prove a theorem due to Balinski on the $d$-connectedness of the adjacency graph of polytopes of dimension $d$.

11. Tractable Combinations of Temporal CSPs

Manuel Bodirsky ; Johannes Greiner ; Jakub Rydval.
The constraint satisfaction problem (CSP) of a first-order theory T is the computational problem of deciding whether a given conjunction of atomic formulas is satisfiable in some model of T. We study the computational complexity of CSP$(T_1 \cup T_2)$ where $T_1$ and $T_2$ are theories with disjoint finite relational signatures. We prove that if $T_1$ and $T_2$ are the theories of temporal structures, i.e., structures where all relations have a first-order definition in $(Q;<)$, then CSP$(T_1 \cup T_2)$ is in P or NP-complete. To this end we prove a purely algebraic statement about the structure of the lattice of locally closed clones over the domain $Q$ that contain Aut$(Q;<)$.
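
Two standard temporal relations, given here only to illustrate the setting and not necessarily used in the paper, already exhibit both sides of the dichotomy: CSP$(Q;<)$ asks whether a conjunction of constraints of the form $x < y$ is satisfiable over the rationals, which amounts to digraph acyclicity and is solvable in polynomial time, whereas the Betweenness problem CSP$(Q;\mathrm{Betw})$ is NP-complete (Opatrny, 1979), where

$$\mathrm{Betw} \;=\; \{ (x,y,z) \in Q^3 \;:\; x<y<z \ \vee\ z<y<x \}.$$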

12. CC-circuits and the expressive power of nilpotent algebras

Michael Kompatscher.
We show that CC-circuits of bounded depth have the same expressive power as circuits over finite nilpotent algebras from congruence modular varieties. We use this result to phrase and discuss a new algebraic version of Barrington, Straubing and Thérien's conjecture, which states that CC-circuits of bounded depth need exponential size to compute AND. Furthermore, we investigate the complexity of deciding identities and solving equations in a fixed nilpotent algebra. Under the assumption that the conjecture is true, we obtain quasipolynomial algorithms for both problems. On the other hand, if AND is computable by uniform CC-circuits of bounded depth and polynomial size, we can construct a nilpotent algebra in which checking identities is coNP-complete, and solving equations is NP-complete.

13. Specification and Verification of Timing Properties in Interoperable Medical Systems

Mahsa Zarneshan ; Fatemeh Ghassemi ; Ehsan Khamespanah ; Marjan Sirjani ; John Hatcliff.
To support the dynamic composition of various devices/apps into a medical system at point-of-care, a set of communication patterns to describe the communication needs of devices has been proposed. To address timing requirements, each pattern breaks common timing properties into finer ones that can be enforced locally by the components. Common timing requirements for the underlying communication substrate are derived from these local properties. The local properties of devices are assured by the vendors at development time. Although organizations procure devices that are compatible in terms of their local properties and middleware, the composed system may still not operate as desired: the latency of the organization's network interacts with the local properties of the devices. To validate the interaction among the timing properties of components and the network, we formally specify such systems in Timed Rebeca. We use model checking to verify the derived timing requirements of the communication substrate in terms of the network and device models. We provide a set of templates as a guideline to specify medical systems in terms of the formal model of patterns. A composite medical system using several devices is subject to state-space explosion. We extend the reduction technique of Timed Rebeca based on the static properties of patterns. We prove that our reduction is sound and show the applicability of our approach in reducing the state space by modeling two clinical scenarios made of several […]

14. When Can We Answer Queries Using Result-Bounded Data Interfaces?

Antoine Amarilli ; Michael Benedikt.
We consider answering queries on data available through access methods, which provide lookup access to the tuples matching a given binding. Such interfaces are common on the Web; further, they often have bounds on how many results they can return, e.g., because of pagination or rate limits. We thus study result-bounded methods, which may return only a limited number of tuples. We study how to decide if a query is answerable using result-bounded methods, i.e., how to compute a plan that returns all answers to the query using the methods, assuming that the underlying data satisfies some integrity constraints. We first show how to reduce answerability to a query containment problem with constraints. Second, we show "schema simplification" theorems describing when and how result-bounded services can be used. Finally, we use these theorems to give decidability and complexity results about answerability for common constraint classes.

15. Quotients, inductive types, and quotient inductive types

Marcelo P. Fiore ; Andrew M. Pitts ; S. C. Steenkamp.
This paper introduces an expressive class of indexed quotient-inductive types, called QWI types, within the framework of constructive type theory. They are initial algebras for indexed families of equational theories with possibly infinitary operators and equations. We prove that QWI types can be derived from quotient types and inductive types in the type theory of toposes with a natural number object and universes, provided those universes satisfy the Weakly Initial Set of Covers (WISC) axiom. We do so by constructing QWI types as colimits of a family of approximations to them defined by well-founded recursion over a suitable notion of size, whose definition involves the WISC axiom. We developed the proof and checked it using the Agda theorem prover.

16. Actris 2.0: Asynchronous Session-Type Based Reasoning in Separation Logic

Jonas Kastberg Hinrichsen ; Jesper Bengtson ; Robbert Krebbers.
Message passing is a useful abstraction for implementing concurrent programs. For real-world systems, however, it is often combined with other programming and concurrency paradigms, such as higher-order functions, mutable state, shared-memory concurrency, and locks. We present Actris: a logic for proving functional correctness of programs that use a combination of the aforementioned features. Actris combines the power of modern concurrent separation logics with a first-class protocol mechanism -- based on session types -- for reasoning about message passing in the presence of other concurrency paradigms. We show that Actris provides a suitable level of abstraction by proving functional correctness of a variety of examples, including a channel-based merge sort, a channel-based load-balancing mapper, and a variant of the map-reduce model, using concise specifications. While Actris was already presented in a conference paper (POPL'20), this paper expands the prior presentation significantly. Moreover, it extends Actris to Actris 2.0 with a notion of subprotocols -- based on session-type subtyping -- that permits additional flexibility when composing channel endpoints, and that takes full advantage of the asynchronous semantics of message passing in Actris. Soundness of Actris 2.0 is proven using a model of its protocol mechanism in the Iris framework. We have mechanised the theory of Actris, together with custom tactics, as well as all examples in the paper, in the Coq […]

17. Trakhtenbrot's Theorem in Coq: Finite Model Theory through the Constructive Lens

Dominik Kirst ; Dominique Larchey-Wendling.
We study finite first-order satisfiability (FSAT) in the constructive setting of dependent type theory. Employing synthetic accounts of enumerability and decidability, we give a full classification of FSAT depending on the first-order signature of non-logical symbols. On the one hand, our development focuses on Trakhtenbrot's theorem, stating that FSAT is undecidable as soon as the signature contains a relation symbol of arity at least two. Our proof proceeds by a many-one reduction chain starting from the Post correspondence problem. On the other hand, we establish the decidability of FSAT for monadic first-order logic, i.e., where the signature contains only function and relation symbols of arity at most one, as well as the enumerability of FSAT for arbitrary enumerable signatures. To showcase an application of Trakhtenbrot's theorem, we continue our reduction chain with a many-one reduction from FSAT to separation logic. All our results are mechanised in the framework of a growing Coq library of synthetic undecidability proofs.

18. Modular Termination for Second-Order Computation Rules and Application to Algebraic Effect Handlers

Makoto Hamana.
We present a new modular proof method of termination for second-order computation, and report on its implementation, SOL. The proof method is useful for proving termination of higher-order foundational calculi. To establish the method, we use a variation of the semantic labelling translation and Blanqui's General Schema: a syntactic criterion of strong normalisation. As an application, we apply this method to show termination of a variant of the call-by-push-value calculus with algebraic effects and effect handlers. We also show that our tool SOL is effective in solving higher-order termination problems.

19. Characteristic Logics for Behavioural Hemimetrics via Fuzzy Lax Extensions

Paul Wild ; Lutz Schröder.
In systems involving quantitative data, such as probabilistic, fuzzy, or metric systems, behavioural distances provide a more fine-grained comparison of states than two-valued notions of behavioural equivalence or behaviour inclusion. Like in the two-valued case, the wide variation found in system types creates a need for generic methods that apply to many system types at once. Approaches of this kind are emerging within the paradigm of universal coalgebra, based either on lifting pseudometrics along set functors or on lifting general real-valued (fuzzy) relations along functors by means of fuzzy lax extensions. An immediate benefit of the latter is that they allow bounding behavioural distance by means of fuzzy (bi-)simulations that need not themselves be hemi- or pseudometrics; this is analogous to classical simulations and bisimulations, which need not be preorders or equivalence relations, respectively. The known generic pseudometric liftings, specifically the generic Kantorovich and Wasserstein liftings, both can be extended to yield fuzzy lax extensions, using the fact that both are effectively given by a choice of quantitative modalities. Our central result then shows that in fact all fuzzy lax extensions are Kantorovich extensions for a suitable set of quantitative modalities, the so-called Moss modalities. For nonexpansive fuzzy lax extensions, this allows for the extraction of quantitative modal logics that characterize behavioural distance, i.e. satisfy a […]

20. The Shapley Value of Inconsistency Measures for Functional Dependencies

Ester Livshits ; Benny Kimelfeld.
Quantifying the inconsistency of a database is motivated by various goals including reliability estimation for new datasets and progress indication in data cleaning. Another goal is to attribute to individual tuples a level of responsibility to the overall inconsistency, and thereby prioritize tuples in the explanation or inspection of dirt. Therefore, inconsistency quantification and attribution have been a subject of much research in Knowledge Representation and, more recently, in Databases. As in many other fields, a conventional responsibility sharing mechanism is the Shapley value from cooperative game theory. In this paper, we carry out a systematic investigation of the complexity of the Shapley value in common inconsistency measures for functional-dependency (FD) violations. For several measures we establish a full classification of the FD sets into tractable and intractable classes with respect to Shapley-value computation. We also study the complexity of approximation in intractable cases.
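
The following brute-force sketch illustrates the object of study on a toy instance; the inconsistency measure used (the number of tuple pairs violating a functional dependency) and all names are chosen here for illustration and are not claimed to match the measures or algorithms classified in the paper. It computes exact Shapley values by enumerating all subsets, which is exponential in the number of tuples; the paper's question is precisely when this computation can be done efficiently.

from itertools import combinations
from math import factorial

# Inconsistency measure: number of tuple pairs violating the FD A -> B
# (chosen here for illustration only).
def violations(tuples, fd=("A", "B")):
    lhs, rhs = fd
    return sum(
        1
        for t1, t2 in combinations(tuples, 2)
        if t1[lhs] == t2[lhs] and t1[rhs] != t2[rhs]
    )

# Brute-force Shapley value of one tuple with respect to the measure:
# the weighted average of its marginal contributions over all subsets.
def shapley(db, tuple_index, measure=violations):
    n = len(db)
    others = [i for i in range(n) if i != tuple_index]
    total = 0.0
    for k in range(len(others) + 1):
        for subset in combinations(others, k):
            s = [db[i] for i in subset]
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            total += weight * (measure(s + [db[tuple_index]]) - measure(s))
    return total

if __name__ == "__main__":
    # Toy relation over attributes A, B; the FD A -> B is violated by two pairs.
    db = [
        {"A": 1, "B": "x"},   # t0: conflicts with t1 and t2
        {"A": 1, "B": "y"},   # t1
        {"A": 1, "B": "y"},   # t2
        {"A": 2, "B": "z"},   # t3: involved in no violation
    ]
    for i in range(len(db)):
        print(f"t{i}: Shapley value = {shapley(db, i):.4f}")

On this toy input the values are 1.0, 0.5, 0.5 and 0.0, summing to the overall measure 2, with the uninvolved tuple t3 receiving responsibility 0, as the Shapley axioms dictate.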

21. The Theory of Traces for Systems with Nondeterminism, Probability, and Termination

Filippo Bonchi ; Ana Sokolova ; Valeria Vignudelli.
This paper studies trace-based equivalences for systems combining nondeterministic and probabilistic choices. We show how trace semantics for such processes can be recovered by instantiating a coalgebraic construction known as the generalised powerset construction. We characterise and compare the resulting semantics to known definitions of trace equivalences appearing in the literature. Most of our results are based on the exciting interplay between monads and their presentations via algebraic theories.

22. Asynchronous wreath product and cascade decompositions for concurrent behaviours

Bharat Adsul ; Paul Gastin ; Saptarshi Sarkar ; Pascal Weil.
We develop new algebraic tools to reason about concurrent behaviours modelled as languages of Mazurkiewicz traces and asynchronous automata. These tools reflect the distributed nature of traces and the underlying causality and concurrency between events, and can be said to support true concurrency. They generalize the tools that have been so efficient in understanding, classifying and reasoning about word languages. In particular, we introduce an asynchronous version of the wreath product operation and we describe the trace languages recognized by such products (the so-called asynchronous wreath product principle). We then propose a decomposition result for recognizable trace languages, analogous to the Krohn-Rhodes theorem, and we prove this decomposition result in the special case of acyclic architectures. Finally, we introduce and analyze two distributed automata-theoretic operations. One, the local cascade product, is a direct implementation of the asynchronous wreath product operation. The other, global cascade sequences, although conceptually and operationally similar to the local cascade product, translates to a more complex asynchronous implementation which uses the gossip automaton of Mukund and Sohoni. This leads to interesting applications to the characterization of trace languages definable in first-order logic: they are accepted by a restricted local cascade product of the gossip automaton and 2-state asynchronous reset automata, and also by a global cascade […]

23. Synthesis of Computable Regular Functions of Infinite Words

V. Dave ; E. Filiot ; S. Krishna ; N. Lhote.
Regular functions from infinite words to infinite words can be equivalently specified by MSO-transducers, streaming $\omega$-string transducers as well as deterministic two-way transducers with look-ahead. In their one-way restriction, the latter transducers define the class of rational functions. Even though regular functions are robustly characterised by several finite-state devices, even the subclass of rational functions may contain functions which are not computable (by a Turing machine with infinite input). This paper proposes a decision procedure for the following synthesis problem: given a regular function $f$ (equivalently specified by one of the aforementioned transducer models), is $f$ computable and if it is, synthesize a Turing machine computing it. For regular functions, we show that computability is equivalent to continuity, and therefore the problem boils down to deciding continuity. We establish a generic characterisation of continuity for functions preserving regular languages under inverse image (such as regular functions). We exploit this characterisation to show the decidability of continuity (and hence computability) of rational and regular functions. For rational functions, we show that this can be done in $\mathsf{NLogSpace}$ (it was already known from Prieur's work to be in $\mathsf{PTime}$). In a similar fashion, we also effectively characterise uniform continuity of regular functions, and relate it to the notion of uniform computability, which offers […]
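
As an illustration of a rational function that is not computable (an example constructed here for illustration, not necessarily one appearing in the paper), consider $f : \{a,b\}^{\omega} \to \{a,b\}^{\omega}$ defined by

$$f(u) \;=\; \begin{cases} a^{\omega} & \text{if } u \text{ contains infinitely many } a\text{'s},\\ b^{\omega} & \text{otherwise.} \end{cases}$$

A functional nondeterministic one-way Büchi transducer realises $f$ by guessing which case holds, so $f$ is rational; yet $f$ is not continuous, hence not computable: the inputs $u_n = a^n b^{\omega}$ converge to $a^{\omega}$, but $f(u_n) = b^{\omega}$ for every $n$ while $f(a^{\omega}) = a^{\omega}$, so no machine reading only finite prefixes can ever commit to the first output letter.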