Selected Papers of the Conference "Typed Lambda Calculi and Applications 2009"

2009

Editor: Pierre-Louis Curien

This special issue of Logical Methods in Computer Science contains extended and revised versions of selected papers from the Conference on Typed Lambda Calculi and Applications (TLCA '09), held in Brasília, Brazil, on July 1-3, 2009. All submissions for this special issue underwent a new reviewing process, in accordance with the usual high standards of LMCS.

We believe that, taken together, the papers in this volume cover a broad range of active research directions within the field of lambda calculi and their applications, and illustrate the vitality of this area.

• The papers "Semantics of Typed Lambda-Calculus with Constructors" (by B. Petit) and "Derivation Lengths Classification of Gödel's T Extending Howard's Assignment" (by G. Wilken, A. Weiermann) study extensions of lambda calculus and their dynamic properties.
• The article "A Logical Foundation for Environment Classifiers" (by T. Tsukada and A. Igarashi) deals with the logical approach to the theory of multistage programming.
• Three papers then explore the area of type theory and logical frameworks, either from a syntactic or from a semantic point of view: "Refinement Types for Logical Frameworks and Their Interpretation as Proof Irrelevance" (by W. Lovas and F. Pfenning), "A Modular Type-checking algorithm for Type Theory with Singleton Types and Proof Irrelevance" (by A. Abel, T. Coquand and M. Pagano), "Weak omega-categories from intensional type theory" (by P. Lumsdaine).
• The topic of categorical approaches to universal algebra is also addressed, with the papers "On the mathematical synthesis of equational logics" (by M. Fiore and C.-K. Hur) and "Initial Algebra Semantics for Cyclic Sharing Tree Structures" (by M. Hamana).
• Two papers deal with linear logic or game-interpretation of proofs: "Bounded Linear Logic, Revisited" (by U. Dal Lago and M. Hofmann), "On the meaning of logical completeness" (by M. Basaldella and K. Terui).
• Finally, the topic of realizability is considered by four papers, with applications to the study of classical logic and to relative consistency results in set theory: "Existential witness extraction in classical realizability and via a negative translation" (by A. Miquel), "Interactive Learning-Based Realizability for Heyting Arithmetic with EM1" (by F. Aschieri and S. Berardi), and "Realizability algebras: a program to well order R" and "Realizability algebras II: new models of ZF + DC" (both by J.-L. Krivine).

We thank the authors for their contributions, the reviewers for their hard work, and the managing editor for special issues Benjamin Pierce for encouraging us to edit this volume.

Pierre-Louis Curien
PC chair of TLCA'09 and guest editor
Patrick Baillot, Luca Paolini
PC members and guest editors

1. Initial Algebra Semantics for Cyclic Sharing Tree Structures

Terms are a concise representation of tree structures. Since they can be naturally defined by an inductive type, they serve as data structures in functional programming and mechanised reasoning, equipped with useful principles such as structural induction and structural recursion. However, for graphs or "tree-like" structures - trees involving cycles and sharing - it remains unclear what kind of inductive structure exists and how we can faithfully assign a term representation to them. In this paper we propose a simple term syntax for cyclic sharing structures that admits structural induction and recursion principles. We show that the obtained syntax is directly usable in the functional language Haskell and the proof assistant Agda, just like ordinary data structures such as lists and trees. To achieve this goal, we use a categorical approach to initial algebra semantics in a presheaf category, following the line of Fiore, Plotkin and Turi's models of abstract syntax with variable binding.
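The inductive-type picture the abstract starts from can be made concrete in a few lines of Haskell. This is a generic illustration of structural recursion over an ordinary tree type, not Hamana's syntax for cyclic sharing structures:

```haskell
-- A plain inductive tree type: the kind of structure that terms
-- represent directly, in contrast to graphs with cycles and sharing.
data Tree = Leaf Int | Node Tree Tree
  deriving (Eq, Show)

-- Structural recursion: one clause per constructor, with termination
-- guaranteed by the inductive definition of the type.
size :: Tree -> Int
size (Leaf _)   = 1
size (Node l r) = 1 + size l + size r

main :: IO ()
main = print (size (Node (Leaf 1) (Node (Leaf 2) (Leaf 3))))
-- prints: 5
```

The paper's contribution is a term syntax that retains exactly this style of induction and recursion while also representing cycles and sharing.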

2. Interactive Learning-Based Realizability for Heyting Arithmetic with EM1

We apply to the semantics of Arithmetic the idea of "finite approximation" used to provide computational interpretations of Herbrand's Theorem, and we interpret classical proofs as constructive proofs (with constructive rules for ∨, ∃) over a suitable structure N for the language of natural numbers and maps of Gödel's system T. We introduce a new realizability semantics, which we call "interactive learning-based realizability", for Heyting Arithmetic plus EM1 (the excluded middle axiom restricted to Sigma^0_1 formulas). Individuals of N evolve with time, and realizers may "interact" with them, by influencing their evolution. We build our semantics over Avigad's fixed point result, but the same semantics may be defined over different constructive interpretations of classical arithmetic (Berardi and de' Liguoro use continuations). Our notion of realizability extends intuitionistic realizability and differs from it only in the atomic case: we interpret atomic realizers as "learning agents".

3. Weak omega-categories from intensional type theory

We show that for any type in Martin-Löf Intensional Type Theory, the terms of that type and its higher identity types form a weak omega-category in the sense of Leinster. Precisely, we construct a contractible globular operad of definable composition laws, and give an action of this operad on the terms of any type and its identity types.

4. Refinement Types for Logical Frameworks and Their Interpretation as Proof Irrelevance

Refinement types sharpen systems of simple and dependent types by offering expressive means to more precisely classify well-typed terms. We present a system of refinement types for LF in the style of recent formulations where only canonical forms are well-typed. Both the usual LF rules and the rules for type refinements are bidirectional, leading to a straightforward proof of decidability of typechecking even in the presence of intersection types. Because we insist on canonical forms, structural rules for subtyping can now be derived rather than being assumed as primitive. We illustrate the expressive power of our system with examples and validate its design by demonstrating a precise correspondence with traditional presentations of subtyping. Proof irrelevance provides a mechanism for selectively hiding the identities of terms in type theories. We show that LF refinement types can be interpreted as predicates using proof irrelevance, establishing a uniform relationship between two previously studied concepts in type theory. The interpretation and its correctness proof are surprisingly complex, lending support to the claim that refinement types are a fundamental construct rather than just a convenient surface syntax for certain uses of proof irrelevance.

5. Bounded Linear Logic, Revisited

We present QBAL, an extension of Girard, Scedrov and Scott's bounded linear logic. The main novelty of the system is the possibility of quantifying over resource variables. This generalization makes bounded linear logic considerably more flexible, while preserving soundness and completeness for polynomial time. In particular, we provide compositional embeddings of Leivant's RRW and Hofmann's LFPL into QBAL.

6. A Logical Foundation for Environment Classifiers

Taha and Nielsen have developed a multi-stage calculus λα with a sound type system using the notion of environment classifiers. These are special identifiers with which code fragments and variable declarations are annotated, and their scoping mechanism is used to ensure statically that certain code fragments are closed and safely runnable. In this paper, we investigate the Curry-Howard isomorphism for environment classifiers by developing a typed λ-calculus λ▷. It corresponds to a multi-modal logic that allows quantification over transition variables (a counterpart of classifiers), which range over (possibly empty) sequences of labeled transitions between possible worlds. This interpretation reduces the "run" construct, which has a special typing rule in λα, and the embedding of closed code into code fragments of different stages, which can only be realized by the cross-stage persistence operator in λα, to merely special cases of classifier application. λ▷ enjoys not only basic properties including subject reduction, confluence, and strong normalization but also an important property as a multi-stage calculus: time-ordered normalization of full reduction. We then develop a big-step evaluation semantics for an ML-like language based on λ▷ with its type system and prove that the evaluation of a well-typed λ▷ program is properly staged. We also identify a […]

7. On the meaning of logical completeness

Gödel's completeness theorem is concerned with provability, while Girard's theorem in ludics (as well as full completeness theorems in game semantics) is concerned with proofs. Our purpose is to look for a connection between these two disciplines. Following a previous work [3], we consider an extension of the original ludics with contraction and universal nondeterminism, which play dual roles, in order to capture a polarized fragment of linear logic and thus a constructive variant of classical propositional logic. We then prove a completeness theorem for proofs in this extended setting: for any behaviour (formula) A and any design (proof attempt) P, either P is a proof of A or there is a model M of the orthogonal of A which defeats P. Compared with proofs of full completeness in game semantics, ours exhibits a striking similarity with proofs of Gödel's completeness, in that it explicitly constructs a countermodel essentially using König's lemma, proceeds by induction on formulas, and implies an analogue of the Löwenheim-Skolem theorem.

8. Semantics of Typed Lambda-Calculus with Constructors

We present a Curry-style second-order type system with union and intersection types for the lambda-calculus with constructors of Arbiser, Miquel and Rios, an extension of lambda-calculus with a pattern matching mechanism for variadic constructors. We then prove the strong normalisation and the absence of match failure for a restriction of this system, by adapting the standard reducibility method.

9. Existential witness extraction in classical realizability and via a negative translation

We show how to extract existential witnesses from classical proofs using Krivine's classical realizability, where classical proofs are interpreted as lambda-terms with the call/cc control operator. We first recall the basic framework of classical realizability (in classical second-order arithmetic) and show how to extend it with primitive numerals for faster computations. Then we show how to perform witness extraction in this framework, by discussing several techniques depending on the shape of the existential formula. In particular, we show that in the Sigma^0_1 case, Krivine's witness extraction method reduces to Friedman's through a well-suited negative translation to intuitionistic second-order arithmetic. Finally we discuss the advantages of using call/cc rather than a negative translation, especially from the point of view of an implementation.

10. A Modular Type-checking algorithm for Type Theory with Singleton Types and Proof Irrelevance

We define a logical framework with singleton types and one universe of small types. We give the semantics using a PER model, which is then used to construct a normalisation-by-evaluation algorithm. We prove soundness and completeness of the algorithm, and obtain as a corollary the injectivity of type constructors. We then define a correct and complete type-checking algorithm for terms in normal form. Finally, we extend the results to proof-irrelevant propositions.
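The evaluate-then-quote shape of normalisation by evaluation that this abstract refers to can be sketched for the untyped lambda calculus in Haskell. This is a minimal generic illustration only; the paper's algorithm handles a full type theory with singleton types and proof irrelevance:

```haskell
-- Normalisation by evaluation, minimal sketch: evaluate terms into a
-- semantic domain, then read ("quote") values back as normal forms.
data Term = Var Int | Lam Term | App Term Term   -- de Bruijn indices
  deriving (Eq, Show)

data Val = VLam (Val -> Val) | VNe Ne            -- semantic values
data Ne  = NVar Int | NApp Ne Val                -- neutral values

-- Evaluation into the semantic domain (env maps indices to values).
eval :: [Val] -> Term -> Val
eval env (Var i)   = env !! i
eval env (Lam t)   = VLam (\v -> eval (v : env) t)
eval env (App t u) = case eval env t of
                       VLam f -> f (eval env u)
                       VNe n  -> VNe (NApp n (eval env u))

-- Quote a semantic value back into a normal-form term.
quote :: Int -> Val -> Term
quote k (VLam f) = Lam (quote (k + 1) (f (VNe (NVar k))))
quote k (VNe n)  = quoteNe k n

quoteNe :: Int -> Ne -> Term
quoteNe k (NVar i)   = Var (k - i - 1)           -- level to index
quoteNe k (NApp n v) = App (quoteNe k n) (quote k v)

-- Normalisation = evaluation followed by quotation.
nf :: Term -> Term
nf t = quote 0 (eval [] t)

main :: IO ()
main = print (nf (App (Lam (Var 0)) (Lam (Var 0))))
-- prints: Lam (Var 0)
```

The paper's contribution lies in making this scheme modular and in proving it sound and complete against a PER model, properties this untyped sketch does not address.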

11. Realizability algebras: a program to well order R

The theory of classical realizability is a framework in which we can develop the proof-program correspondence. Using this framework, we show how to transform into programs the proofs in classical analysis with dependent choice and the existence of a well ordering of the real line. The principal tools are: the notion of realizability algebra, a three-sorted variant of Curry's well-known combinatory algebra; and an adaptation of the method of forcing used in set theory to prove consistency results, here used in another way, to obtain programs associated with a well ordering of R and the existence of a non-trivial ultrafilter on N.

12. On the mathematical synthesis of equational logics

We provide a mathematical theory and methodology for synthesising equational logics from algebraic metatheories. We illustrate our methodology by means of two applications: a rational reconstruction of Birkhoff's Equational Logic and a new equational logic for reasoning about algebraic structure with name-binding operators.

13. Realizability algebras II: new models of ZF + DC

Using the proof-program (Curry-Howard) correspondence, we give a new method to obtain models of ZF and relative consistency results in set theory. We show the relative consistency of ZF + DC + "there exists a sequence of subsets of R whose cardinals are strictly decreasing" + other similar properties of R. These results seem not to have been previously obtained by forcing.

14. Derivation Lengths Classification of Gödel's T Extending Howard's Assignment

Let T be Gödel's system of primitive recursive functionals of finite type in the lambda formulation. By constructive means, using recursion on nested multisets, we define a multivalued function I from the set of terms of T into the set of natural numbers such that if a term a reduces to a term b, and a natural number I(a) is assigned to a, then a natural number I(b) can be assigned to b with I(a) greater than I(b). The construction of I is based on Howard's 1970 ordinal assignment for T and Weiermann's 1996 treatment of T in the combinatory logic version. As a corollary we obtain an optimal derivation length classification for the lambda formulation of T and its fragments. Compared with Weiermann's 1996 exposition, this article yields solutions to several non-trivial problems that arise from dealing with lambda terms instead of combinatory logic terms. It is expected that the methods developed here can be applied to other higher-order rewrite systems, resulting in new powerful termination orderings, since T is a paradigm for such systems.