2010. Editors: Henk Barendregt, Pawel Urzyczyn
This paper describes a formalization of discrete real closed fields in the Coq proof assistant. This abstract structure captures, for instance, the theory of real algebraic numbers, a decidable subset of the real numbers with good algorithmic properties. The theory of real algebraic numbers, and more generally of semi-algebraic varieties, is at the core of a number of effective methods in real analysis, including decision procedures for non-linear arithmetic and optimization methods for real-valued functions. After defining an abstract structure of discrete real closed field and the elementary theory of real roots of polynomials, we describe the formalization of an algebraic proof of quantifier elimination based on pseudo-remainder sequences, following the standard computer algebra literature on the topic. This formalization covers a large part of the theory underlying the efficient algorithms implemented in practice in computer algebra. The success of this work paves the way for the formal certification of these efficient methods.
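As a loose illustration of the sign-change machinery that such remainder sequences support (a sketch of ours, not code from the formalization; over the field of rationals plain remainders suffice, whereas the pseudo-remainders mentioned above avoid divisions and control coefficient growth):

```haskell
-- Hypothetical illustration: counting real roots by sign changes along
-- a Sturm sequence. Polynomials are coefficient lists, lowest degree first.
type Poly = [Rational]

norm :: Poly -> Poly
norm = reverse . dropWhile (== 0) . reverse

eval :: Poly -> Rational -> Rational
eval p x = foldr (\c acc -> c + x * acc) 0 p          -- Horner evaluation

deriv :: Poly -> Poly
deriv p = norm (zipWith (*) (map fromInteger [1 ..]) (drop 1 p))

sub :: Poly -> Poly -> Poly
sub a b = norm (zipWith (-) (pad a) (pad b))
  where n = max (length a) (length b)
        pad p = p ++ replicate (n - length p) 0

-- Remainder of polynomial division (over the field Q; over a ring one
-- would use pseudo-remainders instead).
pmod :: Poly -> Poly -> Poly
pmod a b = go (norm a)
  where
    b' = norm b
    db = length b' - 1
    go r
      | null r || length r - 1 < db = r
      | otherwise =
          let c = last r / last b'
          in go (sub r (replicate (length r - 1 - db) 0 ++ map (c *) b'))

-- Sturm sequence: p, p', then negated remainders until the zero polynomial.
sturm :: Poly -> [Poly]
sturm p = takeWhile (not . null) (go (norm p) (deriv p))
  where go a b = a : go b (map negate (a `pmod` b))

signChanges :: [Rational] -> Int
signChanges xs = length (filter (< 0) (zipWith (*) ys (drop 1 ys)))
  where ys = filter (/= 0) xs

-- Number of distinct real roots of p in (a, b], assuming p(a), p(b) /= 0.
rootsIn :: Poly -> Rational -> Rational -> Int
rootsIn p a b = count a - count b
  where count x = signChanges [eval q x | q <- sturm p]

main :: IO ()
main = print (rootsIn [0, -1, 0, 1] (-2) 2)   -- x^3 - x: 3 roots in (-2, 2]
```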
This paper defines a sound and complete semantic criterion, based on reducibility candidates, for strong normalization of theories expressed in minimal deduction modulo à la Curry. The use of Curry-style proof-terms allows us to build this criterion on the classic notion of pre-Heyting algebras, and makes the criterion apply to all theories expressed in minimal deduction modulo. Compared to using Church-style proof-terms, this method provides both a simpler definition of the criterion and a simpler proof of its completeness.
The Algebraic lambda-calculus and the Linear-Algebraic lambda-calculus extend the lambda-calculus with the possibility of making arbitrary linear combinations of terms. In this paper we provide a fine-grained, System F-like type system for the linear-algebraic lambda-calculus. We show that this "scalar" type system enjoys both the subject-reduction property and the strong-normalisation property, our main technical results. The latter yields a significant simplification of the linear-algebraic lambda-calculus itself, by removing the need for some restrictions in its reduction rules. But the more important, original feature of this scalar type system is that it keeps track of 'the amount of a type' that is present in each term. As an example of its use, we show that it can serve as a guarantee that the normal form of a term is barycentric, i.e. that its scalars sum to one.
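To make the barycentric property concrete, here is a hedged first-order sketch (our illustration, not the paper's calculus): terms as formal linear combinations with the usual scaling and addition laws, and the check that scalars sum to one.

```haskell
import Data.Map (Map)
import qualified Data.Map as Map

type Scalar = Rational

-- A linear combination of base terms, identified here simply by names.
newtype Comb = Comb (Map String Scalar) deriving Show

unit :: String -> Comb
unit x = Comb (Map.singleton x 1)

-- Mirrors the linearity axioms: a.(t + u) = a.t + a.u, etc.
scale :: Scalar -> Comb -> Comb
scale a (Comb m) = Comb (Map.filter (/= 0) (Map.map (a *) m))

-- Mirrors a.t + b.t = (a + b).t by collecting coefficients.
add :: Comb -> Comb -> Comb
add (Comb m) (Comb n) = Comb (Map.filter (/= 0) (Map.unionWith (+) m n))

-- The property tracked by the scalar type system: scalars sum to one.
barycentric :: Comb -> Bool
barycentric (Comb m) = sum (Map.elems m) == 1

main :: IO ()
main = do
  let t = add (scale (1/3) (unit "t")) (scale (2/3) (unit "u"))
  print (barycentric t)             -- True:  1/3 + 2/3 = 1
  print (barycentric (scale 2 t))   -- False: scalars now sum to 2
```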
Nominal abstract syntax is an approach to representing names and binding pioneered by Gabbay and Pitts. So far nominal techniques have mostly been studied using classical logic or model theory, not type theory. Nominal extensions to simple, dependent and ML-like polymorphic languages have been studied, but decidability and normalization results have only been established for simple nominal type theories. We present an LF-style dependent type theory extended with name-abstraction types, prove soundness and decidability of beta-eta-equivalence checking, discuss adequacy and canonical forms via an example, and discuss extensions such as dependently typed recursion and induction principles.
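For readers unfamiliar with the nominal approach, a small self-contained sketch (ours, not the paper's type theory) of its central device, the name-swapping operation, and the resulting comparison of name-abstractions up to alpha-equivalence:

```haskell
type Name = String

data Term = Var Name | App Term Term | Abs Name Term   -- Abs a t is <a>t
  deriving Show

-- (a b) . t : transpose the two names everywhere, including at binders.
swap :: Name -> Name -> Term -> Term
swap a b = go
  where
    sw n | n == a    = b
         | n == b    = a
         | otherwise = n
    go (Var n)   = Var (sw n)
    go (App t u) = App (go t) (go u)
    go (Abs n t) = Abs (sw n) (go t)

-- A name not occurring anywhere in the given terms.
freshFor :: [Term] -> Name
freshFor ts = concatMap names ts ++ "'"   -- longer than any occurring name
  where
    names (Var n)   = n
    names (App t u) = names t ++ names u
    names (Abs n t) = n ++ names t

-- <a>t = <b>u  iff  (a c).t = (b c).u for a fresh c.
aeq :: Term -> Term -> Bool
aeq (Var m)     (Var n)     = m == n
aeq (App t1 u1) (App t2 u2) = aeq t1 t2 && aeq u1 u2
aeq (Abs a t)   (Abs b u)   =
  let c = freshFor [Abs a t, Abs b u]
  in  aeq (swap a c t) (swap b c u)
aeq _ _ = False

main :: IO ()
main = print (aeq (Abs "x" (Var "x")) (Abs "y" (Var "y")))   -- True
```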
We present a reflexive tactic for deciding the equational theory of Kleene algebras in the Coq proof assistant. This tactic relies on a careful implementation of efficient finite automata algorithms, so that it solves casual equations instantaneously and properly scales to larger expressions. The decision procedure is proved correct and complete: correctness is established w.r.t. any model by formalising Kozen's initiality theorem; a counter-example is returned when the given equation does not hold. The correctness proof is challenging: it involves both a precise analysis of the underlying automata algorithms and a lot of algebraic reasoning. In particular, we have to formalise the theory of matrices over a Kleene algebra. We build on the recent addition of first-order typeclasses in Coq in order to work efficiently with the involved algebraic structures.
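The tactic's own implementation builds efficient automata; as a compact stand-in, the same equational theory can be decided by bisimulation over Brzozowski derivatives, sketched here under the assumption of a fixed finite alphabet (a fuel bound substitutes for the ACI-normalisation that would guarantee termination):

```haskell
data Re = Zero | One | Sym Char | Plus Re Re | Dot Re Re | Star Re
  deriving (Eq, Show)

-- Smart constructors: enough simplification for small examples.
plus :: Re -> Re -> Re
plus Zero b = b
plus a Zero = a
plus a b | a == b    = a
         | otherwise = Plus a b

dot :: Re -> Re -> Re
dot Zero _ = Zero
dot _ Zero = Zero
dot One b  = b
dot a One  = a
dot a b    = Dot a b

-- Does the denoted language contain the empty word?
nullable :: Re -> Bool
nullable Zero       = False
nullable One        = True
nullable (Sym _)    = False
nullable (Plus a b) = nullable a || nullable b
nullable (Dot a b)  = nullable a && nullable b
nullable (Star _)   = True

-- Brzozowski derivative: the words w such that c:w is in the language.
deriv :: Char -> Re -> Re
deriv _ Zero       = Zero
deriv _ One        = Zero
deriv c (Sym d)    = if c == d then One else Zero
deriv c (Plus a b) = plus (deriv c a) (deriv c b)
deriv c (Dot a b)
  | nullable a = plus (dot (deriv c a) b) (deriv c b)
  | otherwise  = dot (deriv c a) b
deriv c (Star a)   = dot (deriv c a) (Star a)

-- Bisimulation: equal nullability now, equivalent derivatives later.
-- Nothing means the fuel ran out (a real implementation normalises
-- expressions modulo ACI so that the state space is provably finite).
equiv :: Int -> [Char] -> Re -> Re -> Maybe Bool
equiv fuel sigma = go fuel []
  where
    go n seen a b
      | n <= 0                   = Nothing
      | (a, b) `elem` seen       = Just True
      | nullable a /= nullable b = Just False
      | otherwise                = foldr step (Just True) sigma
      where
        step c acc =
          case go (n - 1) ((a, b) : seen) (deriv c a) (deriv c b) of
            Just True -> acc
            r         -> r

main :: IO ()
main = print (equiv 100 "ab" e1 e2)   -- Just True: (a+b)* = (a*b*)*
  where
    e1 = Star (plus (Sym 'a') (Sym 'b'))
    e2 = Star (dot (Star (Sym 'a')) (Star (Sym 'b')))
```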
The paper describes the refinement algorithm for the Calculus of (Co)Inductive Constructions (CIC) implemented in the interactive theorem prover Matita. The refinement algorithm is in charge of giving a meaning to the terms, types and proof terms directly written by the user or generated by using tactics, decision procedures or general automation. The terms are written in an "external syntax" meant to be user-friendly, which allows the omission of information, untyped binders and a certain liberal use of user-defined sub-typing. The refiner modifies the terms to obtain related well-typed terms in the internal syntax understood by the kernel of the ITP. In particular, it acts as a type inference algorithm when all the binders are untyped. The proposed algorithm is bi-directional: given a term in external syntax and a type expected for the term, it propagates as much typing information as possible towards the leaves of the term. Traditional mono-directional algorithms, instead, proceed in a bottom-up way, inferring the type of a sub-term and comparing (unifying) it with the type expected by its context only at the end. We propose some novel bi-directional rules for CIC that are particularly effective. Among the benefits of bi-directionality we have better error message reporting and better inference of dependent types. Moreover, thanks to bi-directionality, the coercion system for sub-typing is more effective and type inference generates simpler unification problems that […]
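Bidirectionality is easiest to see far away from CIC; the following hedged sketch for the simply typed lambda-calculus (our illustration, not Matita's refiner) shows the check direction pushing an expected arrow type into an untyped binder:

```haskell
data Ty = Base String | Arr Ty Ty  deriving (Eq, Show)
data Tm = Var String | Lam String Tm | App Tm Tm | Ann Tm Ty

type Ctx = [(String, Ty)]

-- Inference (bottom-up): synthesise a type from the shape of the term.
infer :: Ctx -> Tm -> Maybe Ty
infer g (Var x)   = lookup x g
infer g (App t u) = do
  Arr a b <- infer g t      -- the function part must synthesise an arrow
  check g u a               -- the argument is then checked, not inferred
  return b
infer g (Ann t a) = do check g t a; return a
infer _ (Lam _ _) = Nothing -- an untyped binder cannot be synthesised

-- Checking (top-down): propagate the expected type towards the leaves.
check :: Ctx -> Tm -> Ty -> Maybe ()
check g (Lam x t) (Arr a b) = check ((x, a) : g) t b  -- binder x gets type a
check g t a = do
  b <- infer g t
  if a == b then Just () else Nothing

main :: IO ()
main = print (infer [] (Ann (Lam "x" (Var "x")) (Arr (Base "o") (Base "o"))))
-- Just (Arr (Base "o") (Base "o"))
```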
Dependently typed programs contain an excessive amount of static terms which are necessary to please the type checker but irrelevant for computation. To separate static and dynamic code, several static analyses and type systems have been put forward. We consider Pfenning's type theory with irrelevant quantification which is compatible with a type-based notion of equality that respects eta-laws. We extend Pfenning's theory to universes and large eliminations and develop its meta-theory. Subject reduction, normalization and consistency are obtained by a Kripke model over the typed equality judgement. Finally, a type-directed equality algorithm is described whose completeness is proven by a second Kripke model.
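For concreteness, one standard presentation of the irrelevant quantifier in this line of work (the notation here is our assumption and may differ from the paper's rules in detail): an application at an irrelevant Pi-type checks its argument in the "resurrected" context Γ⊕, where irrelevant hypotheses become usable, and definitional equality ignores such arguments.

```latex
% Typing and equality for irrelevant application (sketch; notation assumed):
\[
\frac{\Gamma \vdash t : \Pi x{\div}A.\, B \qquad \Gamma^{\oplus} \vdash u : A}
     {\Gamma \vdash t\,u : B[u/x]}
\qquad
\frac{\Gamma \vdash t = t' : \Pi x{\div}A.\, B \qquad
      \Gamma^{\oplus} \vdash u : A \qquad \Gamma^{\oplus} \vdash u' : A}
     {\Gamma \vdash t\,u = t'\,u' : B[u/x]}
\]
```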
We propose a synthesis of the two proof styles of interactive theorem proving: the procedural style (where proofs are scripts of commands, as in Coq) and the declarative style (where proofs are texts in a controlled natural language, as in Isabelle/Isar). Our approach combines the advantages of the declarative style (the possibility of writing formal proofs like normal mathematical text) and of the procedural style (strong automation and help with shaping the proofs, including determining the statements of intermediate steps). Our approach is new, and differs significantly from the ways in which the procedural and declarative proof styles have been combined before in the Isabelle, Ssreflect and Matita systems. Our approach is generic and can be implemented on top of any procedural interactive theorem prover, regardless of its architecture and logical foundations. To show the viability of our proposed approach, we fully implemented it as a proof interface called miz3, on top of the HOL Light interactive theorem prover. The declarative language that this interface uses is a slight variant of the language of the Mizar system, and can be used for any interactive theorem prover regardless of its logical foundations. The miz3 interface allows easy access to the full set of tactics and formal libraries of HOL Light, and as such has "industrial strength". Our approach gives a way to automatically convert any procedural proof to a declarative counterpart, where the […]
Mixing induction and coinduction, we study alternative definitions of streams being finitely red. We organize our definitions into a hierarchy including also some well-known alternatives in intuitionistic analysis. The hierarchy collapses classically, but is intuitionistically of strictly decreasing strength. We characterize the differences in strength in a precise way by weak instances of the Law of Excluded Middle.
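The mixed style can be suggested (only suggested: Haskell's laziness stands in for coinduction, and the witnesses below are not indexed by the stream they concern, as they would be in type theory) by layering an inductive witness over a coinductive one:

```haskell
-- A loose illustration, not the paper's development: a FinRed witness is
-- an inductive layer (finitely many Skips) over a coinductive one
-- (AllBlue, forever).
data Colour = Red | Blue

data Stream = Colour :> Stream
infixr 5 :>

-- Coinductive reading: every element from here on is Blue.
newtype AllBlue = BlueCons AllBlue

-- Inductive reading: after finitely many steps, the stream is AllBlue.
data FinRed = Done AllBlue | Skip FinRed

blues :: Stream
blues = Blue :> blues                 -- the all-blue stream

sample :: Stream
sample = Red :> Red :> blues          -- finitely red: two Reds, then blue

allBlues :: AllBlue
allBlues = BlueCons allBlues          -- productive corecursion

witness :: FinRed                     -- a witness for sample: skip twice
witness = Skip (Skip (Done allBlues))

main :: IO ()
main = case witness of
  Skip (Skip (Done _)) -> putStrLn "sample is finitely red"
  _                    -> putStrLn "unexpected"
```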