2009

Editors: Ralf Treinen, Jean Goubault-Larrecq

This special issue of Logical Methods in Computer Science contains extended and revised versions of eight papers selected by the Guest Editors from the thirty-nine presented at the 24th Annual IEEE Symposium on Logic in Computer Science (LICS 2009), which took place on 11--14 August 2009 in Los Angeles, California, USA. The submissions for the special issue underwent a new and thorough reviewing and revision process, in accordance with the usual standards of LMCS. We thank the authors and the reviewers for all their hard work in producing the contents of this issue.

The article on *The general vector addition system reachability problem by Presburger inductive invariants* by Jérôme Leroux considers the reachability problem for vector addition systems and Petri nets. The author extends the concept of semi-linear sets (of configurations) to that of semi-pseudo-linear sets and shows that these characterize the Parikh image of the set of reachable configurations of a vector addition system. He then uses this result to show that semi-linear sets are strong enough as inductive invariants for vector addition systems: for every unreachable configuration there is a semi-linear set that is an inductive invariant of the system and excludes this configuration. Such an inductive invariant serves as an unreachability certificate, and these certificates provide an alternative proof of the decidability of reachability for vector addition systems.
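To make the certificate idea concrete, here is a toy illustration (our own example, not Leroux's construction): for a two-counter vector addition system with actions (-1,1) and (1,-1) started in (1,0), the semi-linear set {(x,y) | x+y=1} is an inductive invariant that excludes the configuration (2,2), and therefore certifies its unreachability. A minimal sketch in Haskell, with all names our own:

```haskell
-- Toy VAS: two counters, actions (-1,1) and (1,-1), initial configuration (1,0).
type Conf = (Int, Int)

actions :: [Conf]
actions = [(-1, 1), (1, -1)]

-- Successors of a configuration (counters must stay non-negative).
step :: Conf -> [Conf]
step (x, y) = [(x + dx, y + dy) | (dx, dy) <- actions, x + dx >= 0, y + dy >= 0]

-- Candidate semi-linear invariant: x + y == 1.
inv :: Conf -> Bool
inv (x, y) = x + y == 1

-- Over non-negative integers this invariant denotes the finite set
-- {(0,1), (1,0)}, so inductiveness can be checked by enumeration.
invSet :: [Conf]
invSet = [(0, 1), (1, 0)]

-- The invariant holds initially, is preserved by every enabled action,
-- and excludes (2,2): together, an unreachability certificate for (2,2).
certifies :: Bool
certifies = inv (1, 0)
         && all (all inv . step) invSet
         && not (inv (2, 2))
```

Here `certifies` evaluates to `True`; in general the invariant need not denote a finite set, which is where Presburger (semi-linear) representations come in.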

The article on *Non-deterministic Kleene coalgebras* by Alexandra Silva, Marcello Bonsangue and Jan Rutten studies a generalization of regular expressions to the setting of coalgebras for so-called `non-deterministic' endofunctors on the category of sets. The authors present a generic syntax of expressions associated with each such endofunctor. They define a semantics under which every expression denotes a behaviour, that is, an element of the final coalgebra; and they give a sound and complete axiomatization of bisimilarity of expressions. The work encompasses and extends the classic results of Kleene on regular languages and deterministic finite automata and of Milner on regular behaviours and finite labelled transition systems.

The article on *The complexity of global cardinality constraints* by Andrei A. Bulatov and Dániel Marx describes algorithms and hardness results for a particular case of constraint satisfaction problems: constraint problems with counting constraints over an additional constraint language. The paper characterizes the complexity of this problem in terms of properties of the additional constraint language, identifying the languages for which the problem is solvable in polynomial time and those for which it is NP-complete.

The article on *Psi-calculi: a framework for mobile processes with nominal data and logic* by Jesper Bengtson, Magnus Johansson, Joachim Parrow and Björn Victor presents a new framework for defining pi-calculus-like process calculi equipped with nominal data terms, conditions and logical assertions. The nominal data can be transmitted between processes and their names can be statically scoped as in the standard pi-calculus. The phenomena described by many existing extensions of the pi-calculus, such as the applied pi-calculus, the spi-calculus, the fusion calculus and the concurrent constraint pi-calculus, are shown to fit within the psi-calculus framework. The authors develop a notion of labelled bisimulation for psi-calculi and give fully formalized proofs of congruence and algebraic properties. The formalization is carried out using the Nominal Package of the Isabelle/HOL prover and is probably the most extensive machine-assisted formalization of process calculi to date.

This special issue contains selected papers of the 20th International Conference on Rewriting Techniques and Applications (RTA 2009), which was held from June 29 to July 1, 2009, in Brasília, Brazil, as part of the 5th International Conference on Rewriting, Deduction, and Programming (RDP 2009).

RTA is the major forum for the presentation of research on all aspects of rewriting. Previous RTA conferences were held in Dijon (1985), Bordeaux (1987), Chapel Hill (1989), Como (1991), Montreal (1993), Kaiserslautern (1995), Rutgers (1996), Sitges (1997), Tsukuba (1998), Trento (1999), Norwich (2000), Utrecht (2001), Copenhagen (2002), Valencia (2003), Aachen (2004), Nara (2005), Seattle (2006), Paris (2007), and Hagenberg (2008).

Ten out of the 26 papers presented at the conference were invited to submit an extended version to this special issue, including the two papers that had received the Best Paper Award at the conference: *An Explicit Framework for Interaction Nets* by Marc de Falco, and *From Outermost to Context-Sensitive Rewriting* by Jörg Endrullis and Dimitri Hendriks. Each submitted paper was refereed by at least two expert referees, at least one of whom had not been a referee of the conference submission. Seven papers were finally accepted for inclusion in this special issue.

Our thanks go to the members of the Program Committee of RTA 2009 and their subreferees, as well as to all those who served in addition as reviewers for this special issue. We would also like to thank Benjamin Pierce, Managing Editor of LMCS, for having invited us to edit this special issue, and for his assistance and guidance.

Jean Goubault-Larrecq and Ralf Treinen

Guest Editors


We define two transformations from term rewriting systems (TRSs) to context-sensitive TRSs in such a way that termination of the target system implies outermost termination of the original system. In the transformation based on 'context extension', each outermost rewrite step is modeled by exactly one step in the transformed system. This transformation turns out to be complete for the class of left-linear TRSs. The second transformation, called 'dynamic labeling', results in smaller context-sensitive TRSs; here each modeled step is adjoined with a small number of auxiliary steps. As a result, state-of-the-art termination methods for context-sensitive rewriting become available for proving termination of outermost rewriting. Both transformations have been implemented in Jambox, making it the most successful tool in the outermost rewriting category of the last edition of the annual termination competition.
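A minimal illustration of why outermost termination is weaker than full termination (our own toy example, not the paper's transformation): in the TRS {f(x) → a, b → b}, the term f(b) admits an infinite inner rewrite sequence via b → b, yet outermost rewriting always contracts the root redex first and terminates. A sketch in Haskell:

```haskell
-- Terms over the signature {f/1, a/0, b/0}.
data Term = F Term | A | B deriving (Eq, Show)

-- One outermost rewrite step for the TRS {f(x) -> a, b -> b}:
-- a redex at the root is contracted before any redex below it.
-- (For this TRS, f(x) matches any argument, so we never need to descend.)
outermostStep :: Term -> Maybe Term
outermostStep (F _) = Just A   -- rule f(x) -> a applies at the root
outermostStep B     = Just B   -- rule b -> b (a loop, if ever selected)
outermostStep A     = Nothing  -- a is a normal form

-- Repeatedly apply outermost steps, with a fuel bound so a looping
-- term is reported (Nothing) instead of making the search diverge.
outermostNF :: Int -> Term -> Maybe Term
outermostNF 0 _    = Nothing
outermostNF fuel t = case outermostStep t of
  Nothing -> Just t
  Just t' -> outermostNF (fuel - 1) t'
```

Here `outermostNF 10 (F B)` yields `Just A` (the root redex wins), while `outermostNF 10 B` exhausts its fuel in the loop b → b.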

We consider the problem of intruder deduction in security protocol analysis: that is, deciding whether a given message M can be deduced from a set of messages Gamma under the theory of blind signatures and arbitrary convergent equational theories modulo associativity and commutativity (AC) of certain binary operators. The traditional formulations of intruder deduction are usually given in natural-deduction-like systems and proving decidability requires significant effort in showing that the rules are "local" in some sense. By using the well-known translation between natural deduction and sequent calculus, we recast the intruder deduction problem as proof search in sequent calculus, in which locality is immediate. Using standard proof theoretic methods, such as permutability of rules and cut elimination, we show that the intruder deduction problem can be reduced, in polynomial time, to the elementary deduction problem, which amounts to solving certain equations in the underlying individual equational theories. We show that this result extends to combinations of disjoint AC-convergent theories whereby the decidability of intruder deduction under the combined theory reduces to the decidability of elementary deduction in each constituent theory. To further demonstrate the utility of the sequent-based approach, we show that, for Dolev-Yao intruders, our sequent-based techniques can be used to solve the more difficult problem of solving deducibility constraints, where the […]
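For orientation, in the plain Dolev-Yao case (pairing and symmetric encryption, no equational theory) intruder deduction can be decided by a simple analysis/synthesis saturation. The following Haskell sketch illustrates only that baseline, under these simplifying assumptions; it is far from the paper's sequent-calculus treatment of AC-convergent theories, and all names are our own:

```haskell
import qualified Data.Set as Set
import Data.Set (Set)

-- Dolev-Yao messages: atomic names, pairs, symmetric encryption.
data Msg = Name String | Pair Msg Msg | Enc Msg Msg  -- Enc m k: m under key k
  deriving (Eq, Ord, Show)

-- Synthesis: can m be built from s by pairing and encryption?
synth :: Set Msg -> Msg -> Bool
synth s m | Set.member m s = True
synth s (Pair a b) = synth s a && synth s b
synth s (Enc m k)  = synth s m && synth s k
synth _ _          = False

-- Analysis: decompose pairs, and decrypt when the key is derivable.
analyzeStep :: Set Msg -> Set Msg
analyzeStep s = Set.unions (s : [parts m | m <- Set.toList s])
  where
    parts (Pair a b)            = Set.fromList [a, b]
    parts (Enc m k) | synth s k = Set.singleton m
    parts _                     = Set.empty

-- Saturate the knowledge set under analysis.
saturate :: Set Msg -> Set Msg
saturate s = let s' = analyzeStep s in if s' == s then s else saturate s'

-- Intruder deduction: Gamma |- m.
deducible :: [Msg] -> Msg -> Bool
deducible gamma m = synth (saturate (Set.fromList gamma)) m
```

For example, `deducible [Enc (Name "secret") (Name "k"), Name "k"] (Name "secret")` holds, but without the key the secret is not deducible.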

The Description Logic EL has recently drawn considerable attention since, on the one hand, important inference problems such as the subsumption problem are polynomial. On the other hand, EL is used to define large biomedical ontologies. Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. The main result of this paper is that unification in EL is decidable. More precisely, EL-unification is NP-complete, and thus has the same complexity as EL-matching. We also show that, w.r.t. the unification type, EL is less well-behaved: it is of type zero, which in particular implies that there are unification problems that have no finite complete set of unifiers.

Properties of Term Rewriting Systems are called modular iff they are preserved under (and reflected by) disjoint union, i.e. when combining two Term Rewriting Systems with disjoint signatures. Convergence is the property of Infinitary Term Rewriting Systems that all reduction sequences converge to a limit. Strong Convergence requires in addition that redex positions in a reduction sequence move arbitrarily deep. In this paper it is shown that both Convergence and Strong Convergence are modular properties of non-collapsing Infinitary Term Rewriting Systems, provided (for convergence) that the term metrics are granular. This generalises known modularity results beyond metric ∞.

Streams are infinite sequences over a given data type. A stream specification is a set of equations intended to define a stream. We propose a transformation from such a stream specification to a term rewriting system (TRS) in such a way that termination of the resulting TRS implies that the stream specification is well-defined, that is, admits a unique solution. As a consequence, proving well-definedness of several interesting stream specifications can be done fully automatically using present powerful tools for proving TRS termination. In order to increase the power of this approach, we investigate transformations that preserve semantics and well-definedness. We give examples for which the above-mentioned technique applies to the transformed specification while it fails for the original one.
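To make the notion concrete, here are toy stream specifications in Haskell notation (our own examples, not from the paper). Well-definedness — a unique solution — is exactly what a termination proof of the transformed TRS certifies:

```haskell
-- A well-defined stream specification: its unique solution is 0,0,0,...
zeros :: [Integer]
zeros = 0 : zeros

-- Also well-defined: the periodic stream 0,1,0,1,...
alt :: [Integer]
alt = 0 : 1 : alt

-- Not well-defined: every constant stream solves  bad = tail bad,
-- so the equation has no unique solution; operationally the
-- definition is unproductive (evaluating any element diverges).
bad :: [Integer]
bad = tail bad
```

Evaluating `take 4 zeros` gives `[0,0,0,0]` and `take 4 alt` gives `[0,1,0,1]`, whereas any attempt to observe `bad` loops.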

The characterisation of termination using well-founded monotone algebras has been a milestone on the way to automated termination techniques, of which we have seen an extensive development over the past years. Both the semantic characterisation and most known termination methods are concerned with global termination, uniformly of all the terms of a term rewriting system (TRS). In this paper we consider local termination, of specific sets of terms within a given TRS. The principal goal of this paper is generalising the semantic characterisation of global termination to local termination. This is made possible by admitting the well-founded monotone algebras to be partial. We also extend our approach to local relative termination. The interest in local termination naturally arises in program verification, where one is typically interested only in sensible inputs, or just wants to characterise the set of inputs for which a program terminates. Local termination is also of interest when dealing with a specific class of terms within a TRS that is known to be non-terminating, such as combinatory logic (CL) or a TRS encoding recursive program schemes or Turing machines. We show how some of the well-known techniques for proving global termination, such as stepwise removal of rewrite rules and semantic labelling, can be adapted to the local case. We also describe transformations reducing local to global termination problems. The resulting techniques for proving local termination […]

Interaction nets are a graphical formalism inspired by Linear Logic proof-nets, often used for studying higher-order rewriting, e.g. β-reduction. Traditional presentations of interaction nets are based on graph theory and rely on elementary graph-theoretic properties. We give here a more explicit presentation based on notions borrowed from Girard's Geometry of Interaction: interaction nets are presented as partial permutations, and a composition of nets, the gluing, is derived from the execution formula. We then define contexts and reduction as the context closure of rules. We prove strong confluence of the reduction within our framework and show how interaction nets can be viewed as the quotient of some generalized proof-nets.