Volume 2, Issue 1

2006


1. Computably Based Locally Compact Spaces

Paul Taylor.
ASD (Abstract Stone Duality) is a re-axiomatisation of general topology in which the topology on a space is treated, not as an infinitary lattice, but as an exponential object of the same category as the original space, with an associated lambda-calculus. In this paper, this axiomatisation is shown to be equivalent to a notion of computable basis for locally compact sober spaces or locales, involving a family of open subspaces and an accompanying family of compact ones. This generalises Smyth's effectively given domains and Jung's strong proximity lattices. Part of the data for a basis is the inclusion relation of compact subspaces within open ones, which is formulated in locale theory as the way-below relation on a continuous lattice. The finitary properties of this relation are characterised here, including the Wilker condition for the cover of a compact space by two open ones. The real line is used as a running example, being closely related to Scott's domain of intervals. ASD does not use the category of sets, but the full subcategory of overt discrete objects plays this role; it is an arithmetic universe (pretopos with lists). In particular, we use this subcategory to translate computable bases for classical spaces into objects in the ASD calculus.
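
As background for the way-below relation mentioned above (the standard order-theoretic definition, not the paper's ASD formulation): for elements x, y of a complete lattice,

\[
  x \ll y \quad\Longleftrightarrow\quad \text{for every directed } D \text{ with } y \le \bigvee D \text{ there is some } d \in D \text{ with } x \le d .
\]

In the lattice of open sets of a locally compact space, U is way below V exactly when some compact subspace K satisfies U ⊆ K ⊆ V, which is the inclusion of compact subspaces within open ones referred to in the abstract.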

2. Model Checking Probabilistic Pushdown Automata

Javier Esparza ; Antonin Kucera ; Richard Mayr.
We consider the model checking problem for probabilistic pushdown automata (pPDA) and properties expressible in various probabilistic logics. We start with properties that can be formulated as instances of a generalized random walk problem. We prove that both qualitative and quantitative model checking for this class of properties and pPDA are decidable. Then we show that model checking for the qualitative fragment of the logic PCTL and pPDA is also decidable. Moreover, we develop an error-tolerant model checking algorithm for PCTL and the subclass of stateless pPDA. Finally, we consider the class of omega-regular properties and show that both qualitative and quantitative model checking for pPDA are decidable.
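
For orientation (a standard presentation of pPDA, sketched here rather than quoted from the paper): the transition rules of a probabilistic pushdown automaton have the form

\[
  pX \xrightarrow{\;x\;} q\alpha , \qquad x \in (0,1] ,
\]

where p and q are control states, X is a stack symbol, alpha is a finite word of stack symbols, and the probabilities of all rules with the same left-hand side pX sum to 1. The qualitative fragment of PCTL is usually understood as the fragment in which the probability bounds of the path quantifier are restricted to =1 and >0.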

3. Theories for TC0 and Other Small Complexity Classes

Phuong Nguyen ; Stephen Cook.
We present a general method for introducing finitely axiomatizable "minimal" two-sorted theories for various subclasses of P (problems solvable in polynomial time). The two sorts are natural numbers and finite sets of natural numbers. The latter are essentially the finite binary strings, which provide a natural domain for defining the functions and sets in small complexity classes. We concentrate on the complexity class TC^0, whose problems are defined by uniform polynomial-size families of bounded-depth Boolean circuits with majority gates. We present an elegant theory VTC^0 in which the provably-total functions are those associated with TC^0, and then prove that VTC^0 is "isomorphic" to a different-looking single-sorted theory introduced by Johannsen and Pollet. The most technical part of the isomorphism proof is defining binary number multiplication in terms of a bit-counting function, and showing how to formalize the proofs of its algebraic properties.
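
To illustrate the two-sorted setting (the standard syntax used in this line of work, sketched as background rather than quoted from the paper): lower-case variables x, y, z range over numbers and upper-case variables X, Y, Z over finite sets, with X(i) meaning that i belongs to X and |X| bounding the elements of X (the length of the corresponding binary string). The base theory is axiomatized by a small list of basic axioms together with bounded comprehension for formulas phi without set quantifiers,

\[
  \exists X \le y \;\forall i < y \;\bigl( X(i) \leftrightarrow \varphi(i) \bigr) ,
\]

and VTC^0 then adds an axiom asserting, for every set X, the existence of a counting sequence that records how many elements of X lie below each position; this is the bit-counting function mentioned above.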

4. Approximate reasoning for real-time probabilistic processes

Vineet Gupta ; Radha Jagadeesan ; Prakash Panangaden.
We develop a pseudo-metric analogue of bisimulation for generalized semi-Markov processes. The kernel of this pseudo-metric corresponds to bisimulation; thus we have extended bisimulation for continuous-time probabilistic processes to a much broader class of distributions than exponential distributions. This pseudo-metric gives a useful handle on approximate reasoning in the presence of numerical information -- such as probabilities and time -- in the model. We give a fixed point characterization of the pseudo-metric. This makes available coinductive reasoning principles for reasoning about distances. We demonstrate that our approach is insensitive to potentially ad hoc articulations of distance by showing that it is intrinsic to an underlying uniformity. We provide a logical characterization of this uniformity using a real-valued modal logic. We show that several quantitative properties of interest are continuous with respect to the pseudo-metric. Thus, if two processes are metrically close, then observable quantitative properties of interest are indeed close.
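
In standard terminology (background, not the paper's specific construction): a pseudo-metric on a set of states S is a function d from pairs of states to the non-negative reals satisfying

\[
  d(s,s) = 0 , \qquad d(s,t) = d(t,s) , \qquad d(s,u) \le d(s,t) + d(t,u) ,
\]

but, unlike a metric, it may assign distance 0 to distinct states. Its kernel is the relation consisting of the pairs at distance 0; the result stated above is that this kernel coincides with bisimilarity.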

5. Algorithmic correspondence and completeness in modal logic. I. The core algorithm SQEMA

Willem Conradie ; Valentin Goranko ; Dimiter Vakarelov.
Modal formulae express monadic second-order properties on Kripke frames, but in many important cases these have first-order equivalents. Computing such equivalents is important for both logical and computational reasons. On the other hand, canonicity of modal formulae is important, too, because it implies frame-completeness of logics axiomatized with canonical formulae. Computing a first-order equivalent of a modal formula amounts to elimination of second-order quantifiers. Two algorithms have been developed for second-order quantifier elimination: SCAN, based on constraint resolution, and DLS, based on a logical equivalence established by Ackermann. In this paper we introduce a new algorithm, SQEMA, for computing first-order equivalents (using a modal version of Ackermann's lemma) and, moreover, for proving canonicity of modal formulae. Unlike SCAN and DLS, it works directly on modal formulae, thus avoiding Skolemization and the subsequent problem of unskolemization. We present the core algorithm and illustrate it with some examples. We then prove its correctness and the canonicity of all formulae on which the algorithm succeeds. We show that it succeeds not only on all Sahlqvist formulae, but also on the larger class of inductive formulae, introduced in our earlier papers. Thus, we develop a purely algorithmic approach to proving canonical completeness in modal logic and, in particular, establish one of the most general completeness results in modal logic so far.
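
The first-order form of Ackermann's lemma that underlies such algorithms can be stated as follows (the paper uses a modal analogue, so this is background rather than the paper's own formulation). If the predicate variable P does not occur in A and occurs only negatively in B, then

\[
  \exists P \,\bigl[\, \forall \bar{x}\, \bigl( A(\bar{x}) \rightarrow P(\bar{x}) \bigr) \;\wedge\; B(P) \,\bigr]
  \;\equiv\;
  B[P := A] ,
\]

where B[P := A] substitutes A for every occurrence of P. Because B is antitone in P, it suffices to instantiate P with the least predicate satisfying the first conjunct, namely A itself; this is how the second-order quantifier is eliminated.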

6. Extending the theory of Owicki and Gries with a logic of progress

Brijesh Dongol ; Doug Goldson.
This paper describes a logic of progress for concurrent programs. The logic is based on that of UNITY, molded to fit a sequential programming model. Integration of the two is achieved by using auxiliary variables in a systematic way that incorporates program counters into the program text. The rules for progress in UNITY are then modified to suit this new system. This modification is, however, subtle enough to allow the theory of Owicki and Gries to be used without change.
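
For orientation (the standard UNITY rules, not the modified rules developed in the paper): the leads-to relation p |-> q is the smallest relation closed under the rules

\[
  \frac{p \;\mathsf{ensures}\; q}{p \mapsto q} , \qquad
  \frac{p \mapsto q \quad\; q \mapsto r}{p \mapsto r} , \qquad
  \frac{\forall i \in I .\; p_i \mapsto q}{\bigl( \exists i \in I .\; p_i \bigr) \mapsto q} ,
\]

where p ensures q requires that p remains true unless q is established and that some statement of the program is guaranteed to establish q from p. The paper adapts rules of this kind to a sequential programming model by making program counters explicit as auxiliary variables.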