Selected Papers of the 24th International Conference on Concurrency Theory (CONCUR 2013)

Editors: Pedro R. D'Argenio, Davide Sangiorgi, Hernán Melgratti


1. The Power of Priority Channel Systems

Christoph Haase, Sylvain Schmitz, Philippe Schnoebelen.
We introduce Priority Channel Systems, a new class of channel systems where messages carry a numeric priority and where higher-priority messages can supersede lower-priority messages preceding them in the FIFO communication buffers. The decidability of safety and inevitability properties is shown via the introduction of a priority embedding, a well-quasi-ordering that has not previously been used in well-structured systems. We then show how Priority Channel Systems can compute Fast-Growing functions and prove that the aforementioned verification problems are $\mathbf{F}_{\varepsilon_{0}}$-complete.
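To make the superseding mechanism concrete, here is a small Python sketch of one plausible reading of the channel semantics. The adjacent-pair formulation below is our simplification, not necessarily the paper's exact operational rule: a message may erase a lower-priority message sitting immediately ahead of it in the buffer, and repeated steps erase longer runs.

```python
# A sketch of one plausible reading of the superseding step (our
# simplification): a message may erase a lower-priority message that
# sits immediately ahead of it in the FIFO buffer.
from typing import List, Tuple

Message = Tuple[str, int]  # (payload, priority)

class PriorityChannel:
    def __init__(self) -> None:
        self.buffer: List[Message] = []  # index 0 is the front

    def send(self, payload: str, priority: int) -> None:
        self.buffer.append((payload, priority))

    def receive(self) -> Message:
        return self.buffer.pop(0)  # ordinary FIFO read

    def supersede_steps(self) -> List[List[Message]]:
        """Buffers reachable in one superseding step: drop a message whose
        immediate successor carries a strictly higher priority."""
        return [self.buffer[:i] + self.buffer[i + 1:]
                for i in range(len(self.buffer) - 1)
                if self.buffer[i][1] < self.buffer[i + 1][1]]

ch = PriorityChannel()
for msg in [("a", 0), ("b", 2), ("c", 1)]:
    ch.send(*msg)
print(ch.supersede_steps())  # [[('b', 2), ('c', 1)]]: ('a', 0) superseded
```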

2. Cost Preserving Bisimulations for Probabilistic Automata

Andrea Turrini, Holger Hermanns.
Probabilistic automata constitute a versatile and elegant model for concurrent probabilistic systems. They are equipped with a compositional theory supporting abstraction, enabled by weak probabilistic bisimulation serving as the reference notion for summarising the effect of abstraction. This paper considers probabilistic automata augmented with costs. It extends the notions of weak transitions in probabilistic automata in such a way that the costs incurred along a weak transition are captured. This gives rise to cost-preserving and cost-bounding variations of weak probabilistic bisimilarity, for which we establish compositionality properties with respect to parallel composition. Furthermore, polynomial-time decision algorithms are proposed that can be used effectively to compute reward-bounding abstractions of Markov decision processes in a compositional manner.
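As a toy illustration of costs accumulating along weak transitions, the following sketch (our own simplification, not the paper's construction) computes the expected cost of weakly performing a visible action under a fixed deterministic scheduler on a finite, acyclic automaton; all state names and numbers are invented.

```python
# A toy sketch (our simplification, not the paper's construction):
# expected cost accumulated along a weak transition, i.e. internal steps
# followed by one visible action, under a fixed deterministic scheduler.
TAU = "tau"

# transitions[state] = list of (action, cost, {successor: probability});
# all names and numbers below are invented for illustration.
transitions = {
    "s0": [(TAU, 2.0, {"s1": 0.5, "s2": 0.5})],
    "s1": [("a", 1.0, {"s3": 1.0})],
    "s2": [("a", 3.0, {"s3": 1.0})],
    "s3": [],
}

def expected_cost(state, visible):
    """Expected cost of weakly performing `visible` from `state`, always
    taking the first enabled transition (assumes an acyclic automaton)."""
    if not transitions[state]:
        return 0.0
    action, cost, succ = transitions[state][0]
    if action == visible:
        return cost  # the visible step completes the weak transition
    return cost + sum(p * expected_cost(t, visible)
                      for t, p in succ.items())

print(expected_cost("s0", "a"))  # 2.0 + 0.5 * 1.0 + 0.5 * 3.0 = 4.0
```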

3. Aspect-oriented linearizability proofs

Soham Chakraborty, Thomas A. Henzinger, Ali Sezgin, Viktor Vafeiadis.
Linearizability of concurrent data structures is usually proved by monolithic simulation arguments relying on the identification of the so-called linearization points. Regrettably, such proofs, whether manual or automatic, are often complicated and scale poorly to advanced non-blocking concurrency patterns, such as helping and optimistic updates. In response, we propose a more modular way of checking linearizability of concurrent queue algorithms that does not involve identifying linearization points. We reduce the task of proving linearizability with respect to the queue specification to establishing four basic properties, each of which can be proved independently by simpler arguments. As a demonstration of our approach, we verify the Herlihy and Wing queue, an algorithm that is challenging to verify by a simulation proof.
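The flavour of a property-based decomposition can be illustrated on complete sequential histories. The sketch below is a simplification of ours (the paper's four properties are stated for concurrent histories, and we do not reuse its names); it flags two kinds of queue violations: dequeuing a value that was never enqueued, and dequeuing out of FIFO order.

```python
# A hedged illustration (our simplification; the paper's four properties
# are stated for concurrent histories): flagging queue violations on a
# complete *sequential* history of enqueue/dequeue events.
from collections import deque

def check_history(events):
    """events: list of ("enq", v) or ("deq", v), values assumed distinct.
    Flags dequeues of never-enqueued values and FIFO-order violations."""
    pending = deque()   # values enqueued but not yet dequeued, in order
    violations = []
    for kind, v in events:
        if kind == "enq":
            pending.append(v)
        elif v not in pending:
            violations.append(f"deq({v}): value never enqueued")
        elif pending[0] != v:
            violations.append(f"deq({v}): overtakes older value {pending[0]}")
            pending.remove(v)
        else:
            pending.popleft()
    return violations

print(check_history([("enq", 1), ("enq", 2), ("deq", 2)]))
# ['deq(2): overtakes older value 1']
```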

4. Mutually Testing Processes

Giovanni Bernardi, Matthew Hennessy.
In the standard testing theory of De Nicola and Hennessy, one process is considered to be a refinement of another if every test guaranteed by the former is also guaranteed by the latter. In the domain of web services this has been recast, with processes viewed as servers and tests as clients. In this way the standard refinement preorder between servers is determined by their ability to satisfy clients. But in this setting there is also a natural refinement preorder between clients, determined by their ability to be satisfied by servers. In more general settings where there is no distinction between clients and servers, but all processes are peers, there is a further refinement preorder based on the mutual satisfaction of peers. We give a uniform account of these three preorders. In particular, we give two characterisations. The first is behavioural, in terms of traces and ready sets. The second, for finite processes, is equational.
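A minimal sketch of must-style testing, under simplifying assumptions of our own (finite acyclic transition systems, synchronisation on shared action names, success reported by a distinguished client action "ok"): a client is guaranteed by a server if every maximal interaction reaches a state where the client can report success.

```python
# A minimal sketch of must-style testing under simplifying assumptions
# (finite acyclic LTSs, synchronisation on shared action names, success
# reported by the distinguished client action "ok").
def must_pass(server, client, s0="s0", c0="c0"):
    """server/client: dict state -> list of (action, next_state).
    True iff every maximal interaction reaches a client state where
    the success action "ok" is enabled."""
    def explore(s, c):
        if any(a == "ok" for a, _ in client.get(c, [])):
            return True  # the client can report success here
        steps = [(s2, c2)
                 for a, s2 in server.get(s, [])
                 for b, c2 in client.get(c, []) if a == b]
        if not steps:
            return False  # deadlock before success
        return all(explore(s2, c2) for s2, c2 in steps)
    return explore(s0, c0)

server = {"s0": [("coin", "s1")], "s1": [("coffee", "s2")]}
client = {"c0": [("coin", "c1")], "c1": [("coffee", "c2")],
          "c2": [("ok", "c3")]}
print(must_pass(server, client))  # True: this server guarantees the client
```

The server-refinement preorder compares servers by the set of clients they guarantee in this sense; the client and peer preorders of the paper arise by varying which side (or both) must be satisfied.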

5. Thermodynamic graph-rewriting

Vincent Danos, Russell Harmer, Ricardo Honorato-Zimmer.
We develop a new thermodynamic approach to stochastic graph-rewriting. The ingredients are a finite set of reversible graph-rewriting rules called generating rules, a finite set of connected graphs P called energy patterns, and an energy cost function. The idea is that the generators define the qualitative dynamics, by showing which transformations are possible, while the energy patterns and cost function specify the long-term probability $\pi$ of any reachable graph. Given the generators and energy patterns, we construct a finite set of rules which (i) has the same qualitative transition system as the generators; and (ii) when equipped with suitable rates, defines a continuous-time Markov chain of which $\pi$ is the unique fixed point. The construction relies on the use of site graphs and a technique of "growth policy" for quantitative rule refinement, which is of independent interest. This division of labour between the qualitative and long-term quantitative aspects of the dynamics leads to intuitive and concise descriptions for realistic models (see the examples in §4 and §5). It also guarantees thermodynamical consistency (a.k.a. detailed balance), otherwise known to be undecidable, which is important for some applications. Finally, it leads to parsimonious parameterizations of models, again an important point in some applications.
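To see how energy patterns pin down the long-term probabilities, here is an illustrative sketch, not the paper's construction: the energy of a state sums pattern costs weighted by match counts, and Metropolis-style rates make $\pi(G) \propto e^{-E(G)}$ satisfy detailed balance. The states below are toy dictionaries of match counts rather than real site graphs.

```python
# An illustrative sketch (not the paper's construction): energies from
# pattern match counts, and Metropolis-style rates satisfying detailed
# balance for pi(G) proportional to exp(-E(G)).
import math

def energy(state, cost):
    """state: toy stand-in for a graph, mapping each energy pattern to
    its match count; cost: pattern -> energy cost of one match."""
    return sum(cost[p] * n for p, n in state.items())

def rate(e_from, e_to):
    # min(1, exp(E(G) - E(G'))) ensures
    # pi(G) * rate(G -> G') == pi(G') * rate(G' -> G)
    return min(1.0, math.exp(e_from - e_to))

cost = {"edge": 1.5}                     # one hypothetical energy pattern
g1, g2 = {"edge": 2}, {"edge": 3}        # two toy reachable states
e1, e2 = energy(g1, cost), energy(g2, cost)
lhs = math.exp(-e1) * rate(e1, e2)
rhs = math.exp(-e2) * rate(e2, e1)
print(abs(lhs - rhs) < 1e-12)            # True: detailed balance holds
```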