We introduce refutationally complete superposition calculi for intensional
and extensional clausal $\lambda$-free higher-order logic, two formalisms that
allow partial application and applied variables. The calculi are parameterized
by a term order that need not be fully monotonic, making it possible to employ
the $\lambda$-free higher-order lexicographic path and Knuth-Bendix orders. We
implemented the calculi in the Zipperposition prover and evaluated them on
Isabelle/HOL and TPTP benchmarks. They appear promising as a stepping stone
towards complete, highly efficient automatic theorem provers for full
higher-order logic.
Probabilistic logic programming is increasingly important in artificial
intelligence and related fields as a formalism to reason about uncertainty. It
generalises logic programming with the possibility of annotating clauses with
probabilities. This paper proposes a coalgebraic semantics for probabilistic
logic programming. Programs are modelled as coalgebras for a certain functor F,
and two semantics are given in terms of cofree coalgebras. First, the cofree
F-coalgebra yields a semantics in terms of derivation trees. Second, by
embedding F into another functor G, the cofree G-coalgebra yields a `possible
worlds' interpretation of programs, from which one may recover the usual
distribution semantics of probabilistic logic programming. Furthermore, we show
that a similar approach can be used to provide a coalgebraic semantics to
weighted logic programming.
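As a concrete illustration of the distribution semantics mentioned above, here is a minimal Python sketch, independent of the coalgebraic machinery; the toy program and predicate names are invented for the example. A probabilistic logic program annotates facts with probabilities, and the probability of a query is the total weight of the possible worlds in which the query holds.

```python
from itertools import product

# Toy program: two probabilistic edge facts, plus the deterministic rule
# path(X,Y) <- edge(X,Y) ; edge(X,Z), path(Z,Y).
prob_facts = {("edge", "a", "b"): 0.6,
              ("edge", "b", "c"): 0.5}

def path_holds(chosen, x, y):
    # Reachability over the chosen edges (the deterministic part).
    frontier, seen = [x], set()
    while frontier:
        n = frontier.pop()
        if n == y:
            return True
        if n in seen:
            continue
        seen.add(n)
        frontier.extend(b for (p, a, b) in chosen if p == "edge" and a == n)
    return False

def query_probability(query):
    # Distribution semantics: sum the weights of all possible worlds
    # (truth assignments to the probabilistic facts) where the query holds.
    facts, probs = zip(*prob_facts.items())
    total = 0.0
    for world in product([True, False], repeat=len(facts)):
        w_prob = 1.0
        for choice, p in zip(world, probs):
            w_prob *= p if choice else 1 - p
        chosen = [f for f, choice in zip(facts, world) if choice]
        if path_holds(chosen, *query):
            total += w_prob
    return total

print(query_probability(("a", "c")))  # 0.3 = 0.6 * 0.5
```

Enumerating worlds is exponential in the number of probabilistic facts, of course; the point is only to make the semantics tangible.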
A labelled Markov process (LMP) consists of a measurable space $S$ together
with an indexed family of Markov kernels from $S$ to itself. This structure has
been used to model probabilistic computations in Computer Science, and one of
the main problems in the area is to define and decide whether two LMPs $S$ and
$S'$ "behave the same". There are two natural categorical definitions of
sameness of behavior: $S$ and $S'$ are bisimilar if there exists an LMP $T$ and
measure-preserving maps forming a diagram of the shape $S \leftarrow T
\rightarrow S'$; and they are behaviorally equivalent if there exists some $U$
and maps forming a dual diagram $S \rightarrow U \leftarrow S'$.
These two notions differ for general measurable spaces, but Doberkat
(extending a result by Edalat) proved that they coincide for analytic Borel
spaces, showing that from every diagram $S \rightarrow U \leftarrow S'$ one can
obtain a bisimilarity diagram as above. Moreover, the resulting square of
measure-preserving maps is commutative (a semipullback).
In this paper, we extend the previous result to measurable spaces $S$
isomorphic to a universally measurable subset of a Polish space with the trace
of the Borel $\sigma$-algebra, using a version of Strassen's theorem on common
extensions of finitely additive measures.
We apply fundamental notions of Bishop set theory (BST), an informal theory
that complements Bishop's theory of sets, to the theory of Bishop spaces, a
function-theoretic approach to constructive topology. Within BST we develop the
notions of a direct family of sets, of a direct spectrum of Bishop spaces, of
the direct limit of a direct spectrum of Bishop spaces, and of the inverse
limit of a contravariant direct spectrum of Bishop spaces. Within the extension
of Bishop's informal system of constructive mathematics BISH with inductive
definitions with rules of countably many premises, we prove the fundamental
theorems on the direct and inverse limits of spectra of Bishop spaces and the
duality principle between them.
We give a formalization of Pratt's intuitive sculpting process for
higher-dimensional automata (HDA). Intuitively, an HDA is a sculpture if it can
be embedded in (i.e., sculpted from) a single higher-dimensional cell
(hypercube). A first important result of this paper is that not all HDA can be
sculpted, exemplified through several natural acyclic HDA, one being the famous
"broken box" example of van Glabbeek. Moreover, we show that even the natural
operation of unfolding is completely unrelated to sculpting, e.g., there are
sculptures whose unfoldings cannot be sculpted. We investigate the
expressiveness of sculptures, as a proper subclass of HDA, by showing them to
be equivalent to regular ST-structures (an event-based counterpart of HDA) and
to (regular) Chu spaces over 3 (in their concurrent interpretation given by
Pratt). We believe that our results shed new light on the intuitions behind
sculpting as a method of modeling concurrent behavior, showing the precise
reaches of its expressiveness. Besides expressiveness, we also develop an
algorithm to decide whether an HDA can be sculpted. More importantly, we show
that sculptures are equivalent to Euclidean cubical complexes (being the
geometrical counterpart of our combinatorial definition), which include the
popular PV models used for deadlock detection. This exposes a close connection
between geometric and combinatorial models for concurrency which may be of use
for both areas.
We develop an algebraic language theory based on the notion of an
Eilenberg--Moore algebra. In comparison to previous such frameworks, the main
contribution is the support for algebras with infinitely many sorts and the
connection to logic in the form of so-called `definable algebras'.
Continuing earlier work of the first author with U. Berger, K. Miyamoto and
H. Tsuiki, it is shown how a division algorithm for real numbers given as a
stream of signed digits can be extracted from an appropriate formal proof. The
property of being a real number represented as a stream is formulated by means
of coinductively defined predicates, and formal proofs involve coinduction. The
proof assistant Minlog is used to generate the formal proofs and extract their
computational content as terms of the underlying theory, a form of type theory
for finite or infinite data. Some experiments with running the extracted term
are described, after its translation to Haskell.
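To give a flavour of the data involved, here is a hedged Python sketch of the signed-digit representation (illustrative helper names; these are not Minlog's extracted terms, which the paper translates to Haskell): a real $x \in [-1,1]$ is an infinite stream of digits $d_i \in \{-1,0,1\}$ with $x = \sum_i d_i 2^{-(i+1)}$, and finite prefixes give arbitrarily good rational approximations.

```python
from fractions import Fraction
from itertools import islice

def constant(d):
    # The stream d, d, d, ...
    while True:
        yield d

def third():
    # 1/3 = 2^-2 + 2^-4 + ... : digits 0, 1, 0, 1, ...
    while True:
        yield 0
        yield 1

def negate(xs):
    # Negation is digitwise -- one payoff of signed digits over ordinary
    # binary expansions, where negation is not computable digit by digit.
    for d in xs:
        yield -d

def approx(xs, n):
    # Rational value of the first n digits: within 2^-n of the real x.
    return sum(Fraction(d, 2 ** (i + 1)) for i, d in enumerate(islice(xs, n)))

print(approx(constant(1), 3))      # 7/8, i.e. the prefix 1, 1, 1
print(approx(third(), 10))         # 341/1024, within 2^-10 of 1/3
print(approx(negate(third()), 4))  # -5/16
```

Division on such streams, the subject of the extracted algorithm, must produce output digits from finite input prefixes; the sketch only shows the representation on which it operates.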
In this paper, we study finitary 1-truncated higher inductive types (HITs) in
homotopy type theory. We start by showing that all these types can be
constructed from the groupoid quotient. We define an internal notion of
signatures for HITs, and for each signature, we construct a bicategory of
algebras in 1-types and in groupoids. We continue by proving initial algebra
semantics for our signatures. After that, we show that the groupoid quotient
induces a biadjunction between the bicategories of algebras in 1-types and in
groupoids. Then we construct a biinitial object in the bicategory of algebras
in groupoids, which gives the desired algebra. From all this, we conclude that
all finitary 1-truncated HITs can be constructed from the groupoid quotient.
We present several examples of HITs which are definable using our notion of
signature. In particular, we show that each signature gives rise to a HIT
corresponding to the freely generated algebraic structure over it. We also
start the development of universal algebra in 1-types. We show that the
bicategory of algebras has PIE limits, i.e. products, inserters and equifiers,
and we prove a version of the first isomorphism theorem for 1-types. Finally,
we give an alternative characterization of the fundamental groups of some
HITs, exploiting our construction of HITs via the groupoid quotient. All the
results are formalized over the UniMath library of univalent mathematics in
Coq.
We describe a type system with mixed linear and non-linear recursive types
called LNL-FPC (the linear/non-linear fixpoint calculus). The type system
supports linear typing, which enhances the safety properties of programs, and
also non-linear typing, which makes the type system more convenient for
programming. Just as in FPC, we show that LNL-FPC supports
type-level recursion, which in turn induces term-level recursion. We also
provide sound and computationally adequate categorical models for LNL-FPC that
describe the categorical structure of the substructural operations of
Intuitionistic Linear Logic at all non-linear types, including the recursive
ones. In order to do so, we describe a new technique for solving recursive
domain equations within cartesian categories by constructing the solutions over
pre-embeddings. The type system also enjoys implicit weakening and contraction
rules that we are able to model by identifying the canonical comonoid structure
of all non-linear types. We also show that the requirements of our abstract
model are reasonable by constructing a large class of concrete models that have
found applications not only in classical functional programming, but also in
emerging programming paradigms that incorporate linear types, such as quantum
programming and circuit description programming languages.
We study the problem of deciding the winner of reachability switching games
for zero-, one-, and two-player variants. Switching games provide a
deterministic analogue of stochastic games. We show that the zero-player case
is NL-hard, the one-player case is NP-complete, and that the two-player case is
PSPACE-hard and in EXPTIME. For the zero-player case, we also show P-hardness
for a succinctly-represented model that maintains the upper bound of NP $\cap$
coNP. For the one- and two-player cases, our results hold in both the natural,
explicit model and succinctly-represented model. Our results show that the
switching variant of a game is harder in complexity-theoretic terms than the
corresponding stochastic version.
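The deterministic analogue can be made tangible with a toy zero-player simulation (our own minimal encoding, not the paper's formal model): each node holds a switch over its ordered successor list, and on every visit the token follows the switch, which then advances round-robin, replacing the coin flips of a stochastic game.

```python
def reaches(successors, start, target, max_steps=10_000):
    # Zero-player switching game: purely deterministic evolution.
    switch = {v: 0 for v in successors}
    node = start
    for _ in range(max_steps):
        if node == target:
            return True
        succs = successors[node]
        if not succs:                    # dead end
            return False
        nxt = succs[switch[node] % len(succs)]
        switch[node] += 1                # advance this node's switch
        node = nxt
    return False  # crude cutoff for the sketch; configurations are finite

graph = {"s": ["a", "b"], "a": ["s"], "b": ["t"], "t": []}
print(reaches(graph, "s", "t"))  # True: the second visit to s switches to b
```

In the one- and two-player variants studied in the paper, players additionally control some of the nodes, which is where the NP-completeness and PSPACE-hardness arise.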
This paper extends a standard process algebra with a time-out operator,
thereby increasing its absolute expressiveness, while remaining within the
realm of untimed process algebra, in the sense that the progress of time is not
quantified. Trace and failures equivalence fail to be congruences for this
operator; their congruence closure is characterised as failure trace
equivalence.
We study the complexity of a range of propositional proof systems which allow
inference rules of the form: from a set of clauses $\Gamma$ derive the set of
clauses $\Gamma \cup \{ C \}$ where, due to some syntactic condition, $\Gamma
\cup \{ C \}$ is satisfiable if $\Gamma$ is, but where $\Gamma$ does not
necessarily imply $C$. These inference rules include BC, RAT, SPR and PR
(respectively short for blocked clauses, resolution asymmetric tautologies,
subset propagation redundancy and propagation redundancy), which arose from
work in satisfiability (SAT) solving. We introduce a new, more general rule SR
(substitution redundancy).
If the new clause $C$ is allowed to include new variables then the systems
based on these rules are all equivalent to extended resolution. We focus on
restricted systems that do not allow new variables. The systems with deletion,
where we can delete a clause from our set at any time, are denoted DBC${}^-$,
DRAT${}^-$, DSPR${}^-$, DPR${}^-$ and DSR${}^-$. The systems without deletion
are BC${}^-$, RAT${}^-$, SPR${}^-$, PR${}^-$ and SR${}^-$.
With deletion, we show that DRAT${}^-$, DSPR${}^-$ and DPR${}^-$ are
equivalent. By earlier work of Kiesl, Rebola-Pardo and Heule, they are also
equivalent to DBC${}^-$. Without deletion, we show that SPR${}^-$ can simulate
PR${}^-$ provided only short clauses are inferred by SPR inferences. We also
show that many of the well-known "hard" principles have small SPR${}^-$
refutations. These include the […]
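For intuition, the simplest of these rules can be sketched directly (a toy Python check using DIMACS-style integer literals, not the authors' implementation): a clause $C$ is blocked in a CNF $F$ on a literal $l \in C$ if every resolvent of $C$ with a clause of $F$ containing $-l$ is a tautology. Adding a blocked clause preserves satisfiability even though $C$ need not be implied by $F$, which is the pattern shared by all the rules above.

```python
def is_tautology(clause):
    # A clause is a tautology if it contains a complementary pair.
    return any(-lit in clause for lit in clause)

def is_blocked(C, F):
    # C is blocked on l if all resolvents on l with clauses of F are
    # tautologies (vacuously so if no clause of F contains -l).
    for l in C:
        partners = [D for D in F if -l in D]
        if all(is_tautology((set(C) - {l}) | (set(D) - {-l}))
               for D in partners):
            return True
    return False

print(is_blocked([1, 2], [[-1, -2], [2, 3]]))  # True: resolving on 1 gives {2, -2}
print(is_blocked([1, 3], [[-1, 2], [-3, 4]]))  # False: no literal of C is blocking
```

RAT, SPR, PR and the new SR rule generalize this syntactic condition in increasingly permissive ways while still preserving satisfiability.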
We introduce a new class of Parametric Timed Automata (PTAs) where we allow
clocks to be compared to parameters in guards, as in classic PTAs, but also to
be updated to parameters. We focus here on the EF-emptiness problem: "is the
set of parameter valuations for which some given location is reachable in the
instantiated timed automaton empty?". This problem is well-known to be
undecidable for PTAs, and so it is for our extension. Nonetheless, if we update
all clocks each time we compare a clock with a parameter and each time we
update a clock to a parameter, we obtain a syntactic subclass for which we can
decide the EF-emptiness problem and even perform the exact synthesis of the set
of rational valuations such that a given location is reachable. To the best of
our knowledge, this is the first non-trivial subclass of PTAs, actually even
extended with parametric updates, for which this is possible.
Partially ordered automata are automata where the transition relation induces
a partial order on states. The expressive power of partially ordered automata
is closely related to the expressivity of fragments of first-order logic on
finite words or, equivalently, to the language classes of the levels of the
Straubing-Thérien hierarchy. Several fragments (levels) have been intensively
investigated under various names. For instance, the fragment of first-order
formulae with a single existential block of quantifiers in prenex normal form
corresponds to the piecewise testable languages, also known as $J$-trivial languages. These
languages are characterized by confluent partially ordered DFAs or by complete,
confluent, and self-loop-deterministic partially ordered NFAs (ptNFAs for
short). In this paper, we study the complexity of basic questions for several
types of partially ordered automata on finite words; namely, the questions of
inclusion, equivalence, and ($k$-)piecewise testability. The lower-bound
complexity boils down to the complexity of universality. The universality
problem asks whether a system recognizes all words over its alphabet. For
ptNFAs, the complexity of universality decreases if the alphabet is fixed, but
it was open whether this holds if the alphabet may grow with the number of
states. We show that
deciding universality for general ptNFAs is as hard as for general NFAs. Our
proof is a novel and nontrivial extension of our recent construction for
self-loop-deterministic partially ordered NFAs, a […]
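For intuition about the language class at stake (a toy sketch, unrelated to the ptNFA constructions in the paper): piecewise testable languages are Boolean combinations of scattered-subword tests, so membership depends only on which subwords up to a fixed length occur in a word.

```python
def has_subword(word, pattern):
    # Does `pattern` occur in `word` as a scattered subword?  The greedy
    # left-to-right scan is correct: `ch in it` consumes the iterator up to
    # and including the first occurrence of ch.
    it = iter(word)
    return all(ch in it for ch in pattern)

# A piecewise testable language: "contains 'ab' but not 'ba' as a subword".
def member(word):
    return has_subword(word, "ab") and not has_subword(word, "ba")

print(member("aab"))  # True
print(member("aba"))  # False: 'ba' occurs as a subword
```

Deciding whether a given automaton recognizes such a language ($k$-piecewise testability) is one of the problems whose complexity the paper settles.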
This paper presents two decidability results on the validity checking problem
for entailments of symbolic heaps in separation logic with Presburger
arithmetic and arrays. The first result is for a system with arrays and
existential quantifiers. The correctness of the decision procedure is proved
under the condition that sizes of arrays in the succedent are not existentially
quantified. This condition is different from that proposed by Brotherston et
al. in 2017, and neither condition implies the other. The main idea is a novel
translation from an entailment of symbolic heaps into a formula in Presburger
arithmetic. The second result is the decidability for a system with both arrays
and lists. The key idea is to extend the unroll collapse technique proposed by
Berdine et al. in 2005 to arrays and arithmetic, as well as to doubly-linked lists.
We give a number of formal proofs of theorems from the field of computable
analysis. Many of our results specify executable algorithms that work on
infinite inputs by means of operating on finite approximations and are proven
correct in the sense of computable analysis. The development is done in the
proof assistant Coq and heavily relies on the Incone library for
information-theoretic continuity. This library is developed by one of the
authors, and since the paper describes many of its most important features in
detail, it can also serve as an introduction to the library. While the ability
to have full executability
in a formal development of mathematical statements about real numbers and the
like is not a feature that is unique to the Incone library, its original
contribution is to adhere to the conventions of computable analysis to provide
a general purpose interface for algorithmic reasoning on continuous structures.
The results that provide complete computational content include that the
algebraic operations and the efficient limit operator on the reals are
computable, that certain countably infinite products are isomorphic to spaces
of functions, compatibility of the enumeration representation of subsets of
natural numbers with the abstract definition of the space of open subsets of
the natural numbers, and that continuous realizability implies sequential
continuity. We also formalize proofs of non-computational results that support
the correctness of our definitions. These […]
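One of the listed results can be miniaturized as follows (an informal Python sketch of the underlying idea, not Incone's Coq formalization; the helper names are invented): a real is "named" by a function answering precision queries with rationals, and the efficient limit of a fast-converging sequence of rationals $q_n$ with $|x - q_n| \le 2^{-n}$ is realized simply by forwarding the query.

```python
from fractions import Fraction

def limit(fast_cauchy):
    # fast_cauchy : n -> rational q_n with |x - q_n| <= 2^-n.
    # The returned function is a "name" of x: it answers a request for
    # precision 2^-k, so taking the limit costs nothing computationally.
    def name(k):
        return fast_cauchy(k)
    return name

# Approximations of 2/3 obtained by rounding to n binary digits.
two_thirds = limit(lambda n: Fraction(round(Fraction(2, 3) * 2 ** n), 2 ** n))
print(two_thirds(5))  # 21/32, within 2^-5 of 2/3
```

The formal development, of course, proves such operators correct against the abstract representations of computable analysis rather than trusting a convergence annotation.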
We present a device for specifying and reasoning about syntax for datatypes,
programming languages, and logic calculi. More precisely, we study a notion of
"signature" for specifying syntactic constructions.
In the spirit of Initial Semantics, we define the "syntax generated by a
signature" to be the initial object -- if it exists -- in a suitable category
of models. In our framework, the existence of an associated syntax to a
signature is not automatically guaranteed. We identify, via the notion of
presentation of a signature, a large class of signatures that do generate a
syntax.
Our (presentable) signatures subsume classical algebraic signatures (i.e.,
signatures for languages with variable binding, such as the pure lambda
calculus) and extend them to include several other significant examples of
syntactic constructions.
One key feature of our notions of signature, syntax, and presentation is that
they are highly compositional, in the sense that complex examples can be
obtained by gluing simpler ones. Moreover, through the Initial Semantics
approach, our framework provides, beyond the desired algebra of terms, a
well-behaved substitution and the induction and recursion principles associated
to the syntax.
This paper builds upon ideas from a previous attempt by Hirschowitz-Maggesi,
which, in turn, was directly inspired by some earlier work of
Ghani-Uustalu-Hamana and Matthes-Uustalu.
The main results presented in the paper are computer-checked […]
We show that it is decidable whether or not a relation on the reals definable
in the structure $\langle \mathbb{R}, +,<, \mathbb{Z} \rangle$ can be defined
in the structure $\langle \mathbb{R}, +,<, 1 \rangle$. This result is achieved
by obtaining a topological characterization of $\langle \mathbb{R}, +,<, 1
\rangle$-definable relations in the family of $\langle \mathbb{R}, +,<,
\mathbb{Z} \rangle$-definable relations and then by following Muchnik's
approach of showing that the characterization of the relation $X$ can be
expressed in the logic of $\langle \mathbb{R}, +,<,1, X \rangle$.
The above characterization allows us to prove that there is no intermediate
structure between $\langle \mathbb{R}, +,<, \mathbb{Z} \rangle$ and $\langle
\mathbb{R}, +,<, 1 \rangle$. We also show that a $\langle \mathbb{R}, +,<,
\mathbb{Z} \rangle$-definable relation is $\langle \mathbb{R}, +,<, 1
\rangle$-definable if and only if its intersection with every $\langle
\mathbb{R}, +,<, 1 \rangle$-definable line is $\langle \mathbb{R}, +,<, 1
\rangle$-definable. This gives a noneffective but simple characterization of
$\langle \mathbb{R}, +,<, 1 \rangle$-definable relations.
We develop an interface-modeling framework for quality and resource
management that captures configurable working points of hardware and software
components in terms of functionality, resource usage and provision, and quality
indicators such as performance and energy consumption. We base these aspects on
partially-ordered sets to capture quality levels, budget sizes, and functional
compatibility. This makes the framework widely applicable and domain
independent (although we aim for embedded and cyber-physical systems). The
framework paves the way for dynamic (re-)configuration and multi-objective
optimization of component-based systems for quality- and resource-management
purposes.
The recursive path ordering is an established and crucial tool in term
rewriting to prove termination. We revisit its presentation by means of some
simple rules on trees (or corresponding terms) equipped with a 'star' as
control symbol, signifying a command to make that tree (or term) smaller in the
order being defined. This leads to star games that are very convenient for
proving termination of many rewriting tasks. For instance, already using the
simplest star game on finite unlabeled trees, we obtain a very direct proof of
termination of the famous Hydra battle, direct in the sense that it avoids the
usual recourse to ordinals. We also include an alternative road to setting
up the star games, using a proof method of Buchholz, adapted by van Oostrom,
resulting in a quantitative version of the star as control symbol. We conclude
with a number of questions and future research directions.
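The Hydra battle itself is easy to simulate (an illustrative Python sketch, not the paper's star-game machinery), with hydras as finite unlabeled trees encoded as nested lists: a node is the list of its children, and a head is a leaf. At stage $n$, Hercules chops a head; if it was not attached to the root, the grandparent grows $n$ copies of the mutilated parent subtree. The battle always terminates, though the hydra can grow enormously along the way.

```python
import copy

def chop_leftmost(root, n):
    # Walk to the leftmost head, recording the path from the root.
    path = [root]
    while path[-1]:
        path.append(path[-1][0])
    parent = path[-2]
    parent.pop(0)                        # Hercules chops the head
    if len(path) >= 3:                   # head was not attached to the root:
        grandparent = path[-3]           # the hydra grows n copies of the
        for _ in range(n):               # mutilated parent subtree
            grandparent.append(copy.deepcopy(parent))

def battle_length(tree, max_stages=100_000):
    # Number of stages until the hydra has no heads left.
    tree = copy.deepcopy(tree)
    stage = 1
    while tree:
        chop_leftmost(tree, stage)
        stage += 1
        if stage > max_stages:
            raise RuntimeError("safety bound exceeded")
    return stage - 1

print(battle_length([[[]]]))  # 3: one regrowth, then two plain chops
```

No fixed monotonic size measure on trees proves this loop terminates, which is exactly why tools like recursive path orderings, or the star games of the paper, are needed.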
We present a Kleene realizability semantics for the intensional level of the
Minimalist Foundation, mtt for short, extended with inductively generated
formal topologies, Church's thesis, and the axiom of choice. This semantics is an
extension of the one used to show consistency of the intensional level of the
Minimalist Foundation with the axiom of choice and formal Church's thesis in
previous work. A main novelty here is that such a semantics is formalized in a
constructive theory represented by Aczel's constructive set theory CZF extended
with the regular extension axiom.
We introduce a notion of strong proximity join-semilattice, a predicative
notion of continuous lattice which arises as the Karoubi envelope of the
category of algebraic lattices. Strong proximity join-semilattices can be
characterised by the coalgebras of the lower powerlocale on the wider category
of proximity posets (also known as abstract bases or R-structures). Moreover,
locally compact locales can be characterised in terms of strong proximity
join-semilattices by the coalgebras of the double powerlocale on the category
of proximity posets. We also provide a more logical characterisation of a strong
proximity join-semilattice, called a strong continuous finitary cover, which
uses an entailment relation to present the underlying join-semilattice. We show
that this structure naturally corresponds to the notion of continuous lattice
in the predicative point-free topology. Our result makes the predicative and
finitary aspect of the notion of continuous lattice in point-free topology more
explicit.
A number of categories are presented that are algebraically complete and
cocomplete, i.e., every endofunctor has an initial algebra and a terminal
coalgebra. For all finitary (and, more generally, all precontinuous) set
functors the initial algebra and terminal coalgebra are proved to carry a
canonical partial order with the same ideal CPO-completion. And they also both
carry a canonical ultrametric with the same Cauchy completion.
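A toy illustration of the objects involved (plain Python sets, not the paper's CPO or metric structure): for a finitary set functor, the initial algebra arises as the colimit of the chain $0 \to F0 \to FF0 \to \dots$. For $F(X) = 1 + X$ this chain builds the natural numbers in successor notation.

```python
def F(xs):
    # F(X) = 1 + X: one fresh element (coded as the empty tuple) plus a
    # marked copy of every element of X.
    return {()} | {("s", x) for x in xs}

stage = set()          # the initial object 0, i.e. the empty set
for _ in range(4):
    stage = F(stage)   # 0 -> F0 -> FF0 -> FFF0 -> FFFF0
print(len(stage))      # 4 elements: 0, s0, ss0, sss0
```

The initial algebra is the union of all such stages; the paper's contribution is the canonical order and ultrametric that this carrier, and the terminal coalgebra, additionally carry.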
We uncover privacy vulnerabilities in the ICAO 9303 standard implemented by
ePassports worldwide. These vulnerabilities, confirmed by ICAO, enable an
ePassport holder who recently passed through a checkpoint to be reidentified
without opening their ePassport. This paper explains how bisimilarity was used
to discover these vulnerabilities, which exploit the BAC protocol -- the
original ICAO 9303 standard ePassport authentication protocol -- and remain
valid for the PACE protocol, which improves on the security of BAC in the
latest ICAO 9303 standards. In order to tackle such bisimilarity problems, we
develop here a chain of methods for the applied $\pi$-calculus including a
symbolic under-approximation of bisimilarity, called open bisimilarity, and a
modal logic, called classical FM, for describing and certifying attacks.
Evidence is provided to argue for a new scheme for specifying such
unlinkability problems that more accurately reflects the capabilities of an
attacker.
We extend the well-known Church encoding of Boolean logic into
$\lambda$-calculus to an encoding of McCarthy's $3$-valued logic into a
suitable infinitary extension of $\lambda$-calculus that identifies all
unsolvables by $\bot$, where $\bot$ is a fresh constant. This encoding refines
to $n$-valued logic for $n\in\{4,5\}$. Such encodings also exist for Church's
original $\lambda\mathbf{I}$-calculus. By way of motivation we consider
Russell's paradox, exploiting the fact that the same encoding also allows us to
calculate truth values of infinite closed propositions in this infinitary
setting.
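The classical Church encoding that serves as the starting point can be sketched in Python, with two caveats: we make the boolean's branches lazy thunks to mimic left-sequential evaluation, and a raising thunk is only a crude stand-in for the unsolvable terms that the paper identifies with $\bot$.

```python
# Church booleans as two-argument selectors over thunks.
TRUE = lambda t: lambda f: t()
FALSE = lambda t: lambda f: f()

def AND(p, q):
    # McCarthy-style left-sequential conjunction: p is a Church boolean,
    # q is a *thunk* returning one, forced only when p is TRUE.  Hence
    # "FALSE and bottom" is FALSE even when q would diverge.
    return p(q)(lambda: FALSE)

def NOT(p):
    return p(lambda: FALSE)(lambda: TRUE)

def to_bool(p):
    # Decode back to a native boolean for inspection.
    return p(lambda: True)(lambda: False)

def bottom():
    raise RuntimeError("diverges")   # stand-in for an unsolvable term

print(to_bool(AND(FALSE, bottom)))        # False: bottom is never forced
print(to_bool(AND(TRUE, lambda: FALSE)))  # False
print(to_bool(NOT(FALSE)))                # True
```

In the paper's infinitary setting no such encoding trick is needed: the third truth value simply *is* the class of unsolvables, collapsed to the constant $\bot$.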