Programming languages suitable for task-level specifications of robot behavior have been investigated and used for many years. A prominent example is the Golog family of languages, which offers the ability to combine the familiar constructs from logic programming with non-determinism, thus allowing a user to flexibly mix prescribing exactly what a robot should do with letting the robot choose the right course of action to fulfill a given task. Such flexibility is needed in particular in unstructured environments, where not all eventualities can be foreseen by the programmer. As the deployment of robots, especially in environments shared by humans, is usually costly, complex, and often safety-critical, verifying beforehand that a robot program satisfies certain (temporal) properties during execution is highly desirable. The problem is compounded by the fact that a robot's actuators and sensors are inherently noisy and thus fraught with uncertainty. While this means that absolute guarantees for a desired behavior usually cannot be given, it is nevertheless of significant interest to verify that desired properties hold with at least a certain likelihood.
This dissertation project aims to develop a decision-theoretic variant of Golog under partial observability and to investigate the verification problem for this language. Golog is chosen because it is among the most expressive languages available, equipped with a rigorous, logic-based semantics, and arguably the best understood following the seminal work by Reiter. Since Golog is based on first-order logic, verification is undecidable in general. Recent work has identified useful fragments where the problem becomes decidable, leveraging existing work on model checking. Golog was previously extended to account explicitly for stochastic actions, enabling a form of decision-theoretic planning under full observability as part of a Golog program. Work is currently under way to consider verification for such programs. However, the important aspect of partial observability, which necessitates an explicit account of (noisy) sensors, has not yet been considered.
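To illustrate what an explicit account of noisy sensors involves, the following is a minimal sketch of a Bayesian belief update over a finite state space, the standard mechanism for partial observability. The state names, sensor accuracy, and prior are invented for illustration and are not part of the proposed language.

```python
def update_belief(belief, likelihood):
    """Condition a belief (dict: state -> probability) on a sensor
    reading, given the likelihood of that reading in each state."""
    posterior = {s: belief[s] * likelihood[s] for s in belief}
    norm = sum(posterior.values())
    return {s: p / norm for s, p in posterior.items()}

# The robot is initially unsure whether a door is open or closed.
belief = {"open": 0.5, "closed": 0.5}

# A noisy door sensor reports "open"; assume it is correct 90% of
# the time, so the reading is 9x more likely if the door is open.
likelihood = {"open": 0.9, "closed": 0.1}

belief = update_belief(belief, likelihood)
# The posterior probability of "open" is 0.45 / (0.45 + 0.05) = 0.9.
```

A decision-theoretic Golog variant with partial observability would, in effect, maintain such a belief state (over situations rather than atomic states) and let sensing actions condition it in this way.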
Specific goals of this dissertation project are:
- Develop a decision-theoretic Golog variant with partial observability based on existing work.
- Define the verification problem for probabilistic temporal properties specified in languages such as PCTL or PCTL*. Some progress has recently been made in this direction for LTL and CTL* properties; however, noisy sensors and actions with probabilistic effects have not been addressed so far.
- Identify decidable fragments.
- Develop and implement verification algorithms based on probabilistic model checking techniques.
- Test and evaluate the methodology using example scenarios for service robots in home environments, in collaboration with the robotics lab.
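The core computation behind the probabilistic model checking techniques mentioned above is illustrated by the following sketch: computing the probability of eventually reaching a goal state in a discrete-time Markov chain, which underlies PCTL reachability properties of the form P>=p [ F goal ]. The toy chain (a robot repeatedly attempting a grasp) is an invented example, not taken from the project.

```python
def reach_prob(transitions, goal, iters=1000):
    """Value iteration for Pr(F goal) in a discrete-time Markov chain.
    transitions: dict mapping each state to a list of
    (successor, probability) pairs."""
    p = {s: 1.0 if s == goal else 0.0 for s in transitions}
    for _ in range(iters):
        p = {s: 1.0 if s == goal
                 else sum(prob * p[t] for t, prob in succ)
             for s, succ in transitions.items()}
    return p

# Toy chain: a grasp succeeds with probability 0.8; on failure the
# object is dropped, from where the robot recovers with probability
# 0.5 or gets stuck.
dtmc = {
    "start":   [("grasped", 0.8), ("dropped", 0.2)],
    "dropped": [("start", 0.5), ("stuck", 0.5)],
    "stuck":   [("stuck", 1.0)],
    "grasped": [("grasped", 1.0)],
}

probs = reach_prob(dtmc, "grasped")
# probs["start"] converges to 8/9, so e.g. P>=0.85 [ F grasped ]
# holds in "start", while P>=0.9 [ F grasped ] does not.
```

Tools such as PRISM or Storm implement this computation (and its extensions to MDPs and full PCTL*) at scale; the envisaged verification algorithms would reduce the Golog verification problem to such computations on a suitable finite abstraction.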
(This dissertation project is related to the projects VE3, VE4, VE5, and LO4.)