Joshua Moerman


Email: joshua at cs.rwth-aachen.de
Address: Room 4210, Ahornstraße 55, D-52074 Aachen
Phone: +49 241 80 21203

I am Joshua Moerman, a computer scientist. Since April 2019, I have been a postdoc at RWTH Aachen in the MOVES group, where I work on probabilistic programs as a member of the FRAPPANT project.

Research

Before that, I was a PhD student in the Software Science department at Radboud University in Nijmegen. Under the supervision of Frits Vaandrager, Bas Terwijn, and Alexandra Silva, I worked on nominal techniques and black-box testing for automata learning.

My research interests also include formal verification, PAC learning, coalgebraic methods, functional programming, black-box testing theory, and category theory.

You can also find me on DBLP and Google Scholar, or visit my homepage, where you can also find my CV.

Bachelor or Master Theses

If you are looking to do a thesis in our group, I would be happy to supervise you. Here are some ideas for theses; other topics are of course also possible. Just come by or send an email and we can talk about it!

  • Learning parametric Markov chains. Imagine there is some parametric Markov chain, but we have no description of it. Can we infer its structure just by looking at its behaviour? Theoretically, this should be learnable to some extent, but there are no implementations. Topics: theory of weighted automata, probability theory, automata learning, a bit of implementation. Some preliminary (and vague) ideas can be found on my blog.
  • Duality in automata learning. Duality is ubiquitous in computer science. It gives a correspondence between state transformers and predicate transformers, between forward and backward reasoning in Bayesian networks, between modal logic and state spaces, and so on. Can we leverage this for automata learning? Topics: automata learning, duality theory, a bit of implementation.
  • Quantitative separation logic with presheaves. This is a harder topic, but I want to give a categorical semantics for QSL. Presheaves have been used for abstract semantics of “contextual computations” – where a context can be a heap for example. Can this be combined with probability theory? Topics: category theory, probability theory, semantics.
  • Learning MDPs with SMT solving. Can we infer the structure of a Markov decision process (up to equivalence) by only observing its behaviour? Topics: probability theory, SMT solving, implementation.
  • What is the probability of finding a counterexample? Black-box testing of a finite state machine is the problem of checking equivalence between a specification and an implementation by tests only. Surprisingly, this can be done in a sound and complete way (under certain assumptions), making it a formal method. Can we give quantitative measures to compare different test generation methods? Topics: black-box testing, state machines, probability theory, implementation. A small testing sketch follows this list.
  • Gradient descent for parametric Markov chains. Can gradient descent (or another iterative method) be useful for parameter synthesis? I think so, but we will have to implement it and benchmark it. Topics: Markov chains, model checking, implementation. This could be supervised together with Jip Spel. I have written some preliminary (and vague) theory on my blog. A second sketch below illustrates the idea.
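
To make the black-box testing topic a bit more concrete, here is a minimal sketch of the basic testing loop. Everything in it (the machines, the state names, the helper find_counterexample) is invented for illustration. Plain random testing like this is not the sound-and-complete construction mentioned above (methods such as the W-method need more structure); it only shows how a specification and an implementation are compared through their input/output behaviour alone.

    import random

    # Specification Mealy machine: (state, input) -> (output, next state).
    SPEC = {
        ("q0", "a"): ("0", "q1"),
        ("q0", "b"): ("0", "q0"),
        ("q1", "a"): ("1", "q0"),
        ("q1", "b"): ("0", "q1"),
    }
    INPUTS = ["a", "b"]

    def run(machine, word, state="q0"):
        # Feed the input word to the machine and collect the produced outputs.
        outputs = []
        for symbol in word:
            out, state = machine[(state, symbol)]
            outputs.append(out)
        return outputs

    def find_counterexample(impl, tests=1000, max_len=10):
        # Random testing: look for an input word on which the implementation's
        # outputs differ from the specification's. Returns None if none is found.
        for _ in range(tests):
            word = [random.choice(INPUTS) for _ in range(random.randint(1, max_len))]
            if run(SPEC, word) != run(impl, word):
                return word
        return None

    # A hypothetical faulty implementation: one transition has the wrong output.
    IMPL = dict(SPEC)
    IMPL[("q1", "b")] = ("1", "q1")
    print(find_counterexample(IMPL))  # e.g. ['a', 'b']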
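
Similarly, here is a minimal sketch of the gradient-descent idea, under strong simplifying assumptions: a toy Markov chain with a single parameter p, a closed-form reachability probability, and a numerical gradient. The chain, the target value 0.8, the step size, and the names reach_prob, loss, and synthesize are all made up for this illustration; a real tool would obtain reachability probabilities (and their derivatives) from a probabilistic model checker rather than from a closed form.

    def reach_prob(p):
        # Toy parametric Markov chain:
        #   s0 --p--> goal,  s0 --(1-p)--> s1
        #   s1 --p--> s0,    s1 --(1-p)--> sink
        # The reachability probability x0 of the goal state satisfies
        #   x0 = p + (1-p) * p * x0,
        # which gives the closed form below.
        return p / (1.0 - p * (1.0 - p))

    def loss(p, target=0.8):
        # Parameter synthesis as optimisation: distance to the desired probability.
        return (reach_prob(p) - target) ** 2

    def synthesize(p=0.5, lr=0.1, steps=200):
        # Plain gradient descent with a central finite-difference gradient.
        eps = 1e-6
        for _ in range(steps):
            grad = (loss(p + eps) - loss(p - eps)) / (2 * eps)
            p = min(max(p - lr * grad, eps), 1.0 - eps)  # keep p a valid probability
        return p

    p = synthesize()
    print(p, reach_prob(p))  # p ends up near 0.61, reach_prob(p) near 0.8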

See also this list.