Notes:
This journal paper originally appeared as the technical report [KKNP08c].
The original publication is available at link.springer.com.
|
Abstract.
In the field of model checking, abstraction refinement has proved to be an extremely
successful methodology for combating the state-space explosion problem. However, little
practical progress has been made in the setting of probabilistic verification. In this
paper we present a novel abstraction-refinement framework for
Markov decision processes (MDPs), which are widely used for modelling and verifying systems that exhibit both
probabilistic and nondeterministic behaviour. Our framework comprises an abstraction
approach based on stochastic two-player games, two refinement methods, and an efficient
algorithm for an abstraction-refinement loop. The key idea behind the abstraction
approach is to maintain a separation between nondeterminism present in the original MDP
and nondeterminism introduced during the abstraction process, each type being represented
by a different player in the game. Crucially, this separation allows lower and upper
bounds to be computed for the values of reachability properties of the MDP.
These bounds give a quantitative measure of the quality of the abstraction
and form the basis of the corresponding refinement methods. We describe a prototype
implementation of our framework and present experimental results demonstrating automatic
generation of compact, yet precise, abstractions for a large selection of real-world case
studies.
|
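
To give a concrete flavour of the computation the abstract describes, the following is a toy sketch (not the paper's algorithm or tool implementation) of value iteration on a small stochastic two-player game. The game structure, state names, and probabilities are invented for illustration. Player 1 resolves the nondeterminism introduced by abstraction; player 2 resolves the nondeterminism of the original MDP. Assuming a maximum-reachability query, player 2 maximises throughout, and letting player 1 minimise or maximise yields the lower and upper bounds, respectively.

```python
# Toy sketch: lower/upper bounds on maximum reachability probability in a
# stochastic two-player game abstraction. All names and numbers are invented.
#
# game[s] = list of player-1 choices; each player-1 choice is a list of
# player-2 options; each player-2 option is a distribution {successor: prob}.
game = {
    "a0": [
        [{"a1": 0.5, "a2": 0.5}],               # p1 choice with one p2 option
        [{"a1": 0.9, "a2": 0.1}, {"a2": 1.0}],  # p1 choice with two p2 options
    ],
    "a1": [[{"goal": 1.0}]],
    "a2": [[{"a0": 0.5, "fail": 0.5}]],
    "goal": [],  # absorbing target
    "fail": [],  # absorbing non-target
}

def bounds(game, target, iters=200):
    """Value iteration computing [lower, upper] bounds on the maximum
    probability of reaching `target`, per state."""
    lo = {s: 0.0 for s in game}
    hi = {s: 0.0 for s in game}
    lo[target] = hi[target] = 1.0

    def expect(v, dist):
        return sum(p * v[t] for t, p in dist.items())

    for _ in range(iters):
        for s, p1_choices in game.items():
            if s == target or not p1_choices:
                continue  # absorbing states keep their values
            # player 2 maximises (maximum-reachability query);
            # player 1 minimises for the lower bound, maximises for the upper.
            lo[s] = min(max(expect(lo, d) for d in c) for c in p1_choices)
            hi[s] = max(max(expect(hi, d) for d in c) for c in p1_choices)
    return lo, hi

lo, hi = bounds(game, "goal")
```

For this toy game the bounds at the initial state converge to lo["a0"] = 2/3 and hi["a0"] = 18/19, so the true maximum reachability probability of the underlying MDP lies in [0.667, 0.948]; a wide gap like this would trigger refinement of the abstraction.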