License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ICALP.2020.3
URN: urn:nbn:de:0030-drops-124103

Stefan Kiefer, Richard Mayr, Mahsa Shirmohammadi, Patrick Totzke, Dominik Wojtczak

How to Play in Infinite MDPs (Invited Talk)



Markov decision processes (MDPs) are a standard model for dynamic systems that exhibit both stochastic and nondeterministic behavior. For MDPs with finite state space it is known that, for a wide range of objectives, there exist optimal strategies that are memoryless and deterministic. In contrast, if the state space is infinite, optimal strategies may not exist, and optimal or ε-optimal strategies may require (possibly infinite) memory. In this paper we consider qualitative objectives: reachability, safety, (co-)Büchi, and other parity objectives. We aim to give an introduction to a collection of techniques that allow for the construction of strategies with little or no memory in countably infinite MDPs.
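The notions above can be made concrete with a small toy example (hypothetical, not taken from the paper): a countably infinite MDP over the states 0, 1, 2, ..., a reachability objective "reach state 0", and a memoryless deterministic (MD) strategy, i.e., a map from states to actions that ignores the history. The action names and transition probabilities below are illustrative assumptions.

```python
import random

def step(state, action, rng):
    """One transition of a (hypothetical) countably infinite MDP over 0, 1, 2, ...
    In state n >= 1, "cautious" surely moves to n-1; "bold" moves to n-1
    with probability 2/3 and to n+1 otherwise."""
    if action == "cautious":
        return state - 1
    return state - 1 if rng.random() < 2 / 3 else state + 1

def md_strategy(state):
    """A memoryless deterministic strategy: the chosen action depends only
    on the current state, never on the history."""
    return "cautious"

def simulate(start, strategy, rng, horizon=10_000):
    """Run one play and report whether the reachability objective
    (visiting state 0) is attained within the horizon."""
    state = start
    for _ in range(horizon):
        if state == 0:
            return True
        state = step(state, strategy(state), rng)
    return False

rng = random.Random(0)
runs = [simulate(5, md_strategy, rng) for _ in range(1000)]
print(sum(runs) / len(runs))  # the cautious MD strategy reaches 0 surely -> 1.0
```

Here the cautious MD strategy is optimal for the reachability objective; the interesting phenomena discussed in the paper arise for objectives and infinite MDPs where no such memoryless optimal strategy exists.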

BibTeX Entry

@InProceedings{kiefer_et_al:LIPIcs.ICALP.2020.3,
  author =	{Stefan Kiefer and Richard Mayr and Mahsa Shirmohammadi and Patrick Totzke and Dominik Wojtczak},
  title =	{{How to Play in Infinite MDPs (Invited Talk)}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{3:1--3:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Artur Czumaj and Anuj Dawar and Emanuela Merelli},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{},
  URN =		{urn:nbn:de:0030-drops-124103},
  doi =		{10.4230/LIPIcs.ICALP.2020.3},
  annote =	{Keywords: Markov decision processes}
}

Keywords: Markov decision processes
Collection: 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)
Issue Date: 2020
Date of publication: 29.06.2020
