Author Guo, Xianping.

Title Continuous-time Markov decision processes : theory and applications / Xianping Guo, Onésimo Hernández-Lerma
Published Berlin ; Heidelberg : Springer-Verlag, ©2009

Description 1 online resource
Series Stochastic Modelling and Applied Probability ; 62
Contents Introduction and Summary -- Continuous-Time Markov Decision Processes -- Average Optimality for Finite Models -- Discount Optimality for Nonnegative Costs -- Average Optimality for Nonnegative Costs -- Discount Optimality for Unbounded Rewards -- Average Optimality for Unbounded Rewards -- Average Optimality for Pathwise Rewards -- Advanced Optimality Criteria -- Variance Minimization -- Constrained Optimality for Discount Criteria -- Constrained Optimality for Average Criteria
Summary Continuous-time Markov decision processes (MDPs), also known as controlled Markov chains, are used for modeling decision-making problems that arise in operations research (for instance, inventory, manufacturing, and queueing systems), computer science, communications engineering, control of populations (such as fisheries and epidemics), and management science, among many other fields. This volume provides a unified, systematic, self-contained presentation of recent developments on the theory and applications of continuous-time MDPs. The MDPs in this volume include most of the cases that arise in applications, because they allow unbounded transition and reward/cost rates. Much of the material appears for the first time in book form
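To make the summary concrete, here is a minimal sketch that is not taken from the book: a bounded-rate continuous-time MDP for admission control in a finite single-server queue, solved for the discounted-cost criterion by uniformization followed by value iteration. All names and parameters (N, arrival_rate, service_rate, alpha, hold_cost, reject_cost) are illustrative assumptions; the book's models additionally allow unbounded transition and reward/cost rates, which this toy example does not handle.

```python
# Toy continuous-time MDP (admission control in an M/M/1/N queue),
# discounted cost, solved via uniformization + value iteration.
# All parameters below are illustrative assumptions, not from the book.
import numpy as np

N = 10                 # buffer size: states 0..N customers in system
arrival_rate = 1.0     # customer arrival rate
service_rate = 1.2     # service completion rate
alpha = 0.1            # continuous-time discount rate
hold_cost = 1.0        # holding-cost rate per customer in system
reject_cost = 5.0      # cost charged per rejected arrival

actions = [0, 1]                              # 0 = reject arrivals, 1 = admit
Lambda = arrival_rate + service_rate          # uniformization constant >= total exit rate
beta = Lambda / (alpha + Lambda)              # equivalent discrete-time discount factor

def cost_rate(i, a):
    # holding cost plus expected rejection-cost rate when arrivals are turned away
    return hold_cost * i + (reject_cost * arrival_rate if (a == 0 or i == N) else 0.0)

def trans_prob(i, a):
    # uniformized one-step transition probabilities out of state i under action a
    p = np.zeros(N + 1)
    up = arrival_rate if (a == 1 and i < N) else 0.0   # admitted arrival
    down = service_rate if i > 0 else 0.0              # service completion
    if i < N:
        p[i + 1] = up / Lambda
    if i > 0:
        p[i - 1] = down / Lambda
    p[i] = 1.0 - (up + down) / Lambda                  # fictitious self-transition
    return p

# Value iteration on the uniformized discrete-time MDP
V = np.zeros(N + 1)
for _ in range(2000):
    V_new = np.empty_like(V)
    for i in range(N + 1):
        # minimize per-stage cost plus discounted expected continuation cost
        V_new[i] = min(cost_rate(i, a) / (alpha + Lambda)
                       + beta * trans_prob(i, a) @ V for a in actions)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = [min(actions, key=lambda a: cost_rate(i, a) / (alpha + Lambda)
              + beta * trans_prob(i, a) @ V) for i in range(N + 1)]
print("discounted values:", np.round(V, 3))
print("admit (1) / reject (0) by state:", policy)
```

Uniformization works here only because the transition rates are bounded by Lambda; much of the book is devoted to optimality results that do not rely on such a bound.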
Bibliography Includes bibliographical references and index
Notes English
Print version record
Subject Markov processes.
Distribution (Probability theory)
Markov Chains
distribution (statistics-related concept)
MATHEMATICS -- Probability & Statistics -- Stochastic Processes.
Form Electronic book
Author Hernández-Lerma, O. (Onésimo)
ISBN 9783642025471
3642025471
3642025463
9783642025464
1282363050
9781282363052
9786612363054
6612363053