Discrete-Time Markov Chains [electronic resource] : Two-Time-Scale Methods and Applications / by G. George Yin, Qing Zhang.

By: Yin, G. George | Zhang, Qing
Material type: Text
Series: Stochastic Modelling and Applied Probability, Applications of Mathematics ; 55
Publisher: New York, NY : Springer New York, 2005
Description: XX, 347 p. online resource
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9780387268712
Subject(s):
Additional physical formats: Printed edition: No title
DDC classification:
  • 519.2 23
LOC classification:
  • QA273.A1-274.9
  • QA274-274.9
Online resources:
Contents:
Prologue and Preliminaries -- Introduction, Overview, and Examples -- Mathematical Preliminaries -- Asymptotic Properties -- Asymptotic Expansions -- Occupation Measures -- Exponential Bounds -- Interim Summary and Extensions -- Applications -- Stability of Dynamic Systems -- Filtering -- Markov Decision Processes -- LQ Controls -- Mean-Variance Controls -- Production Planning -- Stochastic Approximation.
In: Springer eBooks
Summary: Focusing on discrete-time two-time-scale Markov chains, the contents of this book are an outgrowth of some of the authors' recent research. The motivation stems from existing and emerging applications in optimization and control of complex hybrid Markovian systems in manufacturing, wireless communication, and financial engineering. Much of the book is devoted to designing system models arising from these applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational algorithms to reduce the inherent complexity. The book presents results on asymptotic expansions of probability vectors, structural properties of occupation measures, exponential bounds, aggregation and decomposition with the associated limit processes, and the interface of discrete-time and continuous-time systems. A salient feature is the diverse range of applications in filtering, estimation, control, optimization, Markov decision processes, and financial engineering. The book will be an important reference for researchers in applied probability, control theory, and operations research, as well as for practitioners who use optimization techniques. Parts of it can also be used in a graduate course on applied probability, stochastic processes, and their applications.
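The two-time-scale structure mentioned in the summary can be made concrete with a small numerical sketch. In this setting the transition matrix is typically taken in the singularly perturbed form P(ε) = P + εQ, where P is a block-diagonal stochastic matrix governing fast transitions within groups of states and Q is a generator (rows summing to zero) producing rare transitions between groups. The Python snippet below is illustrative only and not taken from the book; the matrices P and Q, the value of ε, and the step count are arbitrary choices meant to show the construction and a quick simulation of the resulting chain.

import numpy as np

# Two-time-scale discrete-time Markov chain (illustrative sketch):
#   P_eps = P + eps * Q
# P is block-diagonal (fast transitions within each block of states);
# eps * Q gives rare transitions between the blocks.

eps = 0.01

# P: two ergodic blocks; states {0, 1} and {2, 3} communicate quickly.
P = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.2, 0.8],
])

# Q: a generator (rows sum to zero) coupling the two blocks weakly.
Q = np.array([
    [-1.0,  0.0,  1.0,  0.0],
    [ 0.0, -1.0,  0.0,  1.0],
    [ 1.0,  0.0, -1.0,  0.0],
    [ 0.0,  1.0,  0.0, -1.0],
])

P_eps = P + eps * Q
assert np.allclose(P_eps.sum(axis=1), 1.0) and (P_eps >= 0).all()

# Simulate the chain and record the fraction of time spent in each block.
rng = np.random.default_rng(0)
n_steps = 200_000
state = 0
block_time = np.zeros(2)
for _ in range(n_steps):
    block_time[state // 2] += 1
    state = rng.choice(4, p=P_eps[state])

print("fraction of time in each block:", block_time / n_steps)

For small ε the chain spends long stretches inside one block before switching, which is the aggregation phenomenon that the asymptotic expansions and occupation-measure results listed in the contents are designed to quantify.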
Holdings
  Item type: eBook
  Current library: e-Library
  Collection: EBook
  Status: Available
Total holds: 0
