Markov Chains and Decision Processes for Engineers and Managers
Author: Theodore J. Sheskin
Recognized as a powerful tool for dealing with uncertainty, Markov modeling can enhance your ability to analyze complex production and service systems. However, most books on Markov chains or decision processes are either highly theoretical, with few examples, or highly prescriptive, with little justification for the steps of the algorithms used to solve Markov models. Providing a unified treatment of Markov chains and Markov decision processes in a single volume, Markov Chains and Decision Processes for Engineers and Managers supplies a highly detailed description of the construction and solution of Markov models that facilitates their application to diverse processes.
Organized around Markov chain structure, the book begins with descriptions of Markov chain states, transitions, structure, and models, and then discusses steady state distributions and passage to a target state in a regular Markov chain. The author treats canonical forms and passage to target states or to classes of target states for reducible Markov chains. He adds an economic dimension by associating rewards with states, thereby creating a Markov chain with rewards, and then adds decisions to create a Markov decision process, enabling an analyst to choose among alternative Markov chains with rewards so as to maximize expected rewards. An introduction to state reduction and hidden Markov chains rounds out the coverage.
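As a minimal sketch of the steady-state idea mentioned above (not taken from the book), the long-run distribution of a regular Markov chain can be approximated by power iteration: repeatedly applying the transition matrix to any starting distribution. The two-state "weather" chain and its probabilities below are invented for illustration only.

```python
# Hypothetical 2-state weather chain (sunny, rainy); the transition
# probabilities are illustrative, not taken from the book.
P = [[0.8, 0.2],
     [0.5, 0.5]]

def steady_state(P, iters=200):
    """Power iteration: apply the transition matrix to an initial
    distribution until it converges to the steady-state vector."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = steady_state(P)  # long-run fraction of time in each state
```

For a regular chain this converges regardless of the starting distribution; here the limit is (5/7, 2/7), i.e. about 71% sunny days in the long run.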
In a presentation that balances algorithms and applications, the author provides explanations of the logical relationships that underpin the formulas or algorithms through informal derivations, and devotes considerable attention to the construction of Markov models. He constructs simplified Markov models for a wide assortment of processes such as the weather, gambling, diffusion of gases, a waiting line, inventory, component replacement, machine maintenance, selling a stock, a charge account, a career path, patient flow
Markov Chain Structure and Models. Regular Markov Chains. Reducible Markov Chains. A Markov Chain with Rewards (MCR). A Markov Decision Process (MDP). Special Topics: State Reduction and Hidden Markov Chains. Index.
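To illustrate the decision layer described in the blurb, choosing among alternative Markov chains with rewards so as to maximize expected rewards, here is a hedged sketch of value iteration on a made-up two-state, two-action MDP with discounting. All transition probabilities, rewards, and the discount factor are invented for illustration; this is not an algorithm reproduced from the book.

```python
# Hypothetical 2-state, 2-action MDP; all numbers are invented.
# P[a][s][s2]: probability of moving from state s to s2 under action a.
P = [
    [[0.9, 0.1], [0.4, 0.6]],  # action 0
    [[0.2, 0.8], [0.7, 0.3]],  # action 1
]
# R[a][s]: expected immediate reward for taking action a in state s.
R = [[5.0, -1.0], [10.0, 2.0]]
gamma = 0.9  # discount factor

def value_iteration(P, R, gamma, iters=500):
    """Iterate the Bellman optimality update to find the optimal
    discounted values, then read off the greedy (optimal) policy."""
    n_actions, n_states = len(P), len(P[0])
    V = [0.0] * n_states
    for _ in range(iters):
        V = [max(R[a][s] + gamma * sum(P[a][s][s2] * V[s2]
                                       for s2 in range(n_states))
                 for a in range(n_actions))
             for s in range(n_states)]
    policy = [max(range(n_actions),
                  key=lambda a: R[a][s] + gamma * sum(P[a][s][s2] * V[s2]
                                                      for s2 in range(n_states)))
              for s in range(n_states)]
    return V, policy

V, policy = value_iteration(P, R, gamma)
```

In this toy instance the greedy policy picks action 1 in both states: its higher immediate rewards outweigh action 0's tendency to stay put.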
Publication dates: 09-2019; 11-2010
328 pages
15.6 × 23.4 cm
Topics of Markov Chains and Decision Processes for Engineers and Managers:
Keywords:
Transition Probability Matrix; Markov Chain; Regular Markov Chain; Steady State Probability Vector; Recurrent Chain; Steady State Probabilities; Absorbing State; Recurrent States; Transient States; Absorbing Markov Chain; Reward Vector; Transition Matrix; Cash Flow Diagram; Fundamental Matrix; Hidden Markov Chain; Expected Shortage Cost; First Passage Time; Infinite Horizon; First Passage; Discounted Rewards; Finite Planning Horizon; Initial State Probability Vector; Observation Symbol; Planning Horizon; Matrix Reduction