MATHEMATICAL THEORY OF ADAPTIVE CONTROL
by Vladimir G Sragovich (Russian Academy of Sciences, Russia), translated by I A Sinitzin (Russian Academy of Sciences, Russia), edited by J Spaliński (Warsaw University of Technology, Poland), with assistance from Ł Stettner & J Zabczyk (Polish Academy of Sciences, Poland)
The theory of adaptive control is concerned with the construction of control strategies under which the controlled system behaves in a desirable way, without assuming complete knowledge of the system. The models considered in this comprehensive book are of Markovian type. Both the partial-observation and the partial-information cases are analyzed. While the book focuses on discrete-time models, continuous-time models are treated in the final chapter. The book offers a novel perspective by summarizing results on adaptive control obtained in the Soviet Union, which are not well known in the West. Comments on the interplay between the Russian and Western methods are also included.
Contents:
- Basic Notions and Definitions
- Real-Valued HPIV with Finite Number of Controls: Automaton Approach
- Stochastic Approximation
- Minimax Adaptive Control
- Controlled Finite Homogeneous Markov Chains
- Control of Partially Observable Markov Chains and Regenerative Processes
- Control of Markov Processes with Discrete Time and Semi-Markov Processes
- Control of Stationary Processes
- Finite-Converging Procedures for Control Problems with Inequalities
- Control of Linear Difference Equations
- Control of Ordinary Differential Equations
- Control of Stochastic Differential Equations
Readership: Graduate students, researchers and academics in mathematical control theory.
492pp
Pub. date: Dec 2005
eISBN 978-981-270-103-9
US$114