Mathematical Theory of Control Systems Design
Springer (publisher)
978-0-7923-3724-9 (ISBN)
Contents:
I. Continuous and Discrete Deterministic Systems
II. Stability of Stochastic Systems
III. Description of Control Problems
IV. The Classical Calculus of Variations and Optimal Control
V. The Maximum Principle
VI. Linear Control Systems
VII. Dynamic Programming Approach. Sufficient Conditions for Optimal Control
VIII. Some Additional Topics of Optimal Control Theory
IX. Control of Stochastic Systems. Problem Statements and Investigation Techniques
X. Optimal Control on a Time Interval of Random Duration
XI. Optimal Estimation of the State of the System
XII. Optimal Control of the Observation Process
XIII. Linear Time-Invariant Control Systems
XIV. Numerical Methods for the Investigation of Nonlinear Control Systems
XV. Numerical Design of Optimal Control Systems
General References
Publication date (per publisher) | 31.1.1996
---|---
Series | Mathematics and Its Applications; 341
Additional info | XXIV, 672 p.
Place of publication | Dordrecht
Language | English
Dimensions | 156 x 234 mm
Subject areas | Mathematics / Computer Science ► Mathematics ► Analysis
 | Mathematics / Computer Science ► Mathematics ► Applied Mathematics
 | Mathematics / Computer Science ► Mathematics ► Financial / Business Mathematics
 | Technology ► Electrical Engineering / Power Engineering
ISBN-10 | 0-7923-3724-7 / 0792337247
ISBN-13 | 978-0-7923-3724-9 / 9780792337249
Condition | New