Optimal Control: Calculus of Variations, Optimization (mathematics), Control Theory, Continuous Signal, Discrete Time, Dynamic Programming, Bellman Equation, Trajectory Optimization - Softcover

ISBN 9786130306885

Synopsis

Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. A control problem includes a cost functional that is a function of the state and control variables. An optimal control is a set of differential equations describing the paths of the control variables that minimize the cost functional. The optimal control can be derived using Pontryagin's maximum principle (a necessary condition), or by solving the Hamilton-Jacobi-Bellman equation (a sufficient condition).
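
As a rough sketch of the formulation the synopsis refers to (the notation below is assumed for illustration and is not taken from the book itself): the state x(t) evolves as \dot{x} = f(x, u, t) under a control u(t), and the cost functional to be minimized is

\[
J[u] = \Phi\bigl(x(T)\bigr) + \int_{0}^{T} L\bigl(x(t), u(t), t\bigr)\, dt .
\]

Pontryagin's principle gives a necessary condition in terms of the Hamiltonian H(x, u, \lambda, t) = L(x, u, t) + \lambda^{\top} f(x, u, t): with this sign convention, an optimal control minimizes H pointwise along the trajectory, while the costate satisfies \dot{\lambda} = -\partial H / \partial x. The Hamilton-Jacobi-Bellman equation gives a sufficient condition through the value function V(x, t):

\[
-\frac{\partial V}{\partial t} = \min_{u} \Bigl[ L(x, u, t) + \frac{\partial V}{\partial x} \cdot f(x, u, t) \Bigr],
\qquad V(x, T) = \Phi(x).
\]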

"synopsis" may belong to another edition of this title.
