COURSE OBJECTIVES
- To understand optimal control concepts and their importance
- To study the important optimal control methods used in industry in order to obtain the required level of control
- To introduce the concept of optimal control in various systems
- To help learners design and implement optimal control schemes
- To study, analyze and implement discrete-time optimal control systems
UNIT I INTRODUCTION 6+6
Introduction to optimal control – Comparison between conventional and optimal control procedures – Statement of the optimal control problem – Problem formulation and forms of optimal control – Selection of performance measures – Necessary conditions for optimal control.
UNIT II MATHEMATICAL EVALUATION 6+6
Introduction and performance index – Basic concepts of the calculus of variations – The basic variational problem – Fixed end point problem – Free end point problem – Variational approach to optimal control.
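For learners' reference (not part of the prescribed topic list), the basic variational problem above leads to the Euler-Lagrange necessary condition, stated here in standard notation:

```latex
% Minimizing J(x) = \int_{t_0}^{t_f} g\bigl(x(t), \dot{x}(t), t\bigr)\,dt over
% curves x(t) with fixed end points gives the Euler-Lagrange equation:
\frac{\partial g}{\partial x} - \frac{d}{dt}\,\frac{\partial g}{\partial \dot{x}} = 0 .
% For a free end point at t_f, the natural (transversality) boundary condition
% \left.\dfrac{\partial g}{\partial \dot{x}}\right|_{t = t_f} = 0 is appended.
```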
UNIT III CONTROL STRATEGY 6+6
Introduction – Time-varying optimal control – LQR steady-state optimal control – Frequency domain interpretation of LQR (LTI systems) – Solution of the Riccati equation – Application examples.
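As an illustrative sketch of the steady-state LQR topic above (not part of the prescribed content; the double-integrator plant and weights are assumed example values), the algebraic Riccati equation can be solved numerically with SciPy:

```python
# Steady-state LQR for a double integrator: solve the continuous algebraic
# Riccati equation (CARE)  A'P + PA - P B R^{-1} B' P + Q = 0,
# then form the optimal state-feedback gain u = -Kx.
import numpy as np
from scipy.linalg import solve_continuous_are

# Plant: x1' = x2, x2' = u  (double integrator, assumed example)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting
R = np.array([[1.0]])  # control weighting

P = solve_continuous_are(A, B, Q, R)      # solution of the CARE
K = np.linalg.solve(R, B.T @ P)           # K = R^{-1} B' P;  here K ≈ [[1.0, 1.7321]]

# The closed-loop matrix A - BK is Hurwitz (eigenvalues in the left half-plane)
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```

For this plant the hand-computed CARE solution gives K = [1, √3], which matches the numerical result; this is a common sanity check when introducing LQR design.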
UNIT IV PROBLEM FORMULATION 6+6
Optimal control: Introduction – Formulation of the optimal control problem – Calculus of variations and minimization of functionals – Constrained optimization – Pontryagin's minimum/maximum principle – Linear quadratic problem – Hamilton-Jacobi equation and its solution.
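As a reference for the Pontryagin topic above (standard textbook notation, not part of the prescribed content), the necessary conditions of the minimum principle are:

```latex
% For dynamics \dot{x} = f(x, u, t) and cost
% J = h(x(t_f)) + \int_{t_0}^{t_f} g(x, u, t)\,dt, define the Hamiltonian
% H(x, u, \lambda, t) = g(x, u, t) + \lambda^{\top} f(x, u, t).
% Along the optimal trajectory (x^*, u^*, \lambda^*):
\dot{x}^{*} = \frac{\partial H}{\partial \lambda}, \qquad
\dot{\lambda}^{*} = -\frac{\partial H}{\partial x}, \qquad
H(x^{*}, u^{*}, \lambda^{*}, t) \le H(x^{*}, u, \lambda^{*}, t)
\quad \forall\, u \in U .
```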
UNIT V ADVANCED SYSTEMS 6+6
Discrete-time optimal control systems – Matrix discrete Riccati equation – Analytical solution of the matrix difference Riccati equation – Optimal control using dynamic programming – The Hamilton-Jacobi-Bellman (HJB) equation – HJB equation for the LQR system – Time-optimal control systems.
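The dynamic-programming topic above can be sketched numerically (not part of the prescribed content; the discretized double-integrator plant, weights and horizon are assumed example values) as the backward matrix difference Riccati recursion for finite-horizon discrete LQR:

```python
# Finite-horizon discrete LQR via dynamic programming: the backward
# matrix difference Riccati recursion
#   K_k = (R + B'P_{k+1}B)^{-1} B'P_{k+1}A
#   P_k = Q + A'P_{k+1}A - A'P_{k+1}B K_k,   with P_N = QN,
# giving the time-varying feedback u_k = -K_k x_k.
import numpy as np

def backward_riccati(A, B, Q, R, QN, N):
    """Return the gain sequence [K_0, ..., K_{N-1}]."""
    P = QN.copy()
    gains = []
    for _ in range(N):           # iterate backward from k = N-1 to k = 0
        S = R + B.T @ P @ B
        K = np.linalg.solve(S, B.T @ P @ A)
        P = Q + A.T @ P @ A - A.T @ P @ B @ K
        gains.append(K)
    return gains[::-1]           # reorder so gains[k] is K_k

# Discretized double integrator, sampling time 0.1 s (assumed example)
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.eye(2); R = np.array([[1.0]]); QN = np.eye(2)

Ks = backward_riccati(A, B, Q, R, QN, N=200)
```

For a long horizon the early gains converge to the steady-state LQR gain, which connects this recursion back to the steady-state design of Unit III.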
TOTAL: 60 PERIODS
COURSE OUTCOMES
On completion of the course, learners will be able to:
- Formulate optimal control problems and state their forms and necessary conditions.
- Solve the algebraic design equations for the controller and study various problem classes.
- Design optimal controllers using a class of standard procedures.
- Predict system dynamic behavior through the solution of ODEs and the formulation of the optimal control problem.
- Design discrete-time controllers by solving, through numerical methods, equations that represent spatial and temporal variations in physical systems.
- Implement the optimal control methodology on benchmark/real-time systems.
REFERENCES
- Kirk, D. E., "Optimal Control Theory", Dover Publications, 2004.
- Naidu, D. S., "Optimal Control Systems", First Indian Reprint, CRC Press, 2009.
- Astrom, K. J., "Introduction to Stochastic Control Theory", Dover Publications, 2006.
- Gopal, M., "Digital Control and State Variable Methods", Tata McGraw-Hill.
- Sage, A. P. and White, C. C., "Optimum Systems Control", Prentice Hall.