This textbook offers a concise yet rigorous introduction to the calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with the calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control. Calculus of Variations and Optimal Control Theory also traces the historical development of the subject and features numerous exercises, notes and references at the end of each chapter, and suggestions for further study.

- Offers a concise yet rigorous introduction
- Requires limited background in control theory or advanced mathematics
- Provides a complete proof of the maximum principle
- Uses consistent notation in the exposition of classical and modern topics
- Traces the historical development of the subject
- Solutions manual (available only to teachers)

Leading universities that have adopted this book include:

- University of Illinois at Urbana-Champaign, ECE 553: Optimum Control Systems
- Georgia Institute of Technology, ECE 6553: Optimal Control and Optimization
- University of Pennsylvania, ESE 680: Optimal Control Theory
- University of Notre Dame, EE 60565: Optimal Control