This paper by Brockett discusses asymptotic stability and feedback stabilization of control systems. It presents two main theorems on the existence of stabilizing control laws: the first shows that controllability alone does not guarantee that a stabilizing control law exists, while the second gives a sufficient condition for the existence of such a law based on high-gain feedback.
The paper begins with an introduction highlighting the importance of asymptotic stability in control systems, then provides background on control theory, including input-linear systems and linear control systems. It goes on to discuss the nonexistence of stabilizing control laws, showing that even for systems that are strongly controllable, there may be no continuous feedback law that achieves global asymptotic stability.
Next, the paper presents a theorem giving a necessary condition for the existence of a stabilizing control law. The condition involves the linearized system and the ability to steer the system to the equilibrium point. The paper also discusses the concept of finite gain and its implications for stability.
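As commonly stated in the literature (a paraphrase, not a verbatim quotation of the paper), the necessary condition for the system $\dot{x} = f(x, u)$, with $f(x_0, 0) = 0$ and $f$ continuously differentiable, can be sketched as follows:

```latex
% Paraphrase of Brockett's necessary condition for the existence of a
% continuously differentiable feedback making x_0 asymptotically stable:
\begin{enumerate}
  \item the linearized system has no uncontrollable modes associated with
        eigenvalues having positive real part;
  \item there exists a neighborhood $N$ of $x_0$ such that every point
        $\xi \in N$ can be steered to $x_0$ by some admissible control;
  \item the map $(x, u) \mapsto f(x, u)$ is onto an open set containing $0$.
\end{enumerate}
```

The third item, the surjectivity requirement on $f$ near the equilibrium, is the part most often violated in the critical cases the paper studies.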
The paper then states a theorem giving a sufficient condition for the existence of a stabilizing control law; the condition involves finding a pair of control laws that together ensure the system's stability. The paper also provides an example of a system that has no stabilizing control law, demonstrating the importance of the conditions outlined in the theorems.
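The nonholonomic integrator is the standard illustration of such a system (a sketch under the assumption that this is the example in question; the paper's own presentation may differ in detail). Its right-hand side cannot map a neighborhood of the origin onto a neighborhood of $0$, violating the surjectivity part of the necessary condition:

```python
def f(x1, x2, u1, u2):
    """Right-hand side of the nonholonomic ("Brockett") integrator:
    xdot1 = u1, xdot2 = u2, xdot3 = x1*u2 - x2*u1."""
    return (u1, u2, x1 * u2 - x2 * u1)

# Brockett's condition requires f to map every neighborhood of
# (x, u) = (0, 0) onto a neighborhood of the origin in R^3.  But to hit
# a target of the form (0, 0, eps), the first two components force
# u1 = u2 = 0, and then the third component vanishes identically, so no
# point (0, 0, eps) with eps != 0 is ever attained:
for x1, x2 in [(0.3, -0.7), (1.0, 2.0), (0.0, 0.0)]:
    assert f(x1, x2, 0.0, 0.0) == (0.0, 0.0, 0.0)
```

Since this system is nonetheless controllable (a classical fact for this example), it shows concretely that controllability does not imply the existence of a continuous stabilizing feedback.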
The paper emphasizes the importance of understanding critical cases in stability analysis, where the linearized system has eigenvalues with zero real parts. These cases are particularly challenging and require careful analysis to determine whether a stabilizing control law exists. The paper closes by highlighting the significance of these results in the broader context of control theory and their implications for the design of stabilizing control laws.