In control theory, a control-Lyapunov function (CLF) is an extension of the idea of a Lyapunov function $V(x)$ to systems with control inputs. The ordinary Lyapunov function is used to test whether a dynamical system is (Lyapunov) stable or (more restrictively) asymptotically stable. Lyapunov stability means that if the system starts in a state $x \neq 0$ in some domain $D$, then the state will remain in $D$ for all time. For asymptotic stability, the state is also required to converge to $x = 0$. A control-Lyapunov function is used to test whether a system is asymptotically stabilizable, that is, whether for any state $x$ there exists a control $u(x,t)$ such that the system can be brought to the zero state asymptotically by applying the control $u$.
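As a brief sketch of the condition involved (stated here for the standard control-affine form $\dot{x} = f(x) + g(x)\,u$, a notation assumed for illustration rather than fixed above), a smooth, positive-definite function $V$ is a control-Lyapunov function if for every $x \neq 0$ some admissible control makes $V$ decrease along trajectories:

$$\inf_{u}\; \nabla V(x)\cdot\bigl(f(x) + g(x)\,u\bigr) < 0.$$

For example, for the scalar system $\dot{x} = x + u$ with $V(x) = \tfrac{1}{2}x^{2}$, the feedback $u = -2x$ gives $\dot{V} = x\,(x + u) = -x^{2} < 0$ for all $x \neq 0$, so $V$ serves as a control-Lyapunov function and the origin is asymptotically stabilizable.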
The theory and application of control-Lyapunov functions were developed by Zvi Artstein and Eduardo D. Sontag in the 1980s and 1990s.