Lectures on Dynamic Systems and Control
Mohammed Dahleh, Munther A. Dahleh, George Verghese
Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology

Chapter 13
Internal (Lyapunov) Stability

13.1 Introduction

We have already seen some examples of both stable and unstable systems. The objective of this chapter is to formalize the notion of internal stability for general nonlinear state-space models. Apart from defining the various notions of stability, we define an entity known as a Lyapunov function and relate it to these various stability notions.

13.2 Notions of Stability

For a general undriven system

ẋ(t) = f(x(t), 0, t)   (CT)   (13.1)
x(k + 1) = f(x(k), 0, k)   (DT),   (13.2)

we say that a point x̄ is an equilibrium point from time t₀ for the CT system above if f(x̄, 0, t) = 0, ∀t ≥ t₀, and is an equilibrium point from time k₀ for the DT system above if f(x̄, 0, k) = x̄, ∀k ≥ k₀. If the system is started in the state x̄ at time t₀ or k₀, it will remain there for all time. Nonlinear systems can have multiple equilibrium points (or equilibria). (Another class of special solutions for nonlinear systems are periodic solutions, but we shall just focus on equilibria here.)

We would like to characterize the stability of the equilibria in some fashion. For example, does the state tend to return to the equilibrium point after a small perturbation away from it? Does it remain close to the equilibrium point in some sense? Does it diverge? The most fruitful notion of stability for an equilibrium point of a nonlinear system is given by the definition below. We shall assume that the equilibrium point of interest is at the origin, since if x̄ ≠ 0, a simple translation can always be applied to obtain an equivalent system with the equilibrium at 0.

Definition 13.1 A system is called asymptotically stable around its equilibrium point at the origin if it satisfies the following two conditions:

1. Given any ε > 0, there exists δ₁ > 0 such that if ‖x(t₀)‖ < δ₁, then ‖x(t)‖ < ε for all t > t₀.
2. There exists δ₂ > 0 such that if ‖x(t₀)‖ < δ₂, then x(t) → 0 as t → ∞.

The first condition requires that the state trajectory can be confined to an arbitrarily small "ball" centered at the equilibrium point and of radius ε, when released from an arbitrary initial condition in a ball of sufficiently small (but positive) radius δ₁. This is called stability in the sense of Lyapunov (i.s.L.). It is possible to have stability in the sense of Lyapunov without having asymptotic stability, in which case we refer to the equilibrium point as marginally stable. Nonlinear systems also exist that satisfy the second requirement without being stable i.s.L., as the following example shows. An equilibrium point that is not stable i.s.L. is termed unstable.

Example 13.1 (Unstable Equilibrium Point That Attracts All Trajectories)

Consider the second-order system with state variables x₁ and x₂ whose dynamics are most easily described in polar coordinates via the equations below, where the radius r is given by r = √(x₁² + x₂²) and the angle θ by 0 ≤ θ = arctan(x₂/x₁) < 2π. (You might try obtaining a state-space description directly involving x₁ and x₂.) It is easy to see that there are precisely two equilibrium points: one at the origin, and the other at r = 1, θ = 0. We leave you to verify with rough calculations (or computer simulation from various initial conditions) that the trajectories of the system have the form shown in the figure below. Evidently all trajectories (except the trivial one that starts and stays at the origin) end up at r = 1, θ = 0. However, this equilibrium point is not stable i.s.L., because the trajectories cannot be confined to an arbitrarily small ball around the equilibrium point when they are released from arbitrary points within any ball (no matter how small) around this equilibrium.
ṙ = r(1 − r)
θ̇ = sin²(θ/2)   (13.3)

Figure 13.1: System Trajectories

13.3 Stability of Linear Systems

We may apply the preceding definitions to the LTI case by considering a system with a diagonalizable A matrix (in our standard notation) and u = 0. The unique equilibrium point is at x = 0, provided A has no eigenvalue at 0 (respectively 1) in the CT (respectively DT) case. (Otherwise every point in the entire eigenspace corresponding to this eigenvalue is an equilibrium.) Now, writing A = VΛW with Λ diagonal and W = V⁻¹,

x(t) = e^{At} x(0) = V e^{Λt} W x(0)   (CT)   (13.4)
x(k) = Aᵏ x(0) = V Λᵏ W x(0)   (DT)   (13.5)

Hence, it is clear that in continuous time a system with a diagonalizable A is asymptotically stable iff

Re(λᵢ) < 0,  i ∈ {1, …, n},   (13.6)

while in discrete time the requirement is that

|λᵢ| < 1,  i ∈ {1, …, n}.   (13.7)

Note that if Re(λᵢ) = 0 for some i in the CT case, or |λᵢ| = 1 in the DT case, while the remaining eigenvalues satisfy the strict inequalities above, then the system is not asymptotically stable, but is marginally stable.

Exercise: For the nondiagonalizable case, use your understanding of the Jordan form to show that the conditions for asymptotic stability are the same as in the diagonalizable case. For marginal stability, we require in the CT case that Re(λᵢ) ≤ 0, with equality holding for at least one eigenvalue; furthermore, every eigenvalue whose real part equals 0 should have its geometric multiplicity equal to its algebraic multiplicity, i.e. all its associated Jordan blocks should be of size 1. (Verify that the presence of Jordan blocks of size greater than one for these imaginary-axis eigenvalues would lead to the state variables growing polynomially with time.) A similar condition holds for marginal stability in the DT case.

Stability of Linear Time-Varying Systems

Recall that the general unforced solution to a linear time-varying system is x(t) = Φ(t, t₀)x(t₀), where Φ(t, t₀) is the state transition matrix. It follows that the system is

1. stable i.s.L. at x = 0 if sup_t ‖Φ(t, t₀)‖ = m(t₀) < ∞;
2. asymptotically stable at x = 0 if lim_{t→∞} ‖Φ(t, t₀)‖ = 0, ∀t₀.

These conditions follow directly from Definition 13.1.

13.4 Lyapunov's Direct Method

General Idea

Consider the continuous-time system

ẋ(t) = f(x(t))   (13.8)

with an equilibrium point at x = 0. This is a time-invariant (or "autonomous") system, since f does not depend explicitly on t. The stability analysis of the equilibrium point in such a system is a difficult task in general, because we cannot write a simple formula relating the trajectory to the initial state. The idea behind Lyapunov's "direct" method is to establish properties of the equilibrium point (or, more generally, of the nonlinear system) by studying how certain carefully selected scalar functions of the state evolve as the system state evolves. (The term "direct" is to contrast this approach with Lyapunov's "indirect" method, which attempts to establish properties of the equilibrium point by studying the behavior of the linearized system at that point. We shall study this in the next chapter.)

Consider, for instance, a continuous scalar function V(x) that is 0 at the origin and positive elsewhere in some ball enclosing the origin, i.e. V(0) = 0 and V(x) > 0 for x ≠ 0 in this ball. Such a V(x) may be thought of as an "energy" function. Let V̇(x) denote the time derivative of V(x) along any trajectory of the system, i.e. its rate of change as x(t) varies according to (13.8). If this derivative is negative throughout the region (except at the origin), then the energy is strictly decreasing over time. In this case, because the energy is lower bounded by 0, the energy must go to 0, which implies that all trajectories converge to the zero state. We will formalize this idea in the following sections.

Lyapunov Functions

Definition 13.2 Let V be a continuous map from ℝⁿ to ℝ. We call V(x) a locally positive definite (lpd) function around x = 0 if

1. V(0) = 0;
2. V(x) > 0 for 0 < ‖x‖ ≤ r, for some r.
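The "energy decreasing" idea above can be illustrated numerically. In the sketch below, the scalar system ẋ = −x³ and the function V(x) = x² are arbitrary illustrative choices (not from the notes); V is lpd, its derivative along trajectories is V̇ = 2x·(−x³) = −2x⁴ ≤ 0, and V evaluated along a simulated trajectory is indeed nonincreasing.

```python
# Demo: V(x) = x**2 is lpd, and along trajectories of xdot = -x**3 its
# derivative is Vdot = 2*x*(-x**3) = -2*x**4 <= 0, so the "energy" V
# should never increase along a simulated trajectory.

def f(x):
    return -x ** 3              # right-hand side of the ODE

def V(x):
    return x ** 2               # candidate energy function

x, dt = 1.0, 1e-3
vals = [V(x)]
for _ in range(10_000):         # forward-Euler integration out to t = 10
    x += dt * f(x)
    vals.append(V(x))

assert all(b <= a for a, b in zip(vals, vals[1:]))   # V is nonincreasing
print(vals[0], vals[-1])        # V decays from 1.0 toward 0
```

Forward Euler is crude, but monotonicity of V survives the discretization here because the step size is small relative to the decay rate.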
Similarly, the function is called locally positive semidefinite (lpsd) if the strict inequality on the function in the second condition is replaced by V(x) ≥ 0. The function V(x) is locally negative definite (lnd) if −V(x) is lpd, and locally negative semidefinite (lnsd) if −V(x) is lpsd.

What may be useful in forming a mental picture of an lpd function V(x) is to think of it as having "contours" of constant V that form (at least in a small region around the origin) a nested set of closed surfaces surrounding the origin. The situation for n = 2 is illustrated in Figure 13.2.

Figure 13.2: Level lines for a Lyapunov function, where c₁ < c₂ < c₃

Theorem 13.1 (Lyapunov Theorem) Suppose V(x) is an lpd function with continuous first partial derivatives, and let V̇(x) denote its derivative along trajectories of (13.8). If V̇(x) is lnsd, then the equilibrium point x = 0 is stable in the sense of Lyapunov. If V̇(x) is lnd, then x = 0 is (locally) asymptotically stable.

Proof: Suppose ε > 0 is given. We need to find a δ > 0 such that for all ‖x(0)‖ < δ, it follows that ‖x(t)‖ < ε, ∀t ≥ 0. Figure 13.3 illustrates the constructions of the proof for the case n = 2. Let ε₁ = min(ε, r) and define

m = min_{‖x‖ = ε₁} V(x).

Figure 13.3: Illustration of the neighborhoods used in the proof

Since V(x) is continuous, the above m is well defined and positive. Choose δ satisfying 0 < δ < ε₁ such that for all ‖x‖ < δ, V(x) < m; such a choice is always possible, again because of the continuity of V(x). Now consider any x(0) such that ‖x(0)‖ < δ, so that V(x(0)) < m, and let x(t) be the resulting trajectory. V(x(t)) is non-increasing (since V̇(x(t)) ≤ 0), which results in V(x(t)) < m for all t ≥ 0. We will show that this implies ‖x(t)‖ < ε₁. Suppose there exists t₁ such that ‖x(t₁)‖ > ε₁. Then by continuity there must be an earlier time t₂ at which ‖x(t₂)‖ = ε₁, so that V(x(t₂)) ≥ min_{‖x‖=ε₁} V(x) = m, which contradicts V(x(t₂)) < m. Thus stability in the sense of Lyapunov holds.

To prove asymptotic stability when V̇ is lnd, we need to show that as t → ∞, V(x(t)) → 0; then, by the positive definiteness and continuity of V, ‖x(t)‖ → 0. Since V(x(t)) is strictly decreasing and V(x(t)) ≥ 0, we know that V(x(t)) → c with c ≥ 0. We want to show that c is in fact zero, and argue by contradiction: suppose c > 0. Because V(x(t)) decreases to c strictly from above, V(x(t)) > c for all t. Let the set S be defined as S = {x ∈ ℝⁿ : V(x) < c}. Since V is continuous and V(0) = 0, S contains a ball B_α = {x : ‖x‖ < α} around the origin, for some α > 0. Because V(x(t)) > c for all t, the trajectory never enters S; therefore x(t) ∉ B_α, i.e. ‖x(t)‖ ≥ α for all t. In the first part of the proof, we have established that if ‖x(0)‖ < δ then ‖x(t)‖ < ε₁, so the trajectory remains in the compact region α ≤ ‖x‖ ≤ ε₁. We can define the largest derivative of V(x) on this region as

−γ = max_{α ≤ ‖x‖ ≤ ε₁} V̇(x);

since V̇(x) is lnd and continuous, and this region excludes the origin, γ > 0. Integrating, V(x(t)) ≤ V(x(0)) − γt, which eventually drops below c, a contradiction. Hence c = 0, and asymptotic stability holds.

If, in addition, V is positive definite on all of ℝⁿ, radially unbounded (i.e. V(x) → ∞ as ‖x‖ → ∞), and V̇ is negative definite on all of ℝⁿ, then the same reasoning yields global asymptotic stability: every trajectory converges to the origin.

Example 13.3 Consider the system ẋ = −C(x), where C is a continuous function satisfying x′C(x) > 0 if x ≠ 0. Convince yourself that the unique equilibrium point of the system is at 0. Now consider the candidate Lyapunov function V(x) = x′x, which satisfies all the desired properties, including V(x) → ∞ as ‖x‖ → ∞. Evaluating its derivative along trajectories, we get

V̇(x) = 2x′ẋ = −2x′C(x) < 0 for x ≠ 0.

Hence, the system is globally asymptotically stable.

Example 13.4 Consider the following dynamical system:

ẋ₁ = −x₁ + 4x₂
ẋ₂ = −x₁ − x₂³.

The only equilibrium point for this system is the origin x = 0. To investigate the stability of the origin, let us propose a quadratic Lyapunov function V = x₁² + a x₂², where a is a positive constant to be determined. It is clear that V is positive definite on the entire state space ℝ². In addition, V is radially unbounded, that is, it satisfies V(x) → ∞ as ‖x‖ → ∞. The derivative of V along the trajectories of the system is given by

V̇ = 2x₁(−x₁ + 4x₂) + 2a x₂(−x₁ − x₂³) = −2x₁² + (8 − 2a)x₁x₂ − 2a x₂⁴.

If we choose a = 4 then we can eliminate the cross term x₁x₂, and the derivative of V becomes

V̇ = −2x₁² − 8x₂⁴,

which is clearly a negative definite function on the entire state space. Therefore we conclude that x = 0 is a globally asymptotically stable equilibrium point.

Example 13.5 A highly studied example in the area of dynamical systems and chaos is the famous Lorenz system, a nonlinear system that evolves in ℝ³ and whose equations are given by

ẋ = σ(y − x)
ẏ = rx − y − xz
ż = xy − bz,

where σ, r, and b are positive constants. This system of equations provides an approximate model of a horizontal fluid layer that is heated from below. The warmer fluid from the bottom rises and thus causes convection currents. This approximates what happens in the atmosphere.
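Before the Lyapunov analysis that follows, a quick numerical experiment indicates what to expect when r < 1: trajectories decay to the origin. The parameter values, initial condition, and the simple Runge-Kutta integrator below are illustrative choices, not from the notes.

```python
# Illustrative simulation of the Lorenz equations with r < 1.
# Parameter values, initial condition, and step size are arbitrary demo choices.
import numpy as np

sigma, b, r = 10.0, 8.0 / 3.0, 0.5

def lorenz(s):
    x, y, z = s
    return np.array([sigma * (y - x),
                     r * x - y - x * z,
                     x * y - b * z])

def rk4_step(s, dt):
    # one classical fourth-order Runge-Kutta step
    k1 = lorenz(s)
    k2 = lorenz(s + 0.5 * dt * k1)
    k3 = lorenz(s + 0.5 * dt * k2)
    k4 = lorenz(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([1.0, 1.0, 1.0])
dt = 0.01
for _ in range(5000):            # integrate to t = 50
    s = rk4_step(s, dt)

print(np.linalg.norm(s))         # essentially zero: the state decays to the origin
```

With r > 1 (e.g. the classical values σ = 10, b = 8/3, r = 28) the same code instead exhibits the well-known chaotic behaviour.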
Under intense heating this model exhibits complex dynamical behaviour. However, in this example we would like to analyze the stability of the origin under the condition r < 1, which is known not to lead to complex behaviour. Let us define V = a₁x² + a₂y² + a₃z², where a₁, a₂, and a₃ are positive constants to be determined. It is clear that V is positive definite on ℝ³ and is radially unbounded. The derivative of V along the trajectories of the system is given by

V̇ = 2a₁x·σ(y − x) + 2a₂y·(rx − y − xz) + 2a₃z·(xy − bz)
  = −2a₁σx² − 2a₂y² − 2a₃bz² + (2a₁σ + 2ra₂)xy + (2a₃ − 2a₂)xyz.

If we choose a₂ = a₃ = 1 and a₁ = 1/σ, then V̇ becomes

V̇ = −2[x² + y² + bz² − (1 + r)xy]
  = −2[(x − ((1 + r)/2)y)² + (1 − ((1 + r)/2)²)y² + bz²].

Since 0 < r < 1 it follows that 0 < (1 + r)/2 < 1, and therefore V̇ is negative definite on the entire state space ℝ³. This implies that the origin is globally asymptotically stable.

Example 13.6 (Pendulum) The dynamic equation of a pendulum comprising a mass M at the end of a rigid but massless rod of length R is

MRθ̈ + Mg sin θ = 0,

where θ is the angle made with the downward direction, and g is the acceleration due to gravity. To put the system in state-space form, let x₁ = θ and x₂ = θ̇; then

ẋ₁ = x₂
ẋ₂ = −(g/R) sin x₁.

Take as a candidate Lyapunov function the total energy in the system:

V(x) = (1/2)MR²x₂² + MgR(1 − cos x₁) = kinetic + potential.

Then

V̇(x) = MgR(sin x₁)ẋ₁ + MR²x₂ẋ₂ = MgR x₂ sin x₁ − MR²x₂(g/R) sin x₁ = 0.

Hence, V is a Lyapunov function and the system is stable i.s.L. We cannot conclude asymptotic stability with this analysis.

Consider now adding a damping torque proportional to the velocity, so that the state-space description becomes

ẋ₁ = x₂
ẋ₂ = −Dx₂ − (g/R) sin x₁.

With this change, but the same V as before, we find V̇ = −DMR²x₂² ≤ 0. From this we can conclude stability i.s.L. We still cannot directly conclude asymptotic stability. Notice however that V̇ = 0 if and only if θ̇ = 0, and under this condition θ̈ = −(g/R) sin θ. Hence, θ̈ ≠ 0 if θ ≠ kπ for integer k, i.e. if the pendulum is not vertically down or vertically up.
This implies that, unless we are at the bottom or top with zero velocity, we shall have θ̈ ≠ 0 when V̇ = 0, so θ̇ will not remain at 0, and hence the Lyapunov function will begin to decrease again. The only place the system can end up, therefore, is with zero velocity, hanging vertically down or standing vertically up, i.e. at one of the two equilibria. The formal proof of this result in the general case ("LaSalle's invariant set theorem") is beyond the scope of this course.

The conclusion of local asymptotic stability can also be obtained directly through an alternative choice of Lyapunov function. Taking D = 1 and g/R = 1 for simplicity, consider the Lyapunov function candidate

V(x) = (1/2)x₂² + (1/2)(x₁ + x₂)² + 2(1 − cos x₁).

It follows that

V̇ = −(x₂² + x₁ sin x₁) = −(θ̇² + θ sin θ) ≤ 0

near the origin. Also, θ̇² + θ sin θ = 0 ⟺ θ̇² = 0 and θ sin θ = 0 ⟺ θ̇ = 0 and θ = 0. Hence, V̇ is strictly negative in a small neighborhood around 0, except at 0 itself. This proves asymptotic stability.

Discrete-Time Systems

Essentially identical results hold for the system

x(k + 1) = f(x(k))   (13.9)

provided we interpret V̇ as

V̇(x) ≜ V(f(x)) − V(x),

i.e. as V(next state) − V(present state).

Example 13.7 (DT System) Consider the system

x₁(k + 1) = x₂(k)/(1 + x₂²(k))
x₂(k + 1) = x₁(k)/(1 + x₂²(k)),

which has its only equilibrium at the origin. If we choose the quadratic Lyapunov function V(x) = x₁² + x₂², we find

V̇(x(k)) = V(x(k + 1)) − V(x(k)) = V(x(k)) [1/(1 + x₂²(k))² − 1] ≤ 0,

from which we can conclude that the equilibrium point is stable i.s.L. In fact, examining the above relations more carefully (in the same style as we did for the pendulum with damping), it is possible to conclude that the equilibrium point is actually globally asymptotically stable.

Notes

The system in Example 13.1 is taken from the eminently readable text by F. Verhulst, Nonlinear Differential Equations and Dynamical Systems, Springer-Verlag, 1990.

Exercises

Exercise 13.1 Consider the horizontal motion of a particle of unit mass sliding under the influence of gravity on a frictionless wire.
It can be shown that, if the wire is bent so that its height h is given by h(x) = V_a(x), then a state-space model for the motion is given by

ẋ = z

Suppose V_a(x) = x⁴ − ax².

(a) Verify that the above model has (z, x) = (0, 0) as an equilibrium point for any a in the interval −1 < a < 1, and that it also has (z, x) = (0, ±√(a/2)) as equilibrium points when a is in the interval 0 < a < 1.

(b) Verify that the linearized model about any of the equilibrium points is neither asymptotically stable nor unstable for any a in the interval −1 < a < 1.

Exercise 13.2 Consider the dynamic system described below:

ÿ + a₁ẏ + a₂y + cy² = u̇ + u,

where y is the output and u is the input.

(a) Obtain a state-space realization of dimension 2 that describes the above system.

(b) If a₁ = 3, a₂ = 2, c = 2, show that the system is asymptotically stable at the origin.

(c) Find a region (a disc of non-zero radius) around the origin such that every trajectory, with an initial state starting in this region, converges to zero as t approaches infinity. This is known as a region of attraction.

Exercise 13.3 Consider the gradient system

ẋ = −∇P(x),

where P(x) has continuous first partial derivatives. The function P(x) is referred to as the potential function of the system. Let x̄ be an isolated local minimum of P(x), i.e. P(x̄) < P(x) for 0 < ‖x − x̄‖ < r, for some r.

(a) Show that x̄ is an equilibrium point of the gradient system.

(b) Use the candidate Lyapunov function V(x) = P(x) − P(x̄) to try and establish that x̄ is an asymptotically stable equilibrium point.

Exercise 13.4 The objective of this problem is to analyze the convergence of the gradient algorithm for finding a local minimum of a function. Let f : ℝⁿ → ℝ and assume that x* is a local minimum; i.e., f(x*) < f(x) for all x close enough but not equal to x*. Assume that f is continuously differentiable, and let g : ℝⁿ → ℝⁿ be the gradient of f, defined by

g(x)ᵀ = (∂f/∂x₁ ⋯ ∂f/∂xₙ).

It follows from elementary calculus that g(x*) = 0.
If one has a good estimate of x*, then it is argued that the solution to the dynamic system

ẋ = −g(x)   (13.10)

with x(0) close to x* will give x(t) such that lim_{t→∞} x(t) = x*.

(a) Use Lyapunov stability analysis methods to give a precise statement and a proof of the above argument.

(b) System (13.10) is usually solved numerically by the discrete-time system

x(k + 1) = x(k) − α(x(k)) g(x(k)),   (13.11)

where α(x(k)) is some function from ℝⁿ to ℝ. In certain situations, α can be chosen as a constant function, but this choice is not always good. Use Lyapunov stability analysis methods for discrete-time systems to give a possible choice for α(x(k)) so that lim_{k→∞} x(k) = x*.

(c) Analyze directly the gradient algorithm for the function

f(x) = (1/2) xᵀQx,  Q symmetric, positive definite.

Show directly that system (13.10) converges to zero (= x*). Also, show that α in system (13.11) can be chosen as a real constant, and give tight bounds on this choice.

Exercise 13.5 (a) Show that any (possibly complex) square matrix M can be written uniquely as the sum of a Hermitian matrix H and a skew-Hermitian matrix S, i.e. H′ = H and S′ = −S, where ′ denotes the conjugate (Hermitian) transpose. (Hint: work with combinations of M and M′.) Note that if M is real, then this decomposition expresses the matrix as the sum of a symmetric and a skew-symmetric matrix.

(b) With M, H, and S as above, show that the real part of the quadratic form x′Mx equals x′Hx, and the imaginary part of x′Mx equals x′Sx. (It follows that if M and x are real, then x′Mx = x′Hx.)

(c) Let V(x) = x′Mx for real M and x. Using the standard definition of dV(x)/dx as a Jacobian matrix — actually just a row vector in this case — whose jth entry is ∂V(x)/∂xⱼ, show that

dV(x)/dx = 2x′H,

where H is the symmetric part of M, as defined in part (a).

(d) Show that a Hermitian matrix always has real eigenvalues, and that the eigenvectors associated with distinct eigenvalues are orthogonal to each other.

Exercise 13.6 Consider the (real) continuous-time LTI system ẋ(t) = Ax(t).
(a) Suppose the (continuous-time) Lyapunov equation

PA + A′P = −I   (3.1)

has a symmetric, positive definite solution P. Note that (3.1) can be written as a linear system of equations in the entries of P, so solving it is in principle straightforward; good numerical algorithms exist. Show that the function V(x) = x′Px serves as a Lyapunov function, and use it to deduce the global asymptotic stability of the equilibrium point of the LTI system above, i.e. to deduce that the eigenvalues of A are in the open left-half plane. (The result of Exercise 13.5 will be helpful in computing V̇(x).)

What part (a) shows is that the existence of a symmetric, positive definite solution of (3.1) is sufficient to conclude that the given LTI system is asymptotically stable. The existence of such a solution turns out to also be necessary, as we show in what follows. [Instead of −I on the right side of (3.1), we could have had −Q for any positive definite matrix Q. It would still be true that the system is asymptotically stable if and only if the solution P is symmetric, positive definite. We leave you to modify the arguments here to handle this case.]

(b) Suppose the LTI system above is asymptotically stable. Now define

P = ∫₀^∞ R(t) dt,  R(t) = e^{A′t} e^{At}.   (3.2)

The reason the integral exists is that the system is asymptotically stable — explain this in more detail! Show that P is symmetric and positive definite, and that it is the unique solution of the Lyapunov equation (3.1). You will find it helpful to note that

R(∞) − R(0) = ∫₀^∞ (dR(t)/dt) dt.

The results of this problem show that one can decide whether a matrix A has all its eigenvalues in the open left-half plane without solving for all its eigenvalues. We only need to test for the positive definiteness of the solution of the linear system of equations (3.1). This can be simpler.
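The closing remark of Exercise 13.6 is easy to try out numerically. The sketch below (with a hypothetical 2×2 example matrix) writes the Lyapunov equation (3.1) as a linear system in the entries of P via Kronecker products, solves it, and tests positive definiteness of P instead of computing the eigenvalues of A.

```python
# Writing the Lyapunov equation P A + A'P = -I as a linear system in the
# entries of P, as suggested in Exercise 13.6. The matrix A below is a
# hypothetical example with eigenvalues -1 and -2 (so it is Hurwitz).
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
n = A.shape[0]
I = np.eye(n)

# With row-major vectorization, vec(A'P) = kron(A', I) vec(P) and
# vec(PA) = kron(I, A') vec(P), so (3.1) becomes K vec(P) = vec(-I).
K = np.kron(A.T, I) + np.kron(I, A.T)
P = np.linalg.solve(K, (-I).flatten()).reshape(n, n)

assert np.allclose(A.T @ P + P @ A, -I)       # P solves (3.1)
assert np.allclose(P, P.T)                    # P is symmetric
assert np.all(np.linalg.eigvalsh(P) > 0)      # P > 0, so A must be Hurwitz
print(P)                                      # [[1.25, 0.25], [0.25, 0.25]]
```

If SciPy is available, `scipy.linalg.solve_continuous_lyapunov(A.T, -np.eye(n))` should return the same P.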
Exercise 13.7 This problem uses Lyapunov's direct method to justify a key claim of his indirect method: if the linearized model at an equilibrium point is asymptotically stable, then this equilibrium point of the nonlinear system is asymptotically stable. (We shall actually only consider an equilibrium point at the origin, but the approach can be applied to any equilibrium point, after an appropriate change of variables.)

Consider the time-invariant continuous-time nonlinear system given by

ẋ(t) = Ax(t) + h(x(t)),   (4.1)

where A has all its eigenvalues in the open left-half plane, and h(·) represents "higher-order terms", in the sense that ‖h(x)‖/‖x‖ → 0 as ‖x‖ → 0.

(a) Show that the origin is an equilibrium point of the system (4.1), and that the linearized model at the origin is just ẋ(t) = Ax(t).

(b) Let P be the positive definite solution of the Lyapunov equation in (3.1). Show that V(x) = x′Px qualifies as a candidate Lyapunov function for testing the stability of the equilibrium point at the origin in the system (4.1). Determine an expression for V̇(x), the rate of change of V(x) along trajectories of (4.1).

(c) Using the fact that x′x = ‖x‖², and that ‖Ph(x)‖ ≤ ‖P‖ ‖h(x)‖, how small a value (in terms of ‖P‖) of the ratio ‖h(x)‖/‖x‖ will allow you to conclude that V̇(x(t)) < 0 for x(t) ≠ 0? Now argue that you can indeed limit ‖h(x)‖/‖x‖ to this small a value by choosing a small enough neighborhood of the equilibrium. In this neighborhood, therefore, V̇(x(t)) < 0 for x(t) ≠ 0. By Lyapunov's direct method, this implies asymptotic stability of the equilibrium point.

Exercise 13.8 For the discrete-time LTI system x(k + 1) = Ax(k), let V(x) = x′Px, where P is a symmetric, positive definite matrix. What condition will guarantee that V(x) is a Lyapunov function for this system? What condition involving A and P will guarantee asymptotic stability of the system? (Express your answers in terms of the positive semidefiniteness and definiteness of a matrix.)
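As a numerical companion to Exercise 13.8, note that along trajectories V(x(k+1)) − V(x(k)) = x(k)′(A′PA − P)x(k), so the definiteness of A′PA − P is the quantity to examine. The sketch below uses an illustrative stable A, builds a particular P from the series Σₖ (A′)ᵏAᵏ (one way to construct a solution of A′PA − P = −I when A has spectral radius less than one), and checks that V decreases along a simulated trajectory.

```python
# Discrete-time counterpart: for x(k+1) = A x(k) and V(x) = x'Px, one has
# V(x(k+1)) - V(x(k)) = x(k)'(A'PA - P)x(k), so the definiteness of
# A'PA - P is what matters. A below is a hypothetical stable example
# (eigenvalues 0.5 and 0.3, inside the unit circle).
import numpy as np

A = np.array([[0.5, 0.4],
              [0.0, 0.3]])
n = A.shape[0]

# Build P from the series sum_k (A')^k A^k, which satisfies A'PA - P = -I
# whenever the spectral radius of A is less than one.
P = np.zeros((n, n))
M = np.eye(n)                      # holds A^k
for _ in range(200):               # truncation error is negligible here
    P += M.T @ M
    M = A @ M

assert np.allclose(A.T @ P @ A - P, -np.eye(n))
assert np.all(np.linalg.eigvalsh(P) > 0)       # P is positive definite

# V(x(k)) = x'Px decreases strictly along any nonzero trajectory
x = np.array([1.0, -1.0])
v_prev = x @ P @ x
for _ in range(20):
    x = A @ x
    v = x @ P @ x
    assert v < v_prev
    v_prev = v
print(v_prev)                      # V has decayed to (nearly) zero
```

If SciPy is available, `scipy.linalg.solve_discrete_lyapunov(A.T, np.eye(n))` should give the same P.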
MIT OpenCourseWare
http://ocw.mit.edu

6.241J / 16.338J Dynamic Systems and Control
Spring 2011

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.