Friday, March 18, 2011

Why eigenvalue analysis?



The eigenvalues of the Jacobian matrix are used in stability analysis. So here we encounter three concepts: "eigenvalue", "Jacobian" and "stability". I will start with "stability", since the other two are tools for reaching the final goal: answering whether the system is stable or not at some operating point.

(1) Stability
Stability analysis answers whether or not the system is stable at some operating point: if the system is stable, then as time goes to infinity it settles into an equilibrium, i.e., the system state asymptotically approaches a fixed point.

There are stable and unstable equilibria. Imagine a ball at the bottom of a bowl, or balanced on top of an upturned bowl, and ask: can a small disturbance at this point stir up something big? If it cannot, the system simply returns to the equilibrium (the bottom of the bowl). If, on the contrary, the disturbance keeps growing with no tendency to stop, eventually driving the system to another state, then it is an unstable equilibrium (the top of the upturned bowl).

How to determine the stability of an equilibrium?

  • Step 1: Find the equilibrium point: set x(t+1)=x(t) for a discrete map, or dx/dt=0 for a continuous system!
  • Step 2: Linearize around the equilibrium a: f(x) ≈ f(a) + f'(a)(x-a), or, with several variables, form the Jacobian matrix
  • Step 3: Check the value of f'(a), or the eigenvalues of the Jacobian matrix
  • Step 4: Final judgement: for a discrete map, |f'(a)|>1 means unstable and |f'(a)|<1 means stable; for a continuous system, any Jacobian eigenvalue with a positive real part means unstable, while all eigenvalues having negative real parts means stable (a small numerical sketch of these steps follows this list)
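Here is a minimal Python sketch of the recipe for a one-dimensional discrete map; the map f(x) = x*x and its fixed points are a made-up example, not one from this post.

# Steps 1-4 for the invented discrete map x_{t+1} = f(x_t), f(x) = x**2.

def f(x):
    return x * x        # the map itself

def f_prime(x):
    return 2 * x        # its derivative: the slope of the linearization

# Step 1: equilibria solve f(x) = x; x**2 = x gives x = 0 and x = 1.
fixed_points = [0.0, 1.0]

# Steps 2-4: evaluate the slope at each equilibrium, compare |f'| with 1.
for a in fixed_points:
    slope = abs(f_prime(a))
    verdict = "stable" if slope < 1 else "unstable"
    print(f"x* = {a}: |f'(x*)| = {slope} -> {verdict}")

# Prints: x* = 0.0 -> stable (|f'| = 0), x* = 1.0 -> unstable (|f'| = 2).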
Now I will give a simple example.
For a dynamic system
tex: x_{t+1}=f(x_t)

let us explain why an equilibrium point c with f'(c)>1 is unstable, while one with f'(c)<1 is stable (the figure below shows the case where f is increasing).
Look at the figure below:


There are two equilibria, a and b; a is stable and b is unstable. Why? If we push a a little to the right, to the blue x_t, we get x_{t+1} < x_t, meaning the state moves back (in the other direction):

a-------------------------------------------------> x_t
                                 x_{t+1} <---------------------

But if we push b a little to the right, to the red x_t, then at the next time step x_{t+1} > x_t, even larger!!! The state will eventually blow up as t goes to infinity.

b----------------------------------------------> x_t
                                                             -------------------------> x_{t+1}

Note that f'(a)<1 and f'(b)>1 (here g(x)=x denotes the diagonal line, so g'(x)=1).
OK. We have shown it graphically; now let us prove it mathematically.
At an equilibrium, x_{t+1}=x_t, meaning f(x_t)=x_t=g(x_t). So the intersection points of the two curves are the equilibria:

f(a)=a=g(a);
f(b)=b=g(b);

We move a to x_t (x_t > a); then we need to prove x_{t+1} < x_t. Unwinding:

tex: x_{t+1}<x_t \iff f(x_t)<x_t \iff f(x_t)<g(x_t) \iff f(x_t)-f(a)<g(x_t)-g(a) \iff \frac{f(x_t)-f(a)}{x_t-a}<\frac{g(x_t)-g(a)}{x_t-a}

The division in the last step preserves the inequality because x_t - a > 0, and as x_t approaches a the two difference quotients tend to f'(a) and g'(a).

We know that g'(a)=1, so if f'(a)<1, every step of the chain holds when read from right to left (for x_t close enough to a), and hence x_{t+1}<x_t: the state moves back. Stable!!

The same argument applied at b, where f'(b)>1, shows the deviation keeps growing, so b is unstable.

So the final time-domain plots show trajectories settling onto a and running away from b, as simulated below:
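As a stand-in for the plots, here is a small simulation reusing the made-up map f(x) = x*x from the sketch above, whose fixed points 0 and 1 play the roles of a and b:

def f(x):
    return x * x        # invented example map with fixed points 0 and 1

# Start slightly off each fixed point and iterate x_{t+1} = f(x_t).
for x0 in (0.1, 1.05):
    x, trajectory = x0, [x0]
    for _ in range(6):
        x = f(x)
        trajectory.append(x)
    print(f"x0 = {x0}: " + ", ".join(f"{v:.4f}" for v in trajectory))

# From 0.1 the iterates shrink toward the stable point 0;
# from 1.05 they grow without bound, fleeing the unstable point 1.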



(2) Multi-variable systems, eigenvalues and the Jacobian matrix
Now let us consider a system with multiple state variables. We need to find the equilibria first, by setting the time derivatives to zero:

tex: \dot{x}_1=f_1(x_1,x_2)=0,\quad \dot{x}_2=f_2(x_1,x_2)=0

that is, f1(x1,x2)=0 and f2(x1,x2)=0 (these can be linear or non-linear); solving them gives the equilibria.
We then need to find out whether these equilibria are stable.


Our earlier recipe then says: compute the Jacobian matrix at the specified equilibrium. If all of its eigenvalues have negative real parts, the equilibrium is stable!

Questions:
(1) What is the Jacobian matrix?
Answer: the Jacobian matrix is the matrix of first-order derivatives of the functions with respect to the variables; it corresponds to the slope in the single-variable case.
In this case the Jacobian matrix is as follows (a denotes the equilibrium point):

tex: J\big|_a=\begin{bmatrix}\partial f_1/\partial x_1 & \partial f_1/\partial x_2 \\ \partial f_2/\partial x_1 & \partial f_2/\partial x_2\end{bmatrix}\Bigg|_a
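As a hedged illustration of this definition, the Jacobian can be built symbolically with SymPy; the two-variable system f1 = x2, f2 = -x1 - x2 below is invented for the example.

import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = sp.Matrix([x2, -x1 - x2])      # invented system: [f1, f2]
J = f.jacobian([x1, x2])           # matrix of first-order partial derivatives
print(J)                           # Matrix([[0, 1], [-1, -1]])

# Evaluate at the equilibrium a (here the origin, where f1 = f2 = 0):
print(J.subs({x1: 0, x2: 0}))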


(2) What is an eigenvalue?
When J multiplies a vector v, J can be viewed as a linear transformation of v: it stretches and rotates v. If v is only stretched or shrunk, without changing direction, then v is an eigenvector of J, meaning it is "owned" by J. The stretching or shrinking factor is then called the eigenvalue! This is expressed as

tex:Jv=\lambda v

Eigenvalues can be found by solving

tex: \det(J-\lambda I)=0
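Continuing the invented two-variable example from above, NumPy solves this characteristic equation numerically:

import numpy as np

J = np.array([[ 0.0,  1.0],
              [-1.0, -1.0]])            # Jacobian at the equilibrium

eigenvalues = np.linalg.eigvals(J)      # roots of det(J - lambda*I) = 0
print(eigenvalues)                      # approx. -0.5 + 0.866j, -0.5 - 0.866j

# Both real parts are negative, so this equilibrium is judged stable.
print(all(ev.real < 0 for ev in eigenvalues))   # True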

OK. Now that we have cleared up these two important concepts, we ask: why does the eigenvalue, and especially its real part, matter?

If we are at an arbitrary point (system state) whose evolution is described by the differential equation tex: \dot{x}=f(x), how do we find the solution for each state? To put it another way: what is x at time t?

To solve differential equations, we use the Laplace transform. But what is this transform about?

The Laplace transform is an extension of the Fourier transform. Fourier says that (under suitable conditions) a function can be decomposed into a linear combination of sinusoids with different magnitudes and frequencies. He then applied Euler's formula and established his famous transform:

Euler's formula: tex: e^{ix}=\cos x + i\sin x

Fourier's transform:

tex: F(\omega)=\int_{-\infty}^{\infty} f(t)\,e^{-j\omega t}\,dt

The Fourier transform has a clear physical meaning, but it runs into trouble for functions that are not integrable. Laplace therefore extended the transform by adding a damping factor e^{-\sigma t}, so that the transformed function always decays with time. This became the famous Laplace transform:

tex: F(s)=\int_{0}^{\infty} f(t)\,e^{-st}\,dt

Note that s=\sigma+j\omega is a complex number.
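As a quick sanity check of the formula (a SymPy sketch; the symbol lam stands in for an eigenvalue λ): the Laplace transform of e^{λt} is 1/(s - λ), a pole at s = λ, which is why the real part of λ decides decay versus growth.

import sympy as sp

t, s = sp.symbols('t s', positive=True)
lam = sp.symbols('lam')

# noconds=True drops the region-of-convergence conditions from the result.
F = sp.laplace_transform(sp.exp(lam * t), t, s, noconds=True)
print(F)    # 1/(s - lam)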

If we solve the system's differential equations (after linearizing around an equilibrium), we usually get solutions of the form

tex: x(t)=\sum_i c_i\,e^{\lambda_i t}\,v_i

where \lambda_i are the eigenvalues of the Jacobian matrix and v_i the corresponding eigenvectors. If every eigenvalue has a negative real part, x(t) decays to the equilibrium point: stable! Conversely, if any eigenvalue has a positive real part, x(t) will explode as time goes to infinity!!!!
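Finally, a sketch tying everything together: integrating the linearized system ẋ = Jx with the same invented Jacobian shows the deviation decaying, exactly as its negative-real-part eigenvalues predict.

import numpy as np
from scipy.integrate import solve_ivp

J = np.array([[ 0.0,  1.0],
              [-1.0, -1.0]])        # invented Jacobian, eigenvalues -0.5 +/- 0.866j

# Integrate xdot = J x starting from a deviation away from the equilibrium.
sol = solve_ivp(lambda t, x: J @ x, (0.0, 20.0), [1.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 5))
print(sol.y[:, -1])   # both components near 0: the deviation has died out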
