Thursday, 14 May 2020

Eigenvalues and Eigenvectors to Principal Axes

As part of BITS WILP MFDS Session 3, I learned about eigenvalues and eigenvectors.

My takeaways from the session are as follows.
1. A few jargon terms
2. How to calculate eigenvalues and eigenvectors - the NAIVE METHOD
3. Applications of eigenvalues and eigenvectors
4. Non-intuitive but important theorems
5. How to search for eigenvalues in large matrices with the GERSCHGORIN THEOREM
6. How to identify the defect, i.e., linearly dependent eigenvectors
7. How to identify a basis from eigenvectors
8. How to diagonalize a matrix with the similarity theorem and eigenvectors as basis
9. How to get a power of matrix A with the diagonalized matrix
10. Orthogonal, orthonormal and normalization
11. How to determine the principal axes (canonical form): if A is symmetric and we take the quadratic form with eigenvectors as the basis, those eigenvectors will be orthonormal (orthogonal unit vectors)

A Few Jargon Terms

1. Eigenvalue - a scaling factor.
2. Eigenvector - a vector which does not change its direction (does not rotate) even though the matrix transforms the space around it.
"Eigen" is a German word; in English it means "its own".
By eigenvalue we mean its own value, and by eigenvector we mean its own vector.
Consider the normal faces and eigenfaces below. Eigenfaces capture the primary characteristics of our faces, which mostly do not change even when we twist and turn our face.

[Image: normal faces vs. eigenfaces]

3. Spectrum & spectral radius
Spectrum - the set of all eigenvalues of a matrix.
Spectral radius - the largest absolute value (modulus) among the eigenvalues in the spectrum.
4. Characteristic matrix - (A − λI)
5. Characteristic determinant - the determinant of the characteristic matrix
6. Characteristic equation - the characteristic determinant equated to 0
7. Characteristic polynomial - the polynomial obtained by expanding the characteristic determinant
The characteristic matrix is obtained as follows:
1. Say Ax = λx; then λ is a scalar (real or complex) which just scales x without changing its direction.
2. If so, we can also write (A − λI)x = 0.
8. Eigenspace - the space formed by the eigenvectors corresponding to a given eigenvalue (more on this below).

9. Algebraic multiplicity
10. Geometric multiplicity
The order Mλ of an eigenvalue λ as a root of the characteristic polynomial is called the algebraic multiplicity of λ.
The number mλ of linearly independent eigenvectors corresponding to λ is called the geometric multiplicity of λ. Hence mλ is also the dimension of the eigenspace.
11. Similar matrices - matrices related by a change of basis, B = P⁻¹AP; they represent the same transformation in different bases and therefore have the same eigenvalues.


How to get Eigenvalues and Eigenvectors?

Ax = λx
or
(A − λI)x = 0, which is a homogeneous system of equations. From it we can form a polynomial in λ and solve for the values of λ. This polynomial is called the "characteristic polynomial".

The determinant of (A − λI) is called the characteristic determinant.
The characteristic determinant equated to 0 gives the characteristic equation.
Expanding the characteristic equation, we get the characteristic polynomial, whose roots are our values of λ.

Substituting any one value of λ into the homogeneous system and solving it by Gaussian elimination or other means, we get the corresponding eigenvectors.
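
A minimal numerical sketch of this naive method, assuming NumPy and a made-up 2x2 matrix (not one from the session):

```python
import numpy as np

# Hypothetical 2x2 example matrix
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For a 2x2, the characteristic polynomial is lambda^2 - trace(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
lambdas = np.roots(coeffs)          # roots of the characteristic polynomial
print("eigenvalues:", lambdas)      # expect 5 and 2

# For each lambda, solve the homogeneous system (A - lambda*I)x = 0.
# The null space can be read off the SVD: the right-singular vector for
# the (near-)zero singular value spans it.
for lam in lambdas:
    M = A - lam * np.eye(2)
    _, s, Vt = np.linalg.svd(M)
    x = Vt[-1]                      # null-space vector (smallest singular value)
    print(f"lambda = {lam:.1f}, eigenvector = {x}, residual = {A @ x - lam * x}")

# The library routine does all of this in one call
print(np.linalg.eig(A))
```

np.linalg.eig would normally be used directly; the manual route above just mirrors the characteristic-polynomial derivation.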


Where can these eigenvalues and eigenvectors be applied?
1. To solve linear differential equations.
2. To describe natural frequencies of vibration.
3. To distinguish states of energy.

These were only stated in class, but I feel they are important for understanding the unchanging, natural states of a system - the steady behaviour of differential equations, the natural frequencies, and the natural states of energy.

Non-intuitive but very important theorems.
1. Eigenvectors cannot be trivial; that is, an eigenvector cannot be the zero vector.
2. A matrix is singular if and only if it has 0 as an eigenvalue.
3. The sum of the eigenvalues of a matrix = the trace of the matrix (the sum of its diagonal elements).
4. The product of the eigenvalues of a matrix = the determinant of the matrix.
5. The diagonal entries of an upper or lower triangular matrix are its eigenvalues.
6. If A has λ as an eigenvalue, then A⁻¹ has 1/λ as an eigenvalue.
7. A and the transpose of A have the same eigenvalues (transposing does not change the characteristic determinant).
8. The eigenvalues of a real symmetric matrix are real. The eigenvalues of an orthogonal matrix have modulus 1 (the real ones are +1 or -1).
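
A few of these (theorems 3, 4, 6 and 7) can be sanity-checked numerically; a small sketch with NumPy and an arbitrary example matrix:

```python
import numpy as np

# Made-up invertible matrix to check theorems 3, 4, 6 and 7
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals = np.linalg.eigvals(A)

print(np.isclose(vals.sum(), np.trace(A)))           # sum of eigenvalues = trace
print(np.isclose(vals.prod(), np.linalg.det(A)))     # product of eigenvalues = determinant
print(np.allclose(np.sort(np.linalg.eigvals(A.T)),
                  np.sort(vals)))                    # A and A^T share eigenvalues
print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))),
                  np.sort(1.0 / vals)))              # A^-1 has eigenvalues 1/lambda
```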


Eigenspace
If w and x are eigenvectors of a matrix A corresponding to the same eigenvalue λ,
then w + x (provided x ≠ −w) and kx for any k ≠ 0 are also eigenvectors. All of these eigenvectors, together with the zero vector, form a space called the eigenspace.

That is, an eigenvalue can be repeated, and it has as many independent eigenvectors as the matrix A allows.

What is the impact of a repeated eigenvalue having a full set of linearly independent eigenvectors, and what is the impact of it not having them? The notion of the defect, below, answers this.


How to search for Eigenvalues in large matrices with the GERSCHGORIN THEOREM?

If we draw a disc for each row, with its centre at the diagonal element and its radius equal to the sum of the absolute values of that row's elements other than the diagonal element,

then:
1. The eigenvalues are confined to the union of all the discs.
2. Every eigenvalue must lie within at least one of the discs.
3. An individual disc can be without any eigenvalue.

This is based on the inequality
|λ − aii| ≤ Σ |aij| (sum over j ≠ i), for at least one i = 1, 2, ..., n
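
A sketch of computing the discs, assuming NumPy; the matrix is a made-up example and gerschgorin_discs is a hypothetical helper name:

```python
import numpy as np

def gerschgorin_discs(A):
    """Return (center, radius) for each row's Gerschgorin disc."""
    A = np.asarray(A)
    centers = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(centers)  # off-diagonal row sums
    return list(zip(centers, radii))

# Made-up example matrix
A = np.array([[10.0, 1.0, 0.5],
              [ 0.2, 5.0, 0.3],
              [ 1.0, 0.4, -4.0]])

for c, r in gerschgorin_discs(A):
    print(f"disc: center {c:+.1f}, radius {r:.2f}")
print("eigenvalues:", np.linalg.eigvals(A))  # each lies inside the union of the discs
```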

How to identify "the defect" or linear dependent?

An nth-degree characteristic polynomial has n roots counted with multiplicity, so to find a defect we have to compare the algebraic multiplicity and the geometric multiplicity.

Sum of the algebraic multiplicities = degree of the characteristic polynomial.
Mλ - algebraic multiplicity
mλ - geometric multiplicity

In general, mλ ≤ Mλ.

Δλ = Mλ − mλ is called the defect.

The geometric multiplicity comes from the rank: mλ = n − rank(A − λI). If the rank drops enough that mλ = Mλ, there is no defect; if mλ < Mλ, the eigenvalue is defective.

If there is no defect, mλ = Mλ for every eigenvalue.

λ appears along the diagonal of (A − λI), hence the characteristic polynomial has degree n, which is why the algebraic multiplicities sum to n.

The rank, i.e., the number of linearly independent row vectors of (A − λI), determines the geometric multiplicity.
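
A minimal sketch of computing the defect numerically, assuming NumPy and the classic Jordan-block example (not from the session):

```python
import numpy as np

# Jordan block: lambda = 2 is a double root of the characteristic
# polynomial (lambda - 2)^2, so M_lambda = 2, but (A - 2I) has rank 1,
# giving m_lambda = n - rank(A - 2I) = 1. The defect is therefore 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
n = A.shape[0]

M_alg = 2                                       # read off the characteristic polynomial
rank = np.linalg.matrix_rank(A - lam * np.eye(n))
m_geo = n - rank                                # geometric multiplicity = dim of eigenspace
print("defect =", M_alg - m_geo)                # prints 1: only one independent eigenvector
```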


How to identify a basis from Eigenvectors?

Eigenvectors can act as a basis if an n×n matrix has n linearly independent eigenvectors. Since eigenvectors corresponding to distinct eigenvalues are linearly independent, n distinct eigenvalues guarantee such a basis, and it makes the largest span.

How to Diagonalize a matrix with the similarity theorem and eigenvectors as basis?

Similar matrices have the same eigenvalues; their eigenvectors differ, but are related through the change of basis.

If y = P⁻¹x (x and y are vectors, and P is an n×n non-singular square matrix),

then Â is a similar matrix of A if Â = P⁻¹AP.


In order to get the diagonal matrix, we use the similarity transform of similar matrices:
D = X⁻¹AX

Here X is obtained from the eigenvectors of A forming the basis: its columns are the eigenvectors, and it is a non-singular square matrix.

D is a diagonal matrix which is similar to A; its diagonal entries are the eigenvalues of A.
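
A minimal sketch of the diagonalization, assuming NumPy and a made-up matrix with distinct eigenvalues:

```python
import numpy as np

# Made-up example with distinct eigenvalues, so the eigenvectors
# form a basis and A is diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, X = np.linalg.eig(A)      # columns of X are eigenvectors of A

D = np.linalg.inv(X) @ A @ X    # similarity transform D = X^-1 A X
print(np.round(D, 10))          # diagonal, with the eigenvalues on the diagonal
```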


How to get a Power of matrix A with the Diagonalized Matrix?

D^m = X⁻¹ A^m X

so A^m = X D^m X⁻¹
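
The same sketch extended to powers; np.linalg.matrix_power serves as the cross-check:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, X = np.linalg.eig(A)
m = 5

# A^m = X D^m X^-1: raising a diagonal matrix to a power just raises
# each diagonal entry (eigenvalue) to that power.
A_m = X @ np.diag(vals ** m) @ np.linalg.inv(X)
print(np.allclose(A_m, np.linalg.matrix_power(A, m)))   # True
```

This is the practical payoff of diagonalization: one eigendecomposition replaces m matrix multiplications.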


Orthogonal, Orthonormal and Normalization.

Two vectors are orthogonal if they are perpendicular to each other, i.e., their dot product is zero.

Normalization scales a vector down by its magnitude, i.e., the norm of the vector, to obtain a unit vector.

Orthonormal - vectors which are both orthogonal and normalized are called orthonormal.
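
A tiny NumPy illustration with two made-up vectors:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([-4.0, 3.0])

print(np.dot(u, v))              # 0.0 -> u and v are orthogonal
u_hat = u / np.linalg.norm(u)    # normalization: divide by the norm
v_hat = v / np.linalg.norm(v)
print(np.linalg.norm(u_hat))     # 1.0 -> u_hat is a unit vector
# u_hat and v_hat together form an orthonormal pair
```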


How to determine the principal axes if A is symmetric? Taking the quadratic form with the eigenvectors as basis, the eigenvectors will be orthonormal (orthogonal unit vectors).

If A is a symmetric matrix,

Q = x^T A x is the quadratic form, where x is any vector; it can be rewritten in terms of the eigenvectors.

We know D = X⁻¹AX, where X is the matrix formed from the basis of eigenvectors,

and A = X D X⁻¹.

For symmetric A the eigenvector matrix is orthogonal, which means X⁻¹ = X^T,

so A = X D X^T,

and Q = x^T X D X^T x = y^T D y, where y = X^T x. In the y coordinates the cross terms vanish and Q reduces to a sum of λᵢ·yᵢ² terms.

The quadratic form in canonical (principal-axis) form tells us the directions of the principal axes. This is very important for dimensionality reduction.
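
A sketch of the principal-axis transformation, assuming NumPy and a made-up symmetric matrix; np.linalg.eigh is used because it returns an orthonormal eigenvector matrix for symmetric input:

```python
import numpy as np

# Made-up symmetric matrix defining the quadratic form Q = x^T A x
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# For symmetric A, eigh returns real eigenvalues and an orthonormal
# eigenvector matrix X, so X^-1 = X^T.
vals, X = np.linalg.eigh(A)
print(np.allclose(np.linalg.inv(X), X.T))   # True: X is orthogonal

x = np.array([1.0, 2.0])
y = X.T @ x                                 # coordinates along the principal axes
Q_original = x @ A @ x
Q_canonical = vals @ (y ** 2)               # sum of lambda_i * y_i^2, no cross terms
print(np.isclose(Q_original, Q_canonical))  # True
```

The columns of X are the principal axes; in their coordinates the quadratic form has no cross terms, which is exactly what dimensionality-reduction methods like PCA exploit.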

