
Sunday, 31 May 2020

Special Vector Spaces in Linear Algebra

Every vector space has a basis: a set of linearly independent vectors that spans the whole space.

1. Vector Space.
2. Sub Space.

Matrix Spaces
1. Column Space (RREF)
2. Row Space (REF)
3. Null or Solution Space (RREF, with and without free variables)

Eigen Space with Eigen Vectors.
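To make the eigenspace idea concrete, here is a minimal sketch (pure Python; the helper name `eig2x2` is my own) that finds the eigenvalues of a 2x2 matrix from its characteristic polynomial; the eigenvectors for each eigenvalue then span the corresponding eigenspace.

```python
import math

def eig2x2(A):
    """Eigenvalues of a 2x2 matrix via lambda^2 - trace*lambda + det = 0.
    Assumes real eigenvalues (discriminant >= 0)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return ((tr + disc) / 2, (tr - disc) / 2)

# symmetric example: eigenvalues are 3 and 1
print(eig2x2([[2, 1], [1, 2]]))  # (3.0, 1.0)
```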

Inner Product Space - A vector space equipped with an inner product is called an Inner Product Space. (Here the set of scalars is the set of real numbers.)

Usefulness: Gram-Schmidt orthogonalization and QR decomposition.
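As a small illustration of how the inner product drives Gram-Schmidt, here is a minimal pure-Python sketch (helper names are my own, assuming the standard dot product on Rⁿ):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def scale(c, v):
    return [c * x for x in v]

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def gram_schmidt(vectors):
    """Return an orthogonal basis spanning the same subspace."""
    basis = []
    for v in vectors:
        w = v
        for b in basis:
            # subtract the projection of v onto each earlier basis vector
            w = sub(w, scale(dot(w, b) / dot(b, b), b))
        if any(abs(x) > 1e-12 for x in w):  # drop dependent vectors
            basis.append(w)
    return basis

q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
# the resulting basis vectors are mutually orthogonal
print(dot(q[0], q[1]))  # 0.0
```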

Wednesday, 27 May 2020

Introduction to Special Matrices

My journey with Special Matrices started with the Transpose operation; you can read about it in this post, where you will already find a few interesting ones.

There are many special (named) matrices, but we shall look at only a few of them.

In the previous blog, we saw that a Unitary Matrix U has the property U*U = UU* = I, I being the Identity.
We also saw that an Orthogonal Matrix has the property AᵀA = AAᵀ = I, I being the Identity.


I found that some matrices are called Normal Matrices: A*A = AA*, but the product need not equal I (the identity matrix). Among complex matrices, all unitary, Hermitian, and skew-Hermitian matrices are normal. Likewise, among real matrices, all orthogonal (AᵀA = AAᵀ = I), symmetric (Aᵀ = A), and skew-symmetric (Aᵀ = -A) matrices are normal. However, it is not the case that every normal matrix is unitary or (skew-)Hermitian.
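To see the normality condition in action for real matrices (where A* is just Aᵀ), here is a small pure-Python check; the helper names are mine:

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_normal(A):
    """A real matrix is normal when A^T A == A A^T."""
    At = transpose(A)
    return matmul(At, A) == matmul(A, At)

sym = [[2, 1], [1, 3]]        # symmetric, hence normal
generic = [[1, 2], [3, 4]]    # a generic matrix need not be normal
print(is_normal(sym), is_normal(generic))  # True False
```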

I found the Symplectic matrix interesting, as it relates the Transpose to a Skew-Symmetric Matrix. It is a dual combo.

A Symplectic matrix is a 2n × 2n matrix M with real entries that satisfies the condition

MᵀΩM = Ω,

where Mᵀ denotes the transpose of M and Ω is a fixed 2n × 2n nonsingular skew-symmetric matrix.
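For the smallest case n = 1, the condition is easy to verify directly; the sketch below (pure Python, helper names mine) uses Ω = [[0, 1], [-1, 0]] and the fact that any 2x2 matrix with determinant 1 is symplectic:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# the fixed nonsingular skew-symmetric matrix for n = 1
OMEGA = [[0, 1], [-1, 0]]

def is_symplectic(M):
    """Check the defining condition M^T * Omega * M == Omega."""
    return matmul(matmul(transpose(M), OMEGA), M) == OMEGA

M = [[2, 3], [1, 2]]   # determinant = 1, so symplectic for n = 1
print(is_symplectic(M))  # True
```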

The picture below, from Wikipedia, shows a few named matrices and their relations. The Wikipedia page lists almost all the known named matrices, even where no relationship has been identified. I have not gone through the other interesting ones; maybe they are for another blog.

Transpose Properties and Special Matrices

Transpose is a very simple operation that turns rows into columns and columns into rows. I had never given this operation any importance until I found the relationship between the Transpose operation and Special Matrices. For example, we prefer the transpose of a matrix A over the inverse of A when we find the matrix is orthogonal.

The Orthogonal Matrix is a special matrix, and I thought it was the only such case. The real surprise came when dealing with the Real Skew-Symmetric Matrix.

In this blog I would like to capture the properties of the Transpose operation and its relation to Special Matrices.

Properties of Transpose Operation: 
Let us see the properties

1. Transpose of Transpose (involution)
(Aᵀ)ᵀ = A

2. Transpose with Additivity
(A + B)ᵀ = Aᵀ + Bᵀ

3. Transpose with Multiplicativity
(AB)ᵀ = BᵀAᵀ

4. Transpose with Scalar Multiplication
(cA)ᵀ = cAᵀ

5. Transpose with Determinant
det(Aᵀ) = det(A)

6. Same Eigenvalues and Characteristic Polynomial:
A and Aᵀ share the same eigenvalues, since their characteristic polynomials are the same.

7. Dot Product with Transpose
a · b = aᵀb

8. Transpose commutes with Inverse
(Aᵀ)⁻¹ = (A⁻¹)ᵀ

9. If A has all real entries, then AᵀA is a positive-semidefinite matrix. This is useful for defect and condition-number calculations to reduce error.

10. (AAᵀ)ᵀ = (Aᵀ)ᵀAᵀ = AAᵀ, i.e., AAᵀ is always symmetric.
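The properties above can be spot-checked numerically; here is a minimal pure-Python sketch (helper names are mine) verifying properties 1, 2, 3, and 10 on small matrices:

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# 1. (A^T)^T = A
assert transpose(transpose(A)) == A
# 2. (A + B)^T = A^T + B^T
assert transpose(add(A, B)) == add(transpose(A), transpose(B))
# 3. (AB)^T = B^T A^T  (note the reversed order)
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
# 10. A A^T is symmetric
S = matmul(A, transpose(A))
assert transpose(S) == S
print("transpose properties verified")
```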
 

We have seen 10 properties of the Transpose operation. Now let us see what it has to do with Special Matrices. We shall start from simply special and move to very special, and see what special property arises with each.

If A is a Symmetric Matrix:
Aᵀ = A
Only if Aᵀ = A do we call a matrix Symmetric.


If A is an Orthogonal Matrix: we call a matrix A orthogonal if AᵀA = AAᵀ = I, i.e.
Aᵀ = A⁻¹
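This is why the transpose is preferred over the inverse for orthogonal matrices; a quick sketch with a 2x2 rotation matrix (pure Python, helpers mine):

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

t = math.pi / 6
R = [[math.cos(t), -math.sin(t)],   # rotation matrices are orthogonal
     [math.sin(t),  math.cos(t)]]

# R^T R should be the identity (up to floating-point rounding),
# so R^T acts as the inverse without any inversion algorithm
I = matmul(transpose(R), R)
assert all(abs(I[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
print("transpose of an orthogonal matrix is its inverse")
```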


If A is a Skew-Symmetric Matrix:
Aᵀ = -A
Only if Aᵀ = -A do we call a matrix Skew-Symmetric.


If A is a Hermitian Matrix: we call a matrix H Hermitian if H equals its conjugate transpose (H = H*). In terms of the transpose:
Aᵀ = Ā
i.e., when we take the transpose of a Hermitian matrix, we get the conjugate of A.


If A is a Skew-Hermitian Matrix:
Aᵀ = -Ā
This is a combination of both the Skew-Symmetric and the Hermitian conditions.

If A is a Unitary Matrix: we call a matrix U unitary if U⁻¹ = U*, or U*U = UU* = I. Here * denotes the conjugate transpose. In terms of the transpose:
Aᵀ = conj(A⁻¹)

For a unitary matrix U, the determinant has absolute value 1 (|det U| = 1). Hence it is called a Unitary Matrix.
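A quick numeric check of |det U| = 1, using a simple diagonal unitary matrix of phase factors (my own example):

```python
import cmath

# a 2x2 unitary matrix: a diagonal of unit-modulus phase factors
U = [[cmath.exp(1j * 0.7), 0],
     [0, cmath.exp(-1j * 1.3)]]

# determinant of a 2x2 matrix
det = U[0][0] * U[1][1] - U[0][1] * U[1][0]
# the determinant itself is complex, but its modulus is 1
print(abs(det))
```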

In the above cases we can see how various complex operations can be replaced with the Transpose operation. It is very interesting to know these applications of the transpose :)

You can also read more about Special Matrices in my other post.


Friday, 8 May 2020

Linear Algebra, Real world modeling and Vice Versa

I started this blog with a question on the discussion forum of my M.Tech class:

Could anyone help me understand how Linear Algebra relates to the real world? Or, vice versa, when I see a real-world problem, how do I relate it to Linear Algebra?


Below are your takeaways:
1. The importance of linear systems in modelling nonlinear systems
2. How to see the real world through the lens of linear algebra and mathematics
3. A little bit of my own experience

There are numerous applications for Linear Algebra.

Here are the top 3 links I got from a Google search for "Applications of linear algebra":

Analytics Vidya - a wonderful platform for learning data science; I have attended one free workshop conducted by them.
UC Davis - looks to be a wonderful university.
Jeremy Kun - a real math and programming geek; I follow him on Twitter and also read his blog when time permits.

A search for "Real world applications of linear algebra" turned out to be horrific.

It is a pity that none of them address it with a human eye for the layman, so I would like to address it from my own experience.

I find mathematical modelling to be a profound application of linear algebra. And a DOT, I stop here; below are the justification and my experience, with examples from outside and from my own life.

Humanity has always tried to model nature and the real world. Consider a thrown stone tracing a parabolic curve, and the study of curvilinear motion; these findings are today helping us launch rockets. If a person throws a stone into a still pond, how can we model the water waves? We can consider modelling this with Bessel's equation. It is a second-order differential equation that models the vibration of a drum head, and it has found applications such as speaker and headphone design. Such a mathematical equation looks daunting; now consider a slight change in the real scenario, where there are a few frogs and fishes in the pond, or a change in the turbidity of the water, which could change the dynamics of the waves. Do you think that with these changes we could still use Bessel's equation as a model? No, we cannot: a slight change in the conditions makes the model wrong for the real scenario.

I happened to work with a PhD thesis paper for my B.Tech project, which dealt with the Nonlinear Schrödinger Equation (NLSE). I did not know then that I was dealing with an outcome of Linear Algebra and Numerical Analysis; I worked on a Predictor-Corrector method that models the NLSE. The NLSE is complex even for physicists: it models the motion of an electron as a wave around the nucleus. I was using it to model soliton waves (in simple terms, tsunami-like waves) which could travel long distances in fiber-optic cables without many repeaters to boost the signal (say, a trans-Atlantic transmission). Our connection loss over the internet would also be minimal, leading to higher bandwidth. The idea of the project was to study further; I was also able to reproduce atomic-bomb explosion wave patterns with that equation. To be honest, I was not a geek, nerd, or math wiz: I had the program, and after going through an entire book and a lot of scientific research papers about such waves, I made some initial changes and was able to reproduce the scenarios.

The NLSE is so complex that many people simplified the model to program it on a computer. It is a nonlinear equation; the simplification is a linear model and is mostly error-prone. The PhD thesis provided a way to improve this linear model with another linear model based on a Predictor-Corrector method.

Many a time we end up modelling reality with nonlinear models such as Bessel's equation and the NLSE. Nonlinear models may remain good and fit for analysis, but when we put them on computers they have limitations. So it becomes important to convert nonlinear models into linear models.

I pursued a B.Tech in Electronics and Communication Engineering, where I had an entire paper on Circuit Theory, modelling transistors as linear combinations of linear circuit parts like resistors, capacitors, and inductors, without which we would never be able to mathematically analyze what input voltage or current produces a given output voltage. Resistors, capacitors, and inductors are easy to analyze because they are linear components described by linear equations.

Consider modelling humour mathematically: this much pause while speaking, this much double meaning, or this many quirky things will make a joke. We collect a lot of data and try to derive a mathematical model that works. This is where linear algebra helps. Do you see it now? We treat the system as a black box with a mathematical model: we know only the inputs and the outputs, and we would like to know what the black box is. I hope you can now relate much better to Linear Algebra. Even though humour is hard to define in mathematical models, we find it somehow easy and simple to state it as a linear combination of the variables we have taken (pause, double meaning, and quirky things). Our model may predict some outcome; in reality it may or may not work. If it does not, we can add more parameters, try a few more experiments, and add more data.

I use the picture below for linear, super-linear, and sub-linear growth. Here both super-linear and sub-linear are nonlinear.





A typical linear equation is AX = B. Linear equations can be easily combined and factored; a linear equation is a polynomial equation of degree 1. Quadratic and cubic equations are examples of nonlinear equations: a quadratic has one turning point (a U-turn pattern), a cubic has up to 2, a quartic up to 3, and a quintic up to 4.
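For the linear case AX = B, a solution can be written down directly; here is a minimal 2x2 sketch using Cramer's rule (the helper `solve2x2` is my own):

```python
def solve2x2(A, rhs):
    """Solve the 2x2 linear system A x = rhs via Cramer's rule."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: no unique solution")
    x = (rhs[0] * d - b * rhs[1]) / det
    y = (a * rhs[1] - rhs[0] * c) / det
    return [x, y]

# 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
print(solve2x2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

No such closed-form recipe exists for general higher-degree (nonlinear) systems, which is exactly the contrast the paragraph above draws.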

From Galois theory we now know that there is no general formula (in radicals) to solve polynomial equations of degree five or higher. After this theory, Modern Algebra, or Abstract Algebra, took full swing. Today we speak about fields and vector spaces in a more abstract manner.

Please check out this link for some discussion on quintic equations.
https://math.stackexchange.com/questions/1635950/is-there-a-formula-for-the-roots-of-a-quintic-equation





We require solution values as outcomes when we build models; if we cannot figure out a formula or equation, we will not be able to extract a solution or make predictions. That is the problem with modelling with polynomials of higher degree.

Let us look around at our reality now and ask: how should I model it mathematically? Should I go for nonlinear, linear, or a combination of both?

One can also check out the difference between Linear Algebra and Numerical Analysis; they are closely related, and Numerical Analysis requires knowledge of Linear Algebra. If we want to program models on computers, we are constrained by a lot of limitations and are forced to use finite-precision "magic numbers" for constants like pi and e. That is where Numerical Analysis and approximation help.

Biography of George Dantzig

Do you think that by solving homework problems you can gain a PhD? Here is Professor George Dantzig: he mistook two famous unsolved problems for homework and solved them. His professor called him and informed him that he had solved two problems that had so far remained unsolved. Eventually he also earned his PhD with that work.

Why should I speak about Dantzig? Because he is also the person who developed the Simplex Algorithm and, along with others, pioneered the development of Linear Programming. Linear programming grew out of attempts to solve systems of linear inequalities, allowing one to optimize linear functions subject to constraints expressed as inequalities. It deals with optimization problems and the optimization of mathematical models.

Apart from Linear Programming, which deals with a subset of modelling and optimization, we have Constraint Satisfaction Problems (just like Sudoku puzzles), John von Neumann's and John Nash's Game Theory for modelling, and Prospect Theory for modelling realities.

Almost all other theories and methods are specializations whose applications demand an understanding of Linear Algebra. I hope you are now prepared to deal with it with greater enthusiasm.
