
Thursday, 9 June 2011

Proofs with Linearly Independent Vectors and Invertible Matrices


Your Problem
Suppose that $u_1, u_2, \dots, u_t$ are vectors in $C^n$ (vectors with $n$ complex entries) which are linearly independent.

(a) Also suppose that $M$ is an $n \times n$ matrix that is invertible. Show that $Mu_1, Mu_2, \dots, Mu_t$ are linearly independent vectors.

(b) This is not true when $M$ is not invertible. Find a counterexample, as follows: give an example of a $2 \times 2$ matrix $M$ and two independent vectors $u_1, u_2 \in C^2$ so that $Mu_1$ and $Mu_2$ are dependent.

My Solution (a)
The concept in part (a) is quite easy; it is just a little technical to write it down correctly.
The vectors $u_1, u_2, \dots, u_t$ are linearly independent if and only if the vector equation
$a_1 u_1 + a_2 u_2 + \dots + a_t u_t = 0$
has only the trivial solution, that is,
$a_1 = a_2 = \dots = a_t = 0$.

Now consider the equation

$a_1 (Mu_1) + a_2 (Mu_2) + \dots + a_t (Mu_t) = 0$                (1)

We need to show that the only solution of this is
$a_1 = a_2 = \dots = a_t = 0$.

This will show that the vectors $Mu_1, Mu_2, \dots, Mu_t$ are linearly independent vectors.

Firstly, it is true that $a_1 (Mu_1) = M(a_1 u_1)$, and similarly for every other term
(see appendix, Result 1).

Then equation (1) becomes

$M(a_1 u_1) + M(a_2 u_2) + \dots + M(a_t u_t) = 0$                          (2)

Then, factorizing (by Result 2 in the appendix, applied repeatedly), we have

$M\big( (a_1 u_1) + (a_2 u_2) + \dots + (a_t u_t) \big) = 0$                     (3)

Now, since $M$ is invertible, the inverse $M^{-1}$ exists, so we can multiply both sides of (3) by $M^{-1}$. The right-hand side is just zero again, and $M^{-1} M = I$ (the identity matrix):

$M^{-1} M \big( (a_1 u_1) + (a_2 u_2) + \dots + (a_t u_t) \big) = M^{-1} 0$

Therefore,

$(a_1 u_1) + (a_2 u_2) + \dots + (a_t u_t) = 0$

Dropping the brackets, as we do not need them any more (they are just grouping symbols),

$a_1 u_1 + a_2 u_2 + \dots + a_t u_t = 0$

But since the vectors $u_1, u_2, \dots, u_t$ are linearly independent by assumption, the only solution to this equation is

$a_1 = a_2 = \dots = a_t = 0$

So we are done, since we have shown that the only solution to equation (1) is the trivial one, and hence the vectors
$Mu_1, Mu_2, \dots, Mu_t$ are linearly independent vectors, as required.
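
This is not part of the proof, but here is a quick numerical sanity check of part (a), a minimal sketch in Python with NumPy using random test data of my own choosing. It uses the fact that a set of vectors is independent exactly when the matrix having them as columns has full column rank.

import numpy as np

rng = np.random.default_rng(0)
n, t = 4, 3

# Columns of U are u1, ..., ut; a random complex matrix has full
# column rank with probability 1, so these are independent.
U = rng.standard_normal((n, t)) + 1j * rng.standard_normal((n, t))
assert np.linalg.matrix_rank(U) == t

# A random complex n x n matrix M is almost surely invertible;
# we check via its determinant.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
assert abs(np.linalg.det(M)) > 1e-12

# Columns of M @ U are M u1, ..., M ut; by part (a) they stay independent.
assert np.linalg.matrix_rank(M @ U) == t
print("independence preserved")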
===================================

My Solution (b)  
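One possible counterexample (there are many valid choices; this one is simply a convenient pick): take

$M = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$,   $u_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$,   $u_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$.

Then $u_1, u_2$ are the standard basis vectors of $C^2$, so they are certainly linearly independent, but

$Mu_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$,   $Mu_2 = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$.

Since $Mu_2$ is the zero vector, the non-trivial relation $0 \cdot (Mu_1) + 1 \cdot (Mu_2) = 0$ shows that $Mu_1$ and $Mu_2$ are linearly dependent. (Note that $\det M = 0$, so $M$ is indeed not invertible.)

As a quick check with NumPy (again purely an illustration):

import numpy as np

M = np.array([[1, 0], [0, 0]], dtype=complex)
u1 = np.array([1, 0], dtype=complex)
u2 = np.array([0, 1], dtype=complex)

print(np.linalg.matrix_rank(np.column_stack([u1, u2])))          # 2: independent
print(np.linalg.matrix_rank(np.column_stack([M @ u1, M @ u2])))  # 1: dependent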



Appendix:

Result 1:
If $a$ is a number, $M = (m_{ij})$ is an $n \times n$ matrix and $u = (u_j)$ is an $n$-vector, then we show by the basic definitions of multiplication between scalars, matrices and vectors that
$a(Mu) = M(au)$                                               (*)

The $i$th entry of the left-hand side vector
$= a \left( \sum_{j=1}^{n} m_{ij} u_j \right)$
$= \sum_{j=1}^{n} a \, m_{ij} u_j$
$= \sum_{j=1}^{n} m_{ij} (a u_j)$

but this is the $i$th entry of the right-hand side, and hence we have proven (*).
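
Result 1 can also be sanity-checked numerically; a minimal NumPy sketch with random test values of my own choosing:

import numpy as np

rng = np.random.default_rng(1)
a = 2.0 + 1.5j                                              # a scalar
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)

print(np.allclose(a * (M @ u), M @ (a * u)))                # True: a(Mu) = M(au)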
========================================

Result 2                         

If $M = (m_{ij})$ is an $n \times n$ matrix and $u = (u_j)$, $v = (v_j)$ are $n$-vectors, then we show by the basic definitions of vector and matrix operations that
$M(u + v) = Mu + Mv$                                (**)

The $i$th entry of the left-hand side vector in (**)
$= \sum_{j=1}^{n} m_{ij} (u_j + v_j)$
$= \sum_{j=1}^{n} (m_{ij} u_j + m_{ij} v_j)$       (by the basic rules of multiplication of complex numbers)
$= \sum_{j=1}^{n} m_{ij} u_j + \sum_{j=1}^{n} m_{ij} v_j$
$=$ the $i$th entry of $Mu$ $+$ the $i$th entry of $Mv$
$=$ the $i$th entry of the right-hand side of (**)
and so we are done and (**) is proven.
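
And a similar numerical check for Result 2 (same caveat: just an illustration with random data):

import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)

print(np.allclose(M @ (u + v), M @ u + M @ v))              # True: M(u+v) = Mu + Mv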

==================================


