Linear Independence
Some Preliminary Programs
In[1]:=
We just saw in class that the columns of a matrix are linearly independent if and only if the rank(A)=m. In this case m = 9, but we know rank(A)≤6<9 in this example, and so we know the columns of this matrix are linearly dependent.
To find the dependencies we need to compute the kernel of this matrix. We do this by finding the reduced row echelon form of A. Since this is pretty big (and we already have Mathematica running), we'll just let it do the work. We start by inputting the matrix into Mathematica so we can mess around with it.
To find the kernel of A we should augment the matrix with a column of zeros. But this column of zeros isn't going to change when we perform elementary row operations, so we might as well just row reduce the original matrix.
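The notebook's actual Mathematica input isn't reproduced here, so as an illustration, here is a small Python sketch of the same idea: a Gauss-Jordan routine that computes the reduced row echelon form exactly, using rational arithmetic. The 3×4 matrix A below is a made-up example (not the 6×9 matrix from class), chosen small enough to check by hand.

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form over the rationals (Gauss-Jordan)."""
    M = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(M), len(M[0])
    pivot_row = 0
    pivots = []
    for col in range(ncols):
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, nrows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        # scale the pivot row so the pivot entry is 1
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]
        # clear this column in every other row
        for r in range(nrows):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivots.append(col)
        pivot_row += 1
        if pivot_row == nrows:
            break
    return M, pivots

# A 3x4 example: the columns must be dependent since rank(A) <= 3 < 4.
A = [[1, 2, 0, 1],
     [0, 1, 1, 1],
     [1, 3, 1, 2]]
R, pivots = rref(A)
print(pivots)  # [0, 1] -- the pivot columns; len(pivots) is rank(A)
```

Just as the text says, augmenting A with a column of zeros would leave that zero column untouched by every row operation, so row reducing A itself gives the same information about the kernel.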
We can now read off the linear relations among the columns of A because we know how to read off the vectors in the kernel of A. Notice that the rows of the reduced row echelon form tell us that a vector is in the kernel of A precisely when its entries satisfy the conditions:
This means a vector is in the kernel of A if and only if we can write
Hence the four vectors on the right-hand side of the expression span the kernel of A. For grins we can check that they are, in fact, in the kernel. We'll just show the second vector is in the kernel.
Great! Just like we knew, this vector is killed by A.
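Since the notebook's verification cell isn't shown, here is the same check sketched in Python: multiply the matrix by a candidate kernel vector and confirm the result is the zero vector. The matrix A and the vector v are the illustrative example from the rref sketch above (v is read off by setting the first free variable to 1 and the second to 0), not the vectors from class.

```python
def matvec(A, v):
    """Matrix-vector product using plain lists."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 0, 1],
     [0, 1, 1, 1],
     [1, 3, 1, 2]]
v = [2, -1, 1, 0]    # candidate kernel vector read off from rref(A)
print(matvec(A, v))  # [0, 0, 0] -> v really is killed by A
```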
Choose an n x m matrix B and determine if the columns of B are linearly independent.
List vectors which span the kernel of B.
Case 1: n < m
In this case we know that our matrix will have columns that are linearly dependent. Why? If B is the matrix we pick, then we know rank(B)≤n<m, and hence our result from class says the columns of B are linearly dependent. Let's go ahead and verify this by looking at a specific example.
I'll choose a random matrix B.
In[2]:=
Out[2]=
In[3]:=
Out[3]=
Notice that the last column corresponds to the only free variable in the system. Hence we see that a vector in the kernel of B takes the form
Case 2: n=m
In this case I can't necessarily say whether the columns are linearly independent without first choosing a matrix (some square matrices have columns that are independent; some don't). Let's pick out an example and compute.
In[4]:=
Out[4]=
To see whether the columns of the matrix are linearly independent, I will row reduce it. If the row reduced matrix is the identity, I know that the kernel is trivial and the columns are linearly independent. If the row reduced matrix isn't the identity, I'll be able to read off vectors which span the kernel of B.
Great! Since the rank of B is 5 (which equals the number of columns in B) I know my columns in B are linearly independent. This means that the only linear relation among the columns is the trivial relation.
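The notebook's 5×5 example isn't shown above, so here is the same test sketched in Python on a made-up 3×3 matrix: row reduce and check whether the result is the identity. The rref routine is the same exact-arithmetic sketch used earlier.

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form over the rationals (Gauss-Jordan)."""
    M = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(M), len(M[0])
    pivot_row = 0
    pivots = []
    for col in range(ncols):
        pr = next((r for r in range(pivot_row, nrows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]
        for r in range(nrows):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivots.append(col)
        pivot_row += 1
        if pivot_row == nrows:
            break
    return M, pivots

# An invertible 3x3 example (det = 1), so its rref should be the identity.
B = [[2, 1, 0],
     [1, 1, 1],
     [0, 1, 3]]
R, pivots = rref(B)
print(R)       # the 3x3 identity -> kernel is trivial, columns independent
```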
Case 3: n > m
Just like the last case, I won't know if the columns of a matrix are linearly independent when n>m unless I first see the matrix. So let's choose a random matrix with more rows than columns.
To see if the columns are linearly independent, I simply row reduce it. If the rank of the matrix is equal to the number of columns in the matrix, then I know the columns of B are linearly independent.
Huzzah! The rank is 3 and so the columns are linearly independent. This means the only linear relation among them is the trivial one.
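Again, the tall matrix from the notebook isn't reproduced above, so here is Case 3 sketched in Python on a made-up 4×3 matrix: compute the rank and compare it to the number of columns. The rank routine is the same elimination sketch used in Case 1.

```python
from fractions import Fraction

def rank(rows):
    """Rank via forward Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        pr = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# A 4x3 example with more rows than columns.
B = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]
r = rank(B)
print(r)  # 3 == number of columns -> columns are linearly independent
```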