Linear Algebra — Enlightening Layman’s view

A clear talk about its four fundamental subspaces

Bala Manikandan
5 min read · Oct 5, 2020

We learned a lot about vectors and matrices in our school days. They taught us every possible operation on matrices, vector scaling, and a lot of other stuff too. But one thing that was left out was visualizing their geometric representation to understand what they are capable of, or the real picture of them. Did anyone know back then that gravity or the human body could be represented with matrices? Probably not; shouldn't blame our system though!

When I jumped onto Deep Learning, I was told that linear algebra is everything. Then I thought, oh great! I was good at that in school days, so I might be good at this as well. C'mon, who wasn't good at adding or multiplying two numbers under certain rules! And then, when I got deeper into it, I realized there is something more to it.

There is something that can give a whole new perspective to the subject, and it gets interesting only once we know the preliminaries. It is about projecting vectors onto n-dimensional space so that we understand what an operation really looks like, or what the consequences of the operations we do are. It can provide meaning to everything in life, or even the ghost, if you ask for it!

First things first! I was quite astonished by the true sense of the subject and wanted to outline some of my learnings, in case they can be of any help to someone.

Let's consider that there are two ways to compute the product of a matrix and a vector:

  • Inner product
  • Outer product

Inner products are rows times columns, basically a low-level, backend process for computational purposes. Outer products, or the vector approach, are columns times rows, which makes it a higher-level approach and makes understanding easier. For example,

[Image: Solution of Ax = b, computed both ways]

From the above, we see that both ways give the same final result. The first method gives us three separate components of Ax and doesn't imply anything. But the vector approach (the second one) demonstrates Ax as a linear combination of the columns of A. This is fundamental. By this method, we can visualize the addition of two column vectors, each with a magnitude and pointing in a specific direction in the given vector space or Euclidean space. This leads us to the concept of the 'Fundamental spaces of Linear Algebra'.
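To make this concrete, here is a minimal NumPy sketch (the 3×2 matrix A and the vector x below are made up for illustration, not the example from the book's figure) comparing the row-times-column view with the column-combination view of Ax:

```python
import numpy as np

# A made-up 3x2 matrix and a vector x (not the book's example)
A = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [0.0, 4.0]])
x = np.array([2.0, 1.0])

# Inner-product view: each entry of Ax is a row of A times x
inner = np.array([A[i, :] @ x for i in range(A.shape[0])])

# Outer-product / vector view: Ax is a linear combination of the columns of A
outer = x[0] * A[:, 0] + x[1] * A[:, 1]

print(inner)                      # [4. 7. 4.]
print(outer)                      # [4. 7. 4.]
print(np.allclose(inner, outer))  # True: both views agree
```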

Ok! Let me put out some things.

Euclidean Space: The space in which the vectors of the given matrix live. The dimension is determined by the number of vector components. Here, for the above example, the space is ℝ³, which is 3D space, and don't forget to notice that 3 is the same as the number of rows.

Column Space: The space spanned by the column vectors of a given matrix. The idea here is to plot those vectors; every combination Ax lies in their span. Say, in the above example, each column vector defines a line, one in the direction of a₁ and one in the direction of a₂, containing all the vectors x₁a₁ and x₂a₂, and the column space also contains the sum of any vector on one line plus any vector on the other line. Basically, it forms an infinite plane, a copy of ℝ², inside the ℝ³ space.
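As a quick numerical sanity check, still using the made-up 3×2 matrix from the sketch above, every product Ax stays inside the plane spanned by the two columns; one way to see this is that Ax is always orthogonal to the plane's normal vector:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 1.0],
              [0.0, 4.0]])

# Normal to the plane spanned by the two columns of A
normal = np.cross(A[:, 0], A[:, 1])

# Any Ax stays inside that plane, so its dot product with the normal is zero
for x in (np.array([2.0, 1.0]), np.array([-1.0, 5.0]), np.random.randn(2)):
    print(np.isclose(normal @ (A @ x), 0.0))  # True every time
```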

Row Space: Same as above, the space spanned by the row vectors of a given matrix. The dimension of the row space is always equal to that of the column space.

Rank of a Matrix: The dimension of the row/column space is called the rank of a matrix. When the rank is equal to the dimension of the entire space, the (square) matrix is invertible.
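Here is a hedged sketch of that using NumPy's matrix_rank (both square matrices below are invented for illustration): when the rank equals the full dimension the matrix is invertible, and when one row or column is a combination of the others the rank drops.

```python
import numpy as np

full = np.array([[2.0, 0.0, 1.0],
                 [1.0, 3.0, 0.0],
                 [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(full))  # 3 -> full rank, invertible
print(np.allclose(full @ np.linalg.inv(full), np.eye(3)))  # True

deficient = np.array([[1.0, 2.0, 3.0],
                      [2.0, 4.0, 6.0],   # 2 x the first row
                      [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(deficient))  # 2 -> rank deficient, not invertible
```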

Basis vectors: The linearly independent column/row vectors are called the basis vectors of that space. One way to find them is to use the reduced row echelon form (rref) of the given matrix, obtained by Gaussian elimination. They are the vectors that contain the pivot elements, in other words, the pivot columns of the rref.

For the column space, the basis vectors are identified by mapping the pivot columns back onto the original matrix. For the row space, the basis vectors are identified from the pivot rows of the rref itself.
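A minimal sketch with SymPy's Matrix.rref(), on a made-up 3×4 matrix, showing how the pivot columns pick out a basis for the column space while the nonzero rows of the rref give a basis for the row space:

```python
from sympy import Matrix

# A made-up 3x4 matrix for illustration
M = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [1, 2, 1, 4]])

R, pivot_cols = M.rref()
print(pivot_cols)          # (0, 2): the pivot columns

# Basis for the column space: the pivot columns of the ORIGINAL matrix
col_basis = [M[:, j] for j in pivot_cols]

# Basis for the row space: the nonzero (pivot) rows of the rref itself
row_basis = [R[i, :] for i in range(len(pivot_cols))]
```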

Basis: A basis for a given space is the smallest set of linearly independent vectors that it takes to span the entire space. Those linearly independent vectors are the basis vectors, and every other vector in the space is a linear combination of them.

Nullspace: The space spanned by the vectors x satisfying the condition Ax = 0. The nullspace is orthogonal to the row space. There is one more, named the left nullspace, which is nothing but the vectors y satisfying Aᵀy = 0; it is orthogonal to the column space.

Nullity: The dimension of the Nullspace spanned by those vectors.
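A quick way to see both facts numerically is with scipy.linalg.null_space (using the same made-up 3×4 matrix as in the rref sketch): the nullspace vectors satisfy Ax = 0, which is exactly the statement that they are orthogonal to every row of A.

```python
import numpy as np
from scipy.linalg import null_space

# Same made-up 3x4 matrix as before (rank 2)
A = np.array([[1.0, 2.0, 0.0, 3.0],
              [2.0, 4.0, 1.0, 7.0],
              [1.0, 2.0, 1.0, 4.0]])

N = null_space(A)             # columns of N span the nullspace
print(N.shape[1])             # 2, the nullity (4 columns minus rank 2)

# A @ N == 0 says every row of A is orthogonal to every nullspace vector,
# i.e. the nullspace is orthogonal to the row space
print(np.allclose(A @ N, 0))  # True

# Left nullspace: vectors y with A^T y = 0, orthogonal to the column space
L = null_space(A.T)
print(np.allclose(A.T @ L, 0))  # True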

To give you a better picture,

[Image: The Four Fundamental Subspaces]

So to summarize, why should we prefer the outer product over the other one?

When we use the outer product method, we convert the whole matrix into a number of rank-1 matrices that we can then sum up for the final result. This makes it easy to visualize, and the largest of these rank-1 pieces capture the biggest part of the given matrix, which is beneficial to Data Science.
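Here is a minimal sketch of that idea using the SVD (the 3×3 matrix is made up, and the SVD is just one standard way to get the 'largest pieces' first): the matrix is rebuilt as a sum of rank-1 outer products, with the error shrinking as each piece is added.

```python
import numpy as np

# Made-up 3x3 matrix for illustration
A = np.array([[4.0, 0.0, 2.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])

U, s, Vt = np.linalg.svd(A)

# Rebuild A as a sum of rank-1 pieces s[i] * u_i v_i^T,
# largest singular value (biggest piece) first
approx = np.zeros_like(A)
for i in range(len(s)):
    approx += s[i] * np.outer(U[:, i], Vt[i, :])
    print(i + 1, np.linalg.norm(A - approx))  # error shrinks with each piece

print(np.allclose(approx, A))  # True once all pieces are summed
```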

The above things form the fundamentals of Linear Algebra that are very useful for understanding decomposition methods, or matrix factorizations, which are the underlying concepts of Principal Component Analysis (PCA), image compression, denoising data, etc. And those are reserved for the future!

Hope I did some justice to your time!

Below are some useful references that you can benefit from.

References:

  • All the images are taken from the book "Linear Algebra and Learning from Data" by Gilbert Strang.
