Projection In Linear Algebra

Lakshay Sharma

Context of this topic:

Projection is a part of linear algebra with applications across many different fields: computer graphics, data science and machine learning (e.g., PCA and feature selection), signal processing, and even quantum mechanics.

What is a projection?

In Linear Algebra, projection is the operation of mapping a vector onto another vector or subspace such that the resulting vector (the projection) lies within that vector or subspace.

Projection Matrix for a single vector:

Suppose we are given two vectors a and b, where

$$a, b \in R^{n}$$

The projection of b onto vector a is given by:

$$proj_{a}b = a\left ( \frac{a^{T}b}{a^{T}a} \right )$$

where

$$\left ( \frac{a^{T}b}{a^{T}a} \right )$$

is a scalar that measures how much of b lies along a. Factoring b out to the right instead gives the projection matrix onto a:

$$P = \frac{aa^{T}}{a^{T}a}$$

When this matrix is multiplied by b, it gives the projection of b onto a.
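As a quick illustration, here is a minimal NumPy sketch of this formula; the vectors a and b are arbitrary example values, not taken from the article.

```python
import numpy as np

# A sketch of projecting b onto a: P = a a^T / (a^T a) is the rank-1 projection matrix.
a = np.array([3.0, 1.0])        # example vector to project onto
b = np.array([2.0, 2.0])        # example vector being projected

P = np.outer(a, a) / (a @ a)    # projection matrix onto the line spanned by a
proj_b = P @ b                  # equivalent to a * ((a @ b) / (a @ a))

print(proj_b)                   # -> [2.4 0.8], the component of b along a
```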

Projection Matrix for a subspace:

Suppose we want to project b onto a subspace W, and let u1, u2, ..., uk be basis vectors of W.

We collect all of these basis vectors as the columns of a matrix

$$U \in R^{n \times k}$$

so the columns of U are u1, u2, ..., uk.

The projection matrix onto the subspace is given by

$$P = U\left ( U^{T} U \right )^{-1}U^{T}$$

Since the basis vectors are linearly independent, U^T U is invertible, so P is well defined. The projection of b onto the subspace is then:

$$proj_{W}b = Pb$$
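Here is a minimal NumPy sketch of the same construction, assuming a small example in which U holds two linearly independent basis vectors of a plane in R^3:

```python
import numpy as np

# A sketch of projecting b onto the subspace W spanned by the columns of U.
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # two linearly independent basis vectors of a plane in R^3
b = np.array([1.0, 2.0, 3.0])

P = U @ np.linalg.inv(U.T @ U) @ U.T    # P = U (U^T U)^{-1} U^T
proj_b = P @ b                          # projection of b onto W

print(proj_b)
```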

Why is projection needed?

Some of the reasons we need projection:

  1. Solving Ax = b

    Sometimes Ax = b is not solvable because b does not lie in the column space of A. To get the best possible solution, we replace b with its projection p onto the column space, so that

    Ax = p is solvable (see the sketch after this list).

  2. Dimensionality Reduction: Working with high-dimensional data can get too complex, so it is often better to project it onto a lower-dimensional subspace that keeps most of the important structure (this is the idea behind PCA).

  3. To find the least-squares fit: the least-squares solution of Ax = b is exactly the x that satisfies Ax = p above.
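A minimal NumPy sketch of points 1 and 3 together, using made-up data for fitting a line through three points; the normal equations A^T A x = A^T b used here are the standard way of computing this projection, not something specific to this article:

```python
import numpy as np

# A sketch of points 1 and 3: Ax = b has no exact solution here, so we solve
# Ax = p, where p is the projection of b onto the column space of A.
# The normal equations A^T A x_hat = A^T b give the least-squares solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])           # fitting a line c0 + c1*t through three points
b = np.array([1.0, 2.0, 2.0])        # observed values (illustrative data, not collinear)

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # least-squares solution
p = A @ x_hat                               # projection of b onto the column space of A

print(x_hat)   # best-fit coefficients
print(p)       # the projection p, for which Ax = p is solvable
```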
