A few days ago I had some trouble understanding the internal workings of SVD. The worst part is that I have already completed a few projects in which SVD was a major module. That is when I decided to revisit and recall everything I need to know about decomposition techniques (well, this is also when I realized I should have documented SVD long ago).

Today there are many decomposition techniques available for solving statistical problems; the most popular and widely used are:

- Eigenvalue decomposition
- Singular value decomposition (SVD)
- Principal component analysis (PCA)

This post is dedicated to laymen who would like to understand the internal workings of Singular value decomposition.

The word decomposition means reduction or breaking down; as the name indicates, SVD is one of the best factorization (decomposition) techniques available for matrices.

In simpler terms, factorization helps break a bigger matrix into smaller entities such that multiplying those entities gives back the original matrix.

For example, a matrix A_{MxN} can be broken down and written in the form A = UDV^{t}, such that multiplying U * D * V^{t} gives back the same matrix A.

In this representation A = UDV^{t}, U and V are orthogonal matrices and D is a diagonal matrix whose diagonal entries (d1, d2, …, dn) are the singular values of A, i.e. the square roots of the eigenvalues of A^{t}A.
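As a quick sanity check, here is a minimal NumPy sketch (the matrix A below is my own example, not from the post) showing that multiplying the three factors back together recovers A:

```python
import numpy as np

# A small example matrix to decompose and rebuild.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# np.linalg.svd returns U, the singular values d, and V^t directly.
U, d, Vt = np.linalg.svd(A, full_matrices=False)

# Multiplying U * D * V^t recovers the original matrix.
A_rebuilt = U @ np.diag(d) @ Vt
print(np.allclose(A, A_rebuilt))  # True
```

Note that NumPy hands back V^{t} (the already-transposed matrix), which matches the A = UDV^{t} form used throughout this post.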

**Proof**

Theoretically, to believe that A = UDV^{t} we need an accurate, publishable proof.

To prove A = UDV^{t}, let us say A = UDV^{t} and apply the transpose on both sides of the equation to get A^{t} = (UDV^{t})^{t} = VD^{t}U^{t} = VDU^{t} (D is diagonal, so D^{t} = D).

Now we have two equations

A = UDV^{t}

A^{t} = VDU^{t}

Having these equations in hand, multiply A^{t} by A and simplify using the fact that U is orthogonal (U^{t}U = I):

A^{t}A = VDU^{t}UDV^{t}

A^{t}A = VD^{2}V^{t}

A^{t}AV = VD^{2}

This is exactly the eigenvector equation: the columns of V are eigenvectors of A^{t}A, with eigenvalues d_{i}^{2}. By the same argument, AA^{t}U = UD^{2}, so the columns of U are eigenvectors of AA^{t}. This is what the algorithm below exploits.
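The identity A^{t}AV = VD^{2} can also be checked numerically. A minimal sketch in NumPy (the example matrix A is my own):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, d, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T
D2 = np.diag(d ** 2)

# A^t A V should equal V D^2: the columns of V are eigenvectors
# of A^t A with eigenvalues d_i^2.
print(np.allclose(A.T @ A @ V, V @ D2))  # True
```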

**Algorithm**

To compute the SVD, find the eigenvalues and eigenvectors of AA^{t} and A^{t}A.

Compute U: find the eigenvalues of the matrix AA^{t}; the corresponding eigenvectors form the columns of U.

Compute V^{t}: find the eigenvalues of the matrix A^{t}A; the corresponding eigenvectors form the columns of V, and transposing it gives V^{t}.

Compute D: take the square roots of the common positive eigenvalues of AA^{t} and A^{t}A; these are the diagonal entries of D.

Assemble the computed values into U, D, and V^{t}.
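The steps above can be sketched in NumPy. One common variant, used here, eigendecomposes only A^{t}A and then obtains each column of U as Av_{i}/d_{i}, which avoids sign mismatches between two separate eigendecompositions; the helper name `svd_via_eigen` and the example matrix are my own:

```python
import numpy as np

def svd_via_eigen(A):
    """Sketch of the algorithm above: build U, d, V^t from the
    eigendecomposition of A^t A (assumes A has full column rank)."""
    # A^t A is symmetric, so use eigh; its eigenvectors form V and
    # its eigenvalues are the squared singular values.
    eigvals, V = np.linalg.eigh(A.T @ A)
    # Sort in descending order, as the SVD conventionally does.
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], V[:, order]
    d = np.sqrt(np.clip(eigvals, 0, None))  # singular values
    # Columns of U follow from A v_i = d_i u_i, i.e. u_i = A v_i / d_i.
    U = (A @ V) / d
    return U, d, V.T

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])
U, d, Vt = svd_via_eigen(A)
print(np.allclose(A, U @ np.diag(d) @ Vt))  # True
```

In practice you would call `np.linalg.svd` directly; this sketch only illustrates the eigendecomposition route described above.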

Let us try an example. Let A be a matrix.

Using the eigenvectors of AA^{t}, the U matrix is produced.

Using the eigenvectors of A^{t}A, the V matrix is produced.

D is the diagonal matrix produced from the common positive eigenvalues of AA^{t} and A^{t}A.

Continue with part 2 – Low-rank-matrix-approximation-svd

**Reference**

http://en.wikipedia.org/wiki/Singular_value_decomposition

http://web.mit.edu/be.400/www/SVD/Singular_Value_Decomposition.htm
