How to Use Singular Value Decomposition (SVD) in Machine Learning
How many times have you heard of singular value decomposition and skipped the article because of the complex mathematical terms?
This time you won’t.
Singular Value Decomposition, or SVD for short, is a mathematical technique used in machine learning to make sense of huge and complicated data.
Let me explain SVD in layman's terms:
Imagine you have many different toys that you want to organize. Some are big, some are small, some are red, some are blue, and so on. It can take some work to figure out how to group them!
But what if you could break down each toy into its most essential parts?
For example, you could take a big red ball and break it down into a big part, a red part, and a round part.
That would make it much easier to compare and group together with other toys that have those same basic parts.
That's what SVD does! 😉
It takes a big, complicated piece of data and breaks it into its most essential parts. Then we can use those parts to find patterns and similarities in the data.
For example:
Let's say you have many pictures of different animals. SVD could break each picture down into its most essential parts, like lines and curves. Then we could use those parts to find patterns, such as which animals have similar shapes.
How to Use SVD in Machine Learning
Here are a few examples of the types of problems that SVD can help solve:
Dimensionality Reduction
One of the main applications of SVD is to reduce the dimensionality of a dataset. By finding the basic patterns in the data and discarding the less important ones, SVD can help simplify the data and make it easier to work with.
Data Compression
SVD can also compress large datasets without losing too much information. We can represent the data using fewer features by keeping only the most important singular values and associated singular vectors.
Matrix Approximation
Another application of SVD is to approximate a large, complex matrix using a smaller, simpler one. This can be useful when working with large datasets that are difficult to handle directly.
Collaborative Filtering
SVD can be used to predict user preferences in recommender systems by modeling the relationships between users and items in a large matrix.
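As a tiny illustration of the collaborative filtering idea, the sketch below factors a hypothetical user-item rating matrix (values invented for this example) and keeps only the two strongest patterns. Real recommender systems treat missing ratings much more carefully; this is only a minimal sketch of the low-rank idea.

```python
import numpy as np

# A tiny hypothetical user-item rating matrix (0 = not yet rated).
R = np.array([[5.0, 4.0, 0.0, 1.0],
              [4.0, 5.0, 1.0, 1.0],
              [1.0, 1.0, 5.0, 4.0],
              [1.0, 0.0, 4.0, 5.0]])

# Factor the matrix and keep only the 2 strongest patterns
# ("tastes"), which smooths over the missing entries.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The reconstructed scores can serve as predictions for unrated items.
print(np.round(R_hat, 1))
```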
So why is there a need for SVD in the first place?
Well, sometimes data can be massive and complicated, and it's hard to make sense of it all. SVD helps us simplify the data and find the most essential parts to understand it better.
What is Singular Value Decomposition?
Singular Value Decomposition is a way to factor a matrix A into three matrices, as follows:
A = U * S * V^T
Where U and V are orthogonal matrices, and S is a diagonal matrix containing the singular values of A.
Note:
- A matrix is orthogonal if the product of the matrix and its transpose is the identity matrix.
- A matrix is diagonal if it has non-zero elements only on the diagonal, which runs from the upper left to the lower right corner of the matrix.
Here, U and V represent the left and right singular vectors of A, respectively, and S represents the singular values of A.
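The factorization is easy to check numerically. The sketch below uses NumPy's built-in `np.linalg.svd` on a small matrix with made-up values and rebuilds A from its three factors:

```python
import numpy as np

# A small example matrix (hypothetical values chosen for illustration).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# np.linalg.svd returns U, the singular values s, and V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A from its factors: U * diag(s) * V^T.
A_rebuilt = U @ np.diag(s) @ Vt

print(np.allclose(A, A_rebuilt))  # the factors reproduce A
```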
The algorithm for computing the SVD of matrix A can be summarized in the following steps:
- Compute the eigendecomposition of the symmetric matrix A^T A. This can be done using any standard eigendecomposition algorithm.
- Compute the singular values of A as the square root of the eigenvalues of A^T A. Sort the singular values in descending order.
- Compute the left and right singular vectors of A as follows:
- For each singular value, find the corresponding eigenvector of A^T A.
- Normalize each eigenvector to have a unit length.
- The left singular vectors of A are the eigenvectors of A A^T corresponding to the nonzero singular values of A.
- The right singular vectors of A are the normalized eigenvectors of A^T A.
Assemble the SVD of A as follows:
- The diagonal entries of S are the singular values of A, sorted in descending order.
- The columns of U are the corresponding left singular vectors of A.
- The columns of V are the corresponding right singular vectors of A.
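The steps above can be sketched directly with NumPy's eigendecomposition routine (`np.linalg.eigh` is appropriate here because A^T A is symmetric). This is a minimal sketch for a matrix with nonzero singular values, not a production SVD implementation:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Step 1: eigendecomposition of the symmetric matrix A^T A.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# Step 2: singular values are the square roots of the eigenvalues,
# sorted in descending order.
order = np.argsort(eigvals)[::-1]
s = np.sqrt(eigvals[order])

# Step 3: right singular vectors are the (already unit-length)
# eigenvectors of A^T A, in the same order.
V = eigvecs[:, order]

# Left singular vectors follow from A v = sigma * u
# (valid here because all singular values are nonzero).
U = A @ V / s

# Assemble S and verify A = U * S * V^T.
S = np.diag(s)
print(np.allclose(A, U @ S @ V.T))
```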
Once the SVD of matrix A has been computed, it can be used for various tasks in machine learning, such as
- Dimensionality reduction,
- Data compression,
- Feature extraction.
What is the Difference Between SVD and EVD?
Singular Value Decomposition (SVD) and Eigenvalue Decomposition (EVD) are important matrix factorization techniques with many applications in machine learning and other fields.
While they share some similarities, there are also some important differences between them.
Eigenvalue Decomposition (EVD) factorizes a square matrix A into three matrices:
A = V * Λ * V^-1
where V is a matrix whose columns are the eigenvectors of A, Λ is a diagonal matrix whose entries are the corresponding eigenvalues of A, and V^-1 is the inverse of V.
Singular Value Decomposition (SVD), on the other hand, factorizes any m x n matrix A into three matrices:
A = U * Σ * V^T
where U is an m x m orthogonal matrix, Σ is an m x n diagonal matrix whose diagonal entries are the singular values of A, and V is an n x n orthogonal matrix.
One key difference between SVD and EVD is that SVD can be applied to any matrix, while EVD is only defined for square matrices.
Another key difference is that the matrices U and V in SVD are not necessarily A's eigenvectors but A's left and right singular vectors, respectively.
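The "any matrix vs. square matrix" difference is easy to see in code. In this sketch (example matrices invented for illustration), EVD reconstructs a square matrix, while SVD handles a rectangular one that has no eigendecomposition at all:

```python
import numpy as np

# EVD needs a square matrix; SVD works on any shape.
A_square = np.array([[4.0, 1.0],
                     [2.0, 3.0]])
A_rect = np.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])  # 2 x 3: no EVD exists

# EVD: A = V * Lambda * V^-1
lam, V = np.linalg.eig(A_square)
print(np.allclose(A_square, V @ np.diag(lam) @ np.linalg.inv(V)))

# SVD handles the rectangular matrix with no trouble.
U, s, Vt = np.linalg.svd(A_rect, full_matrices=False)
print(np.allclose(A_rect, U @ np.diag(s) @ Vt))
```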
What is the Difference Between SVD and Truncated SVD?
The main difference is that SVD factorizes a matrix into three full matrices, while Truncated SVD keeps only a subset of the singular values and associated singular vectors, giving a lower-rank approximation of the original matrix.
In other words, while SVD provides a complete decomposition of a matrix, Truncated SVD approximates it by keeping only the most important information, which can be helpful when the original matrix is too large or complex to work with directly.
Here are the main differences between SVD and Truncated SVD:
- SVD produces the exact decomposition of a matrix into singular values and singular vectors, while Truncated SVD produces a lower-rank approximation that discards the less important information.
- Truncated SVD can be faster and more memory-efficient than SVD since it only keeps a subset of the singular values and associated singular vectors.
- Truncated SVD can be used for dimensionality reduction and data compression, while SVD is more commonly used for solving linear systems and other applications that require the exact decomposition.
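Truncating is just a matter of slicing the factors. In this sketch (random data for illustration), keeping the k = 2 largest singular values yields the best rank-2 approximation of the matrix in the least-squares sense:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values and vectors.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# A_k has the same shape as A but only rank k, so the truncated
# factors need far less storage for large matrices.
print(A_k.shape)                    # same shape as A
print(np.linalg.matrix_rank(A_k))   # 2
```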
Which One is Better: SVD or PCA?
Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) are both popular techniques for dimensionality reduction and feature extraction in machine learning and data analysis.
While they share some similarities, there are also some important differences between them.
PCA is a linear transformation technique that aims to find a new set of orthogonal variables, called principal components, that capture the maximum amount of variance in the data.
PCA is typically used for unsupervised learning and can be used to reduce the dimensionality of a dataset by projecting it onto a lower-dimensional space.
SVD, on the other hand, is a more general matrix factorization technique that can be applied to any matrix, not just covariance matrices.
SVD decomposes a matrix into three matrices: U, Σ, and V,
where
- U and V are orthogonal matrices,
- Σ is a diagonal matrix of singular values.
SVD can also be used for dimensionality reduction and feature extraction by keeping only the most important singular values and associated singular vectors.
Here are some differences between SVD and PCA:
- Type of Data: PCA is typically used for covariance matrices of numerical data, while SVD can be applied to any matrix.
- Goal: PCA aims to find a new set of orthogonal variables that capture the maximum amount of variance in the data, while SVD aims to find a decomposition of a matrix into singular values and associated singular vectors.
- Use of the Data: PCA is an unsupervised learning technique that can be used to reduce the dimensionality of a dataset, while SVD can be used for a variety of tasks, including matrix approximation, collaborative filtering, and image compression.
Regarding which technique is better, it depends on the specific problem and data being analyzed. Both SVD and PCA have their strengths and weaknesses, and the best approach will depend on the specific context of the problem.
In general, PCA is a more specialized technique that is particularly useful for dimensionality reduction of numerical data, while SVD is a more general technique that can be applied to a wider range of data types and problems.
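The two techniques are closely connected: PCA can be computed through the SVD of the centered data matrix. The sketch below (synthetic data invented for illustration) shows that the right singular vectors are the principal components and the squared singular values give the explained variance:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic data: 100 samples, 3 features with very different spreads.
X = rng.standard_normal((100, 3)) * np.array([2.0, 1.0, 0.1])

# PCA via SVD: center the data, then decompose it.
X_centered = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Rows of Vt are the principal components; the squared singular
# values divided by (n - 1) are the explained variances.
components = Vt
explained_variance = s**2 / (len(X) - 1)

# Project onto the first two principal components.
X_2d = X_centered @ components[:2].T
print(X_2d.shape)  # (100, 2)
```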
Singular Value Decomposition Implementation In Python
For this example, let's use the famous "Iris" dataset, a set of measurements for different species of iris flowers.
Here's a link to download the dataset: https://archive.ics.uci.edu/ml/datasets/iris
Now let's see what the data looks like; the code below creates a visualization for it.
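Here is one possible implementation, assuming scikit-learn and matplotlib are installed: `load_iris` ships the same data as the UCI file linked above, which saves a manual download. We center the measurements, take their SVD, and plot each flower on the first two singular directions:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

# Load the Iris measurements (150 flowers x 4 features).
iris = load_iris()
X, y = iris.data, iris.target

# Center the data, then decompose it with SVD.
X_centered = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Project every flower onto the first two singular directions.
X_2d = X_centered @ Vt[:2].T

# Visualize: each species tends to form its own cluster in 2-D.
for species in np.unique(y):
    mask = y == species
    plt.scatter(X_2d[mask, 0], X_2d[mask, 1],
                label=iris.target_names[species])
plt.xlabel("First singular direction")
plt.ylabel("Second singular direction")
plt.legend()
plt.title("Iris dataset projected with SVD")
plt.show()
```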
Conclusion
In this article, you learned about SVD through straightforward examples. The article also covered the different problems SVD solves and explained the contrast between SVD and EVD with mathematical expressions.
It then touched on the difference between Truncated SVD and SVD, and explained how to choose between PCA and SVD. The article wound up by implementing SVD in Python with a sample visualization.
Recommended Courses
Machine Learning Course
Rating: 4.5/5
Deep Learning Course
Rating: 4.5/5
NLP Course
Rating: 4/5