
Using SVD for Dimensionality Reduction

Hello everyone.
I have a matrix with 300 rows (samples) and 5000 columns (features).
I need to reduce the number of columns for classification.
As far as I know, to use the pca() function the number of samples should be greater than the number of features.
So I am trying to use the singular value decomposition function svd() instead, with the code below.
% Singular value decomposition of X
[U, Sig, V] = svd(X);
sv = diag(Sig);          % vector of singular values

% Distribution of the singular values
figure;
sv = sv/sum(sv);         % normalise so they sum to 1
stairs(cumsum(sv));
xlabel('singular values');
ylabel('cumulative sum');
I have two questions.
1) As I understand from the figure above, I have to take approximately the first 250 singular values to account for 95% of my data.
So should I take the first 250 singular values to create the new data for classification?
How can I see the variance explained by each principal component, like the 'explained' output of the pca() function, to decide how many of them I should use?
2) After deciding on the number of principal components, I need to create a new matrix for classification.
Can I do this with the code below (for example, with the first two principal components)?
new_matrix_for_classification = X*V(:,1:2);
Thanks in advance.
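(For reference, a minimal sketch of both steps, assuming X is the 300-by-5000 matrix above. The columns are mean-centred first, because the explained-variance reading of the singular values only holds for centred data; pca() does this centring internally.)
% Assumption: X is the 300-by-5000 data matrix from the question
Xc = X - mean(X);                 % centre each feature (column)
[U, Sig, V] = svd(Xc, 'econ');    % economy-size SVD is sufficient for 300-by-5000
sv = diag(Sig);                   % singular values
% Percentage of variance explained by each component,
% analogous to the 'explained' output of pca()
explained = 100 * sv.^2 / sum(sv.^2);
% Smallest number of components that captures at least 95% of the variance
k = find(cumsum(explained) >= 95, 1);
% Reduced data for classification: project onto the first k right singular vectors
new_matrix_for_classification = Xc * V(:, 1:k);   % 300-by-k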

Accepted Answer

Mahesh Taparia on 2 Apr 2021
Hi
The first part is already answered here.
For the 2nd part, you can use the function pca to directly compute the representation of the input in terms of its principal components. For example, in your case, if you want the first 2 components:
[coeff,score,latent] = pca(X);
new_matrix_for_classification = score(:,1:2);  % score is the representation in the new space
Hope it will help!
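(For the variance question specifically, pca() can also return the explained variance directly as its fifth output, so the manual cumulative-sum plot is not needed. A small sketch, again assuming X is the 300-by-5000 matrix from the question:)
[coeff, score, latent, ~, explained] = pca(X);  % 'explained' is percent variance per component
k = find(cumsum(explained) >= 95, 1);           % number of components for 95% of the variance
new_matrix_for_classification = score(:, 1:k);  % equivalent to (X - mean(X)) * coeff(:, 1:k)
Note that with 300 samples and 5000 features, pca() returns at most 299 components, since the centred data has rank at most n-1.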

