
Data Science and Machine Learning (Theory and Projects) A to Z - Feature Extraction: Kernel PCA
Interactive Video • Information Technology (IT), Architecture, Mathematics • University • Practice Problem • Hard • Wayground Content
10 questions
1. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
What is the relationship between PCA and SVD in the context of a centered matrix X?
PCA is a method to compute the SVD of a matrix.
SVD is used to perform PCA on a centered matrix.
PCA and SVD are unrelated concepts.
SVD is a subset of PCA techniques.
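The PCA–SVD connection the question points at can be checked numerically. A minimal sketch (the random data and matrix sizes here are illustrative, not from the course): the singular values of the centered matrix, squared and divided by n − 1, match the eigenvalues of the covariance matrix that classical PCA diagonalizes.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Xc = X - X.mean(axis=0)                 # center the data first

# SVD of the centered matrix: Xc = U @ diag(d) @ Vt
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

# Classical PCA: eigendecomposition of the covariance matrix
cov = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)

# Squared singular values (scaled by 1/(n-1)) equal the covariance eigenvalues,
# so performing SVD on the centered matrix is performing PCA
same = np.allclose(np.sort(d**2 / (Xc.shape[0] - 1)), np.sort(eigvals))
print(same)  # True
```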
2. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
How is the dimensionality of matrix Y reduced using SVD?
By multiplying X with a random matrix.
By increasing the number of columns in U.
By selecting the top K singular values.
By decreasing the number of rows in V.
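The reduction step the question describes — keeping only the top K singular values and their singular vectors — can be sketched as follows (shapes and K are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))
Xc = X - X.mean(axis=0)

U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

K = 3                        # keep only the top-K singular values/vectors
Y = Xc @ Vt[:K].T            # project onto the top-K right singular vectors
print(Y.shape)               # (50, 3): 10 features reduced to 3

# Equivalently, Y equals U restricted to K columns, scaled by the top-K
# singular values: Xc @ V_K = U_K @ diag(d_K)
print(np.allclose(Y, U[:, :K] * d[:K]))  # True
```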
3. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
What is the purpose of reconstructing the original matrix X using SVD?
To eliminate noise from the data.
To restore the original data from its reduced form.
To verify the accuracy of the dimensionality reduction.
To increase the dimensionality of the data.
4. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
Why is the matrix U avoided in the reconstruction process?
Because U is computationally expensive to use.
Because V and D are sufficient for reconstruction.
Because U does not contain eigenvectors.
Because U is not orthogonal.
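The two reconstruction questions above can be illustrated together: the reduced representation Y and the reconstruction both use only V (and the singular values implicitly), so U never needs to be stored, and keeping all components restores the centered data exactly. A sketch with assumed shapes:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 8))
Xc = X - X.mean(axis=0)

U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

# Reduced representation and rank-K reconstruction, computed without U
K = 3
Y = Xc @ Vt[:K].T            # project onto top-K right singular vectors
X_approx = Y @ Vt[:K]        # best rank-K approximation of Xc

# Keeping all components makes the reconstruction exact
Y_full = Xc @ Vt.T
X_exact = Y_full @ Vt
print(np.allclose(X_exact, Xc))  # True
```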
5. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
What do the columns of matrix V represent in the context of SVD?
Eigenvectors of X transpose.
Eigenvectors of X transpose X.
Eigenvalues of X transpose.
Eigenvalues of X transpose X.
6. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
What is the significance of the matrix X transpose X?
It is a similarity matrix for data points.
It is the inverse of the original matrix X.
It is used to compute the eigenvectors of X.
It represents the covariance matrix of X.
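The claims in the two questions above — that the columns of V are eigenvectors of XᵀX, and that XᵀX is (up to the 1/(n−1) factor) the covariance matrix of centered X — can be verified directly; data and shapes here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))
Xc = X - X.mean(axis=0)

U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
G = Xc.T @ Xc                # proportional to the covariance matrix of Xc

# Each right singular vector v (a column of V, i.e. a row of Vt) satisfies
# (X^T X) v = sigma^2 v, so it is an eigenvector with eigenvalue sigma^2
ok = all(np.allclose(G @ v, s**2 * v) for s, v in zip(d, Vt))
print(ok)  # True
```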
7. MULTIPLE CHOICE QUESTION • 30 sec • 1 pt
How does the dot product of two vectors relate to their similarity?
The dot product is always zero for similar vectors.
The dot product is unrelated to similarity.
A lower dot product indicates greater similarity.
A higher dot product indicates greater similarity.
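The dot-product-as-similarity idea behind the last question can be seen with three hand-picked vectors (the values are illustrative): vectors pointing in nearly the same direction give a large positive dot product, while opposite directions give a negative one.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2])     # nearly the same direction as a
c = np.array([-1.0, -2.0, -3.0])  # exactly the opposite direction

print(a @ b)   # large positive dot product: similar vectors
print(a @ c)   # negative dot product: dissimilar vectors
```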