SSL Practice

University • 42 Qs

Similar activities

EVALUACION PX HOSPITALIZADOS 1ER PARCIAL • University • 41 Qs

Digestive System • 8th Grade - University • 42 Qs

Technology • University • 45 Qs

Endocrinologia - The MASK • University • 40 Qs

7OFE • University • 39 Qs

Potenciales y sinapsis • University • 40 Qs

Electricity and Magnet Review • 6th Grade - University • 42 Qs

PCT Prope - Control 2.1 Dimensiones, forma, estructura terrestre • University • 40 Qs

SSL Practice

Assessment • Quiz • Science • University

Practice Problem • Medium

Created by Stefan Nastase

Used 49+ times

42 questions

1.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Which are the main differences when applying maximum likelihood to a single Gaussian versus a mixture of Gaussians?

Overfitting can only happen for the single Gaussian model

The mixture model is prone to singularities

ML treatment for the mixture model can be solved in closed form, leading to a simple solution

The mixture model may provide a better fit, depending on the dataset

Answer explanation

  • The mixture of Gaussians model is prone to singularities, especially when one of the Gaussian components collapses to a single point in the dataset, leading to infinitely high likelihoods. This typically requires regularization or constraints to handle.

  • A mixture of Gaussians can provide a better fit to the data compared to a single Gaussian, especially when the data comes from a distribution that is multi-modal or has a complex structure that a single Gaussian cannot capture.
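
To make these two points concrete, here is a minimal sketch (not part of the original quiz) that fits a single Gaussian and a two-component mixture to synthetic bimodal data with scikit-learn; the dataset, the reg_covar value used to guard against collapsing components, and all other parameters are illustrative choices.

```python
# Sketch: compare ML fits of a single Gaussian vs. a 2-component mixture
# on bimodal data. reg_covar puts a small floor on the covariances, one
# common guard against the singularity problem mentioned above.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic bimodal 1-D data: two well-separated clusters
X = np.concatenate([rng.normal(-3.0, 0.5, 200),
                    rng.normal(+3.0, 0.5, 200)]).reshape(-1, 1)

single = GaussianMixture(n_components=1).fit(X)                    # closed-form ML fit
mixture = GaussianMixture(n_components=2, reg_covar=1e-6).fit(X)   # iterative EM fit

print("avg log-likelihood, single Gaussian:", single.score(X))
print("avg log-likelihood, 2-component mixture:", mixture.score(X))
```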

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Regularized least squares for a linear regression model is similar to a Bayesian treatment for regression when:

The prior for the weights is a Dirichlet distribution

The prior for the weights is any type of Gaussian

The prior for the weights is a simple zero mean isotropic Gaussian

There is no link between regularized regression models and Bayesian models for regression
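
As background for this question, the standard correspondence can be checked numerically: assuming a zero-mean isotropic Gaussian prior with precision alpha on the weights and Gaussian noise with precision beta, the posterior mean coincides with the regularized least-squares (ridge) solution with penalty lambda = alpha / beta. The sketch below uses made-up data and arbitrary alpha and beta values.

```python
# Sketch: posterior mean under a zero-mean isotropic Gaussian prior vs. the
# ridge (regularized least squares) solution. alpha, beta, and the data are
# arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
Phi = rng.normal(size=(50, 3))                 # design matrix
w_true = np.array([1.0, -2.0, 0.5])
t = Phi @ w_true + rng.normal(scale=0.3, size=50)

alpha, beta = 2.0, 1.0 / 0.3**2                # prior and noise precisions
lam = alpha / beta                             # equivalent ridge penalty

# Bayesian posterior mean: (alpha*I + beta*Phi^T Phi)^{-1} beta*Phi^T t
w_bayes = np.linalg.solve(alpha * np.eye(3) + beta * Phi.T @ Phi, beta * Phi.T @ t)
# Ridge solution: (lam*I + Phi^T Phi)^{-1} Phi^T t
w_ridge = np.linalg.solve(lam * np.eye(3) + Phi.T @ Phi, Phi.T @ t)

print(np.allclose(w_bayes, w_ridge))           # True
```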

3.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Which of the following options is/are true?

You need to initialize parameters in PCA

You don’t need to initialize parameters in PCA

PCA can be trapped into local minima problem

PCA can’t be trapped into local minima problem

Answer explanation

  • Initialization of Parameters: In PCA, there is no need to initialize parameters because the process involves computing the eigenvalues and eigenvectors of the covariance matrix of the data. This computation is deterministic and does not rely on starting values.

  • Local Minima: PCA is not an optimization problem in the sense of having a cost function that needs to be minimized with respect to parameters that require iterative updates. Instead, it involves solving an eigenvalue problem, which has a closed-form solution. Therefore, PCA does not suffer from local minima issues.
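
The closed-form nature of PCA is easy to see in code: the sketch below (synthetic data, illustrative only) computes the principal components directly from the eigendecomposition of the sample covariance, with no initialization and no iterative optimization.

```python
# Sketch: PCA reduces to an eigendecomposition of the sample covariance,
# so there are no parameters to initialize and no iterative optimization
# that could get stuck in a local minimum. Data here is synthetic.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4)) @ rng.normal(size=(4, 4))   # correlated features

Xc = X - X.mean(axis=0)                    # center the data
cov = np.cov(Xc, rowvar=False)             # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)     # closed-form, deterministic

# Principal components: eigenvectors sorted by decreasing eigenvalue
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
print("explained variance per component:", eigvals[order])
```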

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Regarding bias and variance, which of the following statements are true?

Models which overfit have a high bias.

Models which overfit have a low bias.

Models which underfit have a high variance.

Models which underfit have a low variance.

Answer explanation

  • Overfitting and Bias: Overfitting occurs when a model captures not only the underlying patterns in the data but also the noise. Such models are highly flexible and can fit the training data very well, leading to low bias (small error due to incorrect assumptions about the data).

  • Underfitting and Variance: Underfitting happens when a model is too simple to capture the underlying patterns in the data. Such models perform poorly on the training data and generalize badly to unseen data, resulting in high bias but low variance (the model's predictions do not change much with different training sets because it is consistently poor).
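
A small simulation can illustrate the variance side of this trade-off: the sketch below (arbitrary polynomial degrees, noise level, and sample size) refits a simple and a flexible model on many resampled training sets and compares how much their predictions at a fixed point fluctuate.

```python
# Sketch: fit a low-degree (underfitting) and a high-degree (overfitting)
# polynomial on many resampled training sets and compare how much the
# predictions at a fixed test point vary across fits.
import numpy as np

rng = np.random.default_rng(3)
x_test = 0.5
preds = {1: [], 9: []}                       # polynomial degree -> predictions at x_test

for _ in range(200):
    x = rng.uniform(0, 1, 15)
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=15)
    for degree in preds:
        coeffs = np.polyfit(x, y, degree)
        preds[degree].append(np.polyval(coeffs, x_test))

for degree, p in preds.items():
    print(f"degree {degree}: variance of predictions = {np.var(p):.4f}")
```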

5.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Which of the following is true about generative models?

They capture the joint probability

The perceptron is a generative model

Generative models can be used for classification

They capture the conditional probability

Answer explanation

  • Perceptron: The perceptron is not a generative model. It is a discriminative model, which directly estimates the conditional probability P(Y∣X) or a decision boundary between classes.

  • Classification: Generative models can indeed be used for classification. By modeling the joint distribution P(X,Y), they can compute the conditional probability P(Y∣X) using Bayes' theorem and make predictions.

  • Conditional Probability: While generative models can provide conditional probabilities through Bayes' theorem, their primary characteristic is capturing the joint probability, not just the conditional probability directly.
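
The joint-probability view can be made concrete with a toy generative classifier: the sketch below models p(x, y) = p(y) p(x | y) with one Gaussian per class (all numbers are illustrative) and recovers p(y | x) via Bayes' theorem.

```python
# Sketch: a toy generative classifier. Model the joint p(x, y) = p(y) p(x | y)
# with one Gaussian per class, then classify via Bayes' theorem:
# p(y | x) is proportional to p(y) p(x | y). All numbers are illustrative.
import numpy as np
from scipy.stats import norm

# Class priors p(y) and class-conditional densities p(x | y)
priors = {"A": 0.6, "B": 0.4}
likelihoods = {"A": norm(loc=0.0, scale=1.0), "B": norm(loc=3.0, scale=1.0)}

def posterior(x):
    joint = {y: priors[y] * likelihoods[y].pdf(x) for y in priors}  # p(x, y)
    evidence = sum(joint.values())                                  # p(x)
    return {y: joint[y] / evidence for y in joint}                  # p(y | x)

print(posterior(1.0))   # class probabilities for a new point
```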

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Compared to the variance of the Maximum Likelihood Estimate (MLE), the variance of the Maximum A Posteriori (MAP) estimate is:

higher

same

lower

it could be any of the above
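
One way to probe this question empirically (not part of the quiz) is to compare the spread of the two estimators over many simulated datasets; the sketch below estimates a Gaussian mean by MLE and by MAP under a zero-mean Gaussian prior, with arbitrary choices for the noise scale, prior scale, and sample size.

```python
# Sketch: repeatedly estimate a Gaussian mean by MLE (sample mean) and by
# MAP under a zero-mean Gaussian prior, and compare the spread of the two
# estimators across simulated datasets. sigma, tau, n are arbitrary.
import numpy as np

rng = np.random.default_rng(4)
mu_true, sigma, tau, n = 2.0, 1.0, 1.0, 10   # true mean, noise sd, prior sd, sample size

mle, map_est = [], []
for _ in range(5000):
    x = rng.normal(mu_true, sigma, n)
    xbar = x.mean()
    mle.append(xbar)
    # MAP for the mean with prior N(0, tau^2): a shrunken sample mean
    shrink = (n / sigma**2) / (n / sigma**2 + 1 / tau**2)
    map_est.append(shrink * xbar)

print("variance of MLE estimates:", np.var(mle))
print("variance of MAP estimates:", np.var(map_est))
```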

7.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

How can we demonstrate that a function is a valid kernel?

We need to be able to express it as a dot product in a latent feature space, maybe with an infinite number of dimensions

We need to be able to express it as a dot product in a latent feature space, but with a finite number of dimensions

We show that the Gram matrix contains only positive elements

We show that the design matrix is positive semi-definite
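
Relevant background for this question: a kernel is valid when it can be written as an inner product k(x, z) = φ(x)·φ(z) in some, possibly infinite-dimensional, feature space, and by Mercer's condition its Gram matrix is then positive semi-definite for any set of inputs. The sketch below is only an empirical sanity check of that property on random points, not a proof.

```python
# Sketch: an empirical sanity check (not a proof) of kernel validity.
# For a valid kernel, the Gram matrix K with K[i, j] = k(x_i, x_j) must be
# positive semi-definite for every choice of points. Points here are random.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(30, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)

K_rbf = np.exp(-0.5 * sq_dists)        # RBF kernel: valid (infinite-dim feature space)
K_dist = np.sqrt(sq_dists)             # plain Euclidean distance: not a valid kernel

for name, K in [("RBF", K_rbf), ("distance", K_dist)]:
    min_eig = np.linalg.eigvalsh(K).min()
    print(f"{name}: smallest Gram eigenvalue = {min_eig:.4f}")
```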
