Intro to ML: K-Nearest Neighbours 2024

University

10 Qs

Similar activities

Chapter 3 - Media Asset Management (12th Grade - University, 15 Qs)
Nearest Neighbor (University, 15 Qs)
KNN algo (University, 15 Qs)
Intro to ML: The ML Revision Quiz (University, 11 Qs)
MT_Chapter 5 & 6 (set 2) (University, 10 Qs)
Tema 1 - AA3 (University, 10 Qs)
BAN2022_CH4: Classification Part II K-NN (EOC) (University, 7 Qs)
Aprendizaje supervisado: Clasificación (University, 10 Qs)

Intro to ML: K-Nearest Neighbours 2024

Assessment

Quiz

Computers

University

Medium

Created by Josiah Wang

Used 54+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

In K-NN, is the query time longer than the training time?

Yes

No

Answer explanation

Recall that K-NN is a lazy learner: "training" simply stores the dataset, so almost all of the computation is deferred to query time.
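
To make this concrete, here is a minimal k-NN classifier sketch (illustrative names, not the course's reference implementation): fit() only memorises the data, so training is near-instant, while every query scans the entire training set.

    import numpy as np

    class KNNClassifier:
        """Minimal k-NN: training is O(1), each query is O(n * d)."""

        def __init__(self, k=3):
            self.k = k

        def fit(self, X, y):
            # Lazy learner: "training" just memorises the dataset.
            self.X, self.y = np.asarray(X, dtype=float), np.asarray(y)
            return self

        def predict_one(self, query):
            # The real work happens here, at query time: a distance to
            # every stored point, then a majority vote among the k nearest.
            dists = np.linalg.norm(self.X - np.asarray(query, dtype=float), axis=1)
            nearest = np.argsort(dists)[: self.k]
            labels, counts = np.unique(self.y[nearest], return_counts=True)
            return labels[np.argmax(counts)]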

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?

It can only be used for classification

It can only be used for regression

It can be used for both classification and regression

Answer explanation

Regression: predicts the average VALUE of the k nearest neighbours.

Classification: predicts the majority CLASS among the k nearest neighbours.
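
A sketch of the two modes side by side (the function name and keyword are illustrative): the same set of neighbours supports both tasks, differing only in how their targets are combined.

    import numpy as np

    def knn_predict(X_train, y_train, query, k=3, task="classification"):
        # Same neighbours, two tasks.
        dists = np.linalg.norm(np.asarray(X_train, dtype=float) - query, axis=1)
        targets = np.asarray(y_train)[np.argsort(dists)[:k]]
        if task == "classification":
            # Majority CLASS among the k nearest neighbours.
            labels, counts = np.unique(targets, return_counts=True)
            return labels[np.argmax(counts)]
        # Regression: mean VALUE of the k nearest neighbours.
        return targets.mean()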

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which of the following is true about Manhattan distances?

It can be used for continuous variables

It can be used for categorical variables

It can be used for categorical as well as continuous variables

Answer explanation

Manhattan distance is the L1 norm and is therefore only defined for continuous (numeric) variables. Hamming distance would be a good choice for categorical variables.
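
For concreteness, a sketch of the two distances (function names are illustrative):

    import numpy as np

    def manhattan(a, b):
        # L1 norm: sum of absolute coordinate differences (continuous features).
        return np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)).sum()

    def hamming(a, b):
        # Number of positions at which two categorical vectors disagree.
        return sum(x != y for x, y in zip(a, b))

    print(manhattan([1.0, 2.0], [4.0, 0.0]))    # 5.0
    print(hamming(["red", "S"], ["red", "M"]))  # 1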

4.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Poppy is working on a machine learning project using k-NN and has a very noisy dataset. Which of the following would be appropriate for k-NN in this situation?

Increase the value of K

Decrease the value of K

K does not depend on the noise

None of these

Answer explanation

Recall that when K=1 the decision boundary matches the data, noise included, exactly; this results in overfitting. Increasing K makes K-NN more resilient to noise and overfitting.
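
A common recipe for picking K on noisy data is to sweep K and keep the value with the best held-out score. A sketch with scikit-learn (assuming it is available), using flip_y to inject label noise into a synthetic dataset:

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    # Synthetic noisy data: flip_y assigns ~20% of labels at random.
    X, y = make_classification(n_samples=300, flip_y=0.2, random_state=0)

    # Larger K averages the noise away; too large underfits, so validate.
    for k in (1, 5, 15, 45):
        scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5)
        print(f"K={k:>2}: mean accuracy {scores.mean():.3f}")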

5.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

In K-NN, what is the effect of increasing/decreasing the value of K?

The boundary becomes smoother with increasing values of K

Smoothness of the boundary does not depend on the value of K

The boundary becomes smoother with decreasing values of K

None of these

Answer explanation

Think about what happens when K=1. Increasing K results in K-NN being less affected by noise, as a larger number of neighbours is considered when making each decision.
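
One way to see the smoothing directly is to predict over a dense grid and count how often the predicted class flips between adjacent cells, a rough proxy for boundary roughness (a sketch assuming scikit-learn is available):

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_moons(n_samples=200, noise=0.3, random_state=0)
    xs = np.linspace(X[:, 0].min(), X[:, 0].max(), 100)
    ys = np.linspace(X[:, 1].min(), X[:, 1].max(), 100)
    xx, yy = np.meshgrid(xs, ys)
    grid = np.c_[xx.ravel(), yy.ravel()]

    for k in (1, 25):
        Z = KNeighborsClassifier(n_neighbors=k).fit(X, y).predict(grid).reshape(100, 100)
        # Class changes between horizontally adjacent cells:
        # fewer changes means a smoother decision boundary.
        print(f"K={k:>2}: {int(np.sum(Z[:, 1:] != Z[:, :-1]))} boundary crossings")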

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

For embedded applications (e.g. running on a smartphone), which family of algorithms is most appropriate?

Eager learners

Lazy learners

Answer explanation

Eager learners are desirable because all the heavy computation occurs at training time, so the algorithm is time- and compute-efficient at inference time, which matters on resource-constrained devices.
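
A rough sketch of the trade-off (the models are just examples): an eager learner compresses the training data into a few parameters, so a query is one dot product, whereas a lazy learner like k-NN must keep and scan the full training set on the device.

    import numpy as np

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(100_000, 32))   # stored training set (lazy learner)
    w = rng.normal(size=32)                    # eager model: 32 learned weights
    query = rng.normal(size=32)

    # Eager inference: a single dot product; no training data kept on-device.
    eager_score = query @ w

    # Lazy inference: a distance to all 100,000 stored points, every query.
    dists = np.linalg.norm(X_train - query, axis=1)
    nearest = int(np.argmin(dists))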

7.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Given d, the distance between a point in the dataset and the query point, which of the following weight functions is appropriate for Distance-Weighted k-NN?

w = exp( -d )

w = log ( min ( 0.25 * d, 1 ) )

w = -d

Answer explanation

Recall that the weight assigned to a neighbour should stay positive and decrease as its distance grows. Both w = -d and w = log(min(0.25*d, 1)) yield non-positive weights, which does not make sense. w = exp(-d) is a good weight function: it is always positive and decays exponentially with distance, so it favours points close by and quickly discounts points far away.
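
A sketch of distance-weighted k-NN classification using w = exp(-d) (illustrative function name; the lecture's exact formulation may differ):

    import numpy as np

    def weighted_knn_classify(X_train, y_train, query, k=5):
        # Distance-weighted vote with w = exp(-d).
        dists = np.linalg.norm(np.asarray(X_train, dtype=float) - query, axis=1)
        idx = np.argsort(dists)[:k]
        weights = np.exp(-dists[idx])   # always positive, decays with distance
        votes = {}
        for label, w in zip(np.asarray(y_train)[idx], weights):
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)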
