Intro to ML: K-Nearest Neighbours 2024

University

10 Qs

Similar activities

ER MODEL (University, 15 Qs)

Kuis Dadakan ;) (10th Grade - University, 15 Qs)

CHAPTER 1: COMPUTER SECURITY REVIEW (University, 10 Qs)

Basic on Operating System (University, 10 Qs)

Evaluasi Pertemuan 12 DRPL TI-3B (University, 15 Qs)

Pythonintro (University, 15 Qs)

Pop Quiz- Abstraction, Algorithm (University, 14 Qs)

AWS ACF Módulo 2 - Economia e Faturamento na Nuvem (University, 15 Qs)

Intro to ML: K-Nearest Neighbours 2024

Assessment • Quiz • Computers • University • Medium

Created by Josiah Wang

Used 57+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

In K-NN, is the query time longer than the training time?

Yes

No

Answer explanation

Recall that k-NN algorithms are lazy learners: training simply stores the data, so nearly all of the computation is deferred to query time.
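
To make this concrete, here is a minimal sketch (not part of the original quiz; the class and method names are illustrative) in which fit() is constant-time bookkeeping while predict() computes a distance to every stored training point:

import numpy as np

class KNNClassifier:
    """Minimal k-NN sketch: 'training' only memorises the data (lazy learning)."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # Lazy learner: no model is built here; the training set is just stored.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict(self, X_query):
        # The heavy lifting happens at query time: one distance per training
        # point for every query, followed by a majority vote over the k nearest.
        preds = []
        for x in np.asarray(X_query, dtype=float):
            dists = np.linalg.norm(self.X_train - x, axis=1)  # Euclidean distances
            nearest = np.argsort(dists)[: self.k]
            labels, counts = np.unique(self.y_train[nearest], return_counts=True)
            preds.append(labels[np.argmax(counts)])
        return np.array(preds)

# e.g. KNNClassifier(k=3).fit([[0, 0], [1, 1], [5, 5]], [0, 0, 1]).predict([[0.5, 0.5]])
# -> array([0])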

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

James is trying to decide which machine learning algorithm to use for his project. Which of the following options is true about the k-NN algorithm?

It can only be used for classification

It can only be used for regression

It can be used for both classification and regression

Answer explanation

Regression: predicts from the VALUES of the k nearest neighbours (typically their mean).

Classification: predicts the majority CLASS among the k nearest neighbours.
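
A hedged sketch of the regression variant (knn_regress is an illustrative name, not from the quiz): the neighbour search is identical to classification, but the prediction averages the neighbours' target values instead of taking a vote.

import numpy as np

def knn_regress(X_train, y_train, x_query, k=3):
    # k-NN regression: average the target VALUES of the k nearest training points.
    dists = np.linalg.norm(np.asarray(X_train, float) - np.asarray(x_query, float), axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(np.asarray(y_train, float)[nearest]))

# e.g. knn_regress([[0], [1], [2], [10]], [0.0, 1.0, 2.0, 10.0], [1.5], k=3) -> 1.0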

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which of the following is true about Manhattan distances?

It can be used for continuous variables

It can be used for categorical variables

It can be used for categorical as well as continuous variables

Answer explanation

Manhattan distance is the L1 norm: it sums the absolute differences between coordinates, which is only meaningful for continuous (numeric) variables, so it cannot be used for categorical variables. Hamming distance, which counts the positions at which two vectors disagree, would be a good choice for categorical variables.
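
The two metrics side by side, as a brief sketch (function names are illustrative):

import numpy as np

def manhattan(a, b):
    # L1 norm: sum of absolute coordinate differences (continuous features).
    return float(np.sum(np.abs(np.asarray(a, float) - np.asarray(b, float))))

def hamming(a, b):
    # Count of positions where the two vectors disagree (categorical features).
    return int(np.sum(np.asarray(a) != np.asarray(b)))

# manhattan([1.0, 2.0], [4.0, 0.0]) -> 5.0
# hamming(["red", "small"], ["blue", "small"]) -> 1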

4.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Poppy is working on a machine learning project using k-NN and has a dataset with a lot of noise. What are the appropriate things to do with k-NN in this situation?

Increase the value of K

Decrease the value of K

K does not depend on the noise

None of these

Answer explanation

Recall that when K=1 the decision boundary matches the training data perfectly, noise included, which results in overfitting. Increasing K averages over more neighbours and makes k-NN more resilient to noise.

5.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

In K-NN, what is the effect of increasing/decreasing the value of K?

The boundary becomes smoother with increasing values of K

Smoothness of the boundary does not depend on the value of K

The boundary becomes smoother with decreasing values of K

None of these

Answer explanation

Think about what happens when K=1. Increasing K results in k-NN being less affected by noise, since a larger number of neighbours is considered when making each decision, which smooths the decision boundary.
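
A small sketch of this effect, assuming scikit-learn is installed (the dataset and split are arbitrary choices for illustration): with k=1 the boundary chases every noisy point, so training accuracy is near-perfect while test accuracy suffers; larger k trades a little training fit for a smoother, more noise-resilient boundary.

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Noisy two-class data.
X, y = make_moons(n_samples=400, noise=0.35, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for k in (1, 5, 25):
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    print(f"k={k:2d}  train acc={clf.score(X_tr, y_tr):.2f}  test acc={clf.score(X_te, y_te):.2f}")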

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

For embedded applications (e.g. running on a smartphone), which is the most appropriate family of algorithms?

Eager learners

Lazy learners

Answer explanation

Eager learners are desirable here because all the heavy computation occurs at training time, which can happen offline. The algorithm is therefore time- and compute-efficient at inference time, when it runs on the device.

7.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

Given d the distance between a point of the dataset and the query point, which of the following weight functions is appropriate for Distance-Weighted k-NN?

w = exp( -d )

w = log ( min ( 0.25 * d, 1 ) )

w = -d

Answer explanation

Recall that the weight you assign should be inversely related to the distance: nearby points should count more. Both w = -d and w = log(min(0.25*d, 1)) can yield negative weights, which does not make sense for a vote. w = exp(-d) is a good weight function: it is always positive and decays exponentially as the distance grows, so it favours points close by and quickly discounts points far away.
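
A sketch of how the w = exp(-d) weighting plugs into the vote (weighted_knn_classify is an illustrative name): each of the k nearest neighbours contributes its weight, rather than a single unit vote, to its class.

import numpy as np

def weighted_knn_classify(X_train, y_train, x_query, k=3):
    # Distance-weighted k-NN: each neighbour votes with weight exp(-d).
    dists = np.linalg.norm(np.asarray(X_train, float) - np.asarray(x_query, float), axis=1)
    nearest = np.argsort(dists)[:k]
    weights = np.exp(-dists[nearest])  # always positive, decays with distance
    votes = {}
    for label, w in zip(np.asarray(y_train)[nearest], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)  # class with the largest total weight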
