
Machine Learning (Module 2)

Authored by Abhishek Verma



1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following algorithms is not an example of an ensemble method?

A. Extra Tree Regressor

B. Random Forest

C. Gradient Boosting

D. Decision Tree
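
For reference, a minimal scikit-learn sketch (the library choice and class names are assumptions, not part of the quiz) contrasting the ensemble estimators with a single decision tree: the ensemble methods live in sklearn.ensemble, while a plain decision tree is a single model in sklearn.tree.

```python
# Sketch only (assumes scikit-learn): ensemble estimators vs. a single tree.
# "Extra Tree Regressor" is read here as the Extra Trees ensemble.
from sklearn.ensemble import (ExtraTreesRegressor, RandomForestClassifier,
                              GradientBoostingClassifier)
from sklearn.tree import DecisionTreeClassifier  # a single, non-ensemble model

ensembles = [ExtraTreesRegressor(), RandomForestClassifier(),
             GradientBoostingClassifier()]   # each combines many base trees
single_model = DecisionTreeClassifier()      # stands alone
```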

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is true about an ensemble classifier?

1. Classifiers that are more “sure” can vote with more conviction

2. Classifiers can be more “sure” about a particular part of the space

3. Most of the time, it performs better than a single classifier

1 and 2

1 and 3

2 and 3

All of the above
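
A minimal sketch of the “voting with conviction” idea, assuming scikit-learn's VotingClassifier with soft voting (the specific base estimators are illustrative only): averaging predicted probabilities lets a classifier that is very confident pull the ensemble decision harder than one that barely prefers a class.

```python
# Sketch (assumes scikit-learn): soft voting averages predicted probabilities,
# so a classifier that is more "sure" contributes more to the final decision.
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=50)),
                ("nb", GaussianNB())],
    voting="soft",  # use class probabilities, not just hard votes
)
```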

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following options is/are correct regarding the benefits of ensemble models?

1. Better performance

2. Generalized models

3. Better interpretability

1 and 3

2 and 3

1 and 2

1, 2 and 3

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is/are true about weak learners used in ensemble models?

1. They have low variance and they don’t usually overfit

2. They have high bias, so they cannot solve hard learning problems

3. They have high variance and they don’t usually overfit

1 and 2

1 and 3

2 and 3

None of these
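
A minimal sketch of a typical weak learner, assuming scikit-learn: a decision stump (a depth-1 tree) has high bias and low variance, and boosting many such stumps (e.g. with AdaBoost) is a standard way to combine them into a strong learner.

```python
# Sketch (assumes scikit-learn): a decision stump is a classic weak learner --
# high bias, low variance -- and AdaBoost combines many of them sequentially.
# Note: the keyword is `estimator` in recent scikit-learn releases
# (`base_estimator` in older ones).
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

stump = DecisionTreeClassifier(max_depth=1)                      # weak learner
boosted = AdaBoostClassifier(estimator=stump, n_estimators=200)  # strong ensemble
```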

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

If we don't assign a base estimator to the bagging classifier, it will use by default:

Linear regression

Decision tree

KNN

Logistic regression
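
A minimal check of this default, assuming scikit-learn's BaggingClassifier (the iris dataset is used only for illustration): when no base estimator is passed, the fitted ensemble is built from decision trees.

```python
# Sketch (assumes scikit-learn): no base estimator is passed, so the bagging
# ensemble falls back to its default. The iris data is illustrative only.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier

X, y = load_iris(return_X_y=True)
bag = BaggingClassifier(n_estimators=10, random_state=0).fit(X, y)
print(type(bag.estimators_[0]).__name__)  # expected: DecisionTreeClassifier
```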

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Meme image]

In random forests, each tree in the ensemble is built from a different sample drawn with replacement (i.e. a bootstrap sample) from the training set.

True

False

Don't know the answer, but I like the meme

Don't know the answer & I'm not a meme person (really?)
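
A minimal sketch of the bootstrap idea behind this statement, assuming NumPy and scikit-learn: drawing indices with replacement produces a sample of the same size in which some rows repeat and others are left out, which is how each tree in a random forest gets its training data (bootstrap=True is the scikit-learn default).

```python
# Sketch (assumes NumPy / scikit-learn): a bootstrap sample is drawn with
# replacement and has the same size as the original training set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_samples = 8
idx = rng.integers(0, n_samples, size=n_samples)  # indices drawn with replacement
print(idx)  # some rows appear more than once, others not at all

forest = RandomForestClassifier(bootstrap=True)  # bootstrap=True is the default
```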

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Ensemble methods can be divided into two groups: sequential ensemble methods and parallel ensemble methods.

In sequential ensemble methods, the base learners (estimators) are generated sequentially (e.g. AdaBoost).

True

False
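
A minimal sketch of the two groups, assuming scikit-learn: AdaBoost is a sequential ensemble (each base learner is fitted after, and depends on, the previous one), whereas bagging-style methods are parallel ensembles (base learners are fitted independently, so fitting can be parallelized).

```python
# Sketch (assumes scikit-learn): sequential vs. parallel ensembles.
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier

sequential = AdaBoostClassifier(n_estimators=100)          # boosting: learners built one after another
parallel = BaggingClassifier(n_estimators=100, n_jobs=-1)  # bagging: learners built independently
```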
