Intro to ML: Decision Trees

University

7 Qs

Similar activities

Program G: Classifiers • 12th Grade - University • 10 Qs
BAN2022_CH4: Classification Part I Introduction and Evaluation ( • University • 12 Qs
Quiz Arsitektur dan Model Data Mining • University • 10 Qs
10 Questions of Machine Learning • University • 10 Qs
Machine Learning Basics • University • 10 Qs
Classification • University • 10 Qs
Predictive Analytics • University • 10 Qs

Assessment • Quiz • Computers • University • Hard

Created by Josiah Wang • Used 59+ times

7 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which system requires more entropy to be described?

Tossing a fair coin

Tossing a biased coin

Answer explanation

Tossing a fair coin requires more entropy to be described because its two outcomes are equally likely, which maximises uncertainty, whereas a biased coin concentrates probability on one outcome and is therefore more predictable (lower entropy).
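
A quick way to check this is to compute the Shannon entropy of each coin. The snippet below is a minimal sketch in Python; the 90/10 bias is an illustrative choice, not part of the question.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: both outcomes equally likely, which maximises uncertainty for two outcomes.
print(entropy([0.5, 0.5]))   # 1.0 bit
# Biased coin (illustrative 90/10 split): one outcome dominates, so less uncertainty.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```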

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Is it possible to test the same attribute twice along the same path on a decision tree for a categorical problem?

Yes

No

Answer explanation

No. The purpose of the tree is to split the data efficiently using informative features. Once a categorical attribute has been tested, every sample reaching a given branch shares the same value for it, so testing it again along the same path would contribute no additional information and would only add redundant nodes to the tree.
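
To make this concrete, here is a small hand-rolled check (a toy example with made-up attributes, not from the quiz): within a branch reached by splitting on a categorical attribute, every sample shares that attribute's value, so re-testing it yields zero information gain.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the samples on attribute index `attr`."""
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    weighted = sum(len(ys) / len(labels) * entropy(ys) for ys in groups.values())
    return entropy(labels) - weighted

# Toy branch: every sample here already has colour == 'red' (attribute index 0).
rows   = [('red', 'small'), ('red', 'large'), ('red', 'small'), ('red', 'large')]
labels = ['yes', 'no', 'yes', 'no']
print(information_gain(rows, labels, attr=0))  # 0.0 -- re-testing 'colour' adds nothing
print(information_gain(rows, labels, attr=1))  # 1.0 -- 'size' still separates the classes
```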

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Is it possible that the same attribute will get selected twice in an ordinal or real-valued problem?

Yes

No

Answer explanation

Yes, the same attribute can be selected more than once along the same path. For ordinal or real-valued attributes each split compares against a threshold, so a later node can split on the same attribute again with a different threshold, dividing the data into progressively narrower ranges of its values.
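
As a concrete illustration (a sketch assuming scikit-learn; the one-dimensional data set is made up), a real-valued feature whose class changes across several ranges forces the tree to test that same feature at different thresholds along one path:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# A single real-valued feature; the class flips across three ranges of x,
# so no single threshold can separate the classes.
x = np.arange(12).reshape(-1, 1)           # values 0..11
y = np.array([0] * 4 + [1] * 4 + [0] * 4)  # class 0, then 1, then 0 again

tree = DecisionTreeClassifier(random_state=0).fit(x, y)
print(export_text(tree, feature_names=["x"]))
# The printed tree tests "x" twice along the same path,
# once near 3.5 and once near 7.5, carving out the middle range.
```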

4.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Decision trees are an algorithm for which machine learning task?

Clustering

Classification

Classification and Regression

Dimensionality reduction

Regression
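
For context: CART-style decision trees handle both discrete labels and continuous targets. The sketch below assumes scikit-learn and uses synthetic data purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))

# Classification: leaves predict a discrete class label.
y_class = (X[:, 0] + X[:, 1] > 10).astype(int)
clf = DecisionTreeClassifier(max_depth=3).fit(X, y_class)
print(clf.predict([[2.0, 3.0]]))   # a class label such as [0]

# Regression: leaves predict the mean of the training targets they contain.
y_reg = 2.0 * X[:, 0] + rng.normal(0, 0.1, size=200)
reg = DecisionTreeRegressor(max_depth=3).fit(X, y_reg)
print(reg.predict([[2.0, 3.0]]))   # a continuous value near 2 * 2.0 = 4
```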

5.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

When a tree is significantly deep, what does it indicate?

The samples have a large number of attributes

The dataset is possibly noisy

The tree under-fits the training data

None of these

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Which of the following is true?

Deeper trees will always improve performance on the training data

Deeper trees will always improve performance when testing the model on unseen data

If a deeper tree improves performance on the training data, then it will also improve performance on new unseen data
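
The trade-off behind these statements can be seen empirically by comparing training and held-out accuracy as the allowed depth grows. This is a rough sketch assuming scikit-learn; the noisy synthetic data set and the depth values are arbitrary choices.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A noisy problem (flip_y adds label noise) that a very deep tree can memorise.
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for depth in (1, 3, 5, 10, None):    # None lets the tree grow until its leaves are pure
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
    print(depth, round(tree.score(X_tr, y_tr), 2), round(tree.score(X_te, y_te), 2))

# Training accuracy keeps climbing with depth, while held-out accuracy typically
# peaks at a moderate depth and then drops as the tree starts fitting the noise.
```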

7.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

For a binary classification problem with a balanced dataset, if one feature completely determines the class for each observation, what is the information gain from using this feature as the first node in a tree?

0

0.5

1

Not enough information
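
The arithmetic behind this question can be checked directly: a balanced binary data set has a parent entropy of 1 bit, and a feature that fully determines the class yields pure children with zero entropy. A minimal check in Python; the toy labels below are illustrative.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy in bits of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Balanced binary data set; the feature value matches the class exactly.
labels  = [0, 0, 0, 0, 1, 1, 1, 1]
feature = [0, 0, 0, 0, 1, 1, 1, 1]

parent = entropy(labels)                                  # 1.0 bit
left   = [y for y, f in zip(labels, feature) if f == 0]   # pure subset
right  = [y for y, f in zip(labels, feature) if f == 1]   # pure subset
children = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
print(parent - children)                                  # information gain = 1.0
```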