
GDSC AI ML Session

Authored by Shashank Srivastava

Computers

University




10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following icons is the Seaborn logo?

(The five answer options are library-logo images, not reproduced here.)

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In decision tree pruning, what is the purpose of the tuning parameter (often denoted as alpha or ccp_alpha)?

To control the learning rate of the decision tree

To balance the trade-off between tree complexity and impurity reduction

To adjust the minimum number of samples required to split a node

To set the maximum depth of the decision tree
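The trade-off in the correct answer can be sketched numerically. In cost-complexity pruning (the scheme behind scikit-learn's `ccp_alpha`), each subtree T is scored as R_alpha(T) = R(T) + alpha * |leaves(T)|, where R(T) is the total leaf impurity. The numbers below are illustrative only, not from any real tree:

```python
# Cost-complexity score: total leaf impurity plus a complexity penalty.
# Larger alpha penalizes trees with more leaves, favoring smaller trees.
def cost_complexity(leaf_impurities, alpha):
    return sum(leaf_impurities) + alpha * len(leaf_impurities)

# A deep tree with many near-pure leaves vs. a shallow tree with two
# impurer leaves (hypothetical impurity values for illustration):
deep    = [0.00, 0.00, 0.05, 0.00, 0.10, 0.00]   # 6 leaves
shallow = [0.20, 0.25]                            # 2 leaves

# With alpha = 0 only impurity matters, so the deep tree scores better:
assert cost_complexity(deep, 0.0) < cost_complexity(shallow, 0.0)
# With alpha = 0.1 the complexity penalty flips the preference:
assert cost_complexity(deep, 0.1) > cost_complexity(shallow, 0.1)
```

Sweeping alpha from 0 upward traces out a sequence of progressively smaller pruned trees, which is exactly what tuning `ccp_alpha` explores.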

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

When growing a decision tree, the algorithm selects the best feature to split on at each node. What criterion is commonly used to measure the "best" split?

Mean Squared Error (MSE)

Gini impurity

Accuracy

Information gain
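Both Gini impurity and information gain are accepted criteria here (scikit-learn defaults to Gini; information gain uses entropy instead). A minimal sketch of scoring a candidate split by weighted Gini impurity:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_k^2) over class proportions p_k."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gini(left, right):
    """Weighted Gini impurity of a candidate split (lower is better)."""
    n = len(left) + len(right)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# The parent node is a 50/50 mix of two classes:
assert gini([0, 0, 0, 1, 1, 1]) == 0.5
# A perfect split leaves both children pure:
assert split_gini([0, 0, 0], [1, 1, 1]) == 0.0
# A mixed split scores worse (higher weighted impurity):
assert split_gini([0, 0, 1], [0, 1, 1]) > split_gini([0, 0, 0], [1, 1, 1])
```

The algorithm evaluates every candidate split this way and greedily picks the one with the lowest weighted impurity (equivalently, the largest impurity reduction).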

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You are tasked with building a decision tree for a binary classification problem. After constructing the tree, you notice that it has a depth of 25, and the training accuracy is 100%. However, when you evaluate the model on a separate test set, the accuracy drops significantly. What is the most likely reason for this discrepancy?

The model suffered from data leakage during training.

The tree is too shallow to capture the complexity of the data.

The model has overfit the training data.

The test set is not representative of the training data.
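The overfitting in the correct answer can be caricatured with a toy experiment: a depth-25 tree with 100% training accuracy is effectively a lookup table with one leaf per training point. When the labels carry no learnable signal (pure noise here, by construction), memorization still yields perfect training accuracy but near-chance test accuracy:

```python
import random
random.seed(0)

# Labels are random noise, so there is nothing to generalize from.
train = [(i, random.randint(0, 1)) for i in range(100)]
test  = [(i + 100, random.randint(0, 1)) for i in range(100)]

# A fully-grown tree that memorizes behaves like a dict keyed on inputs:
memo = dict(train)
predict = lambda x: memo.get(x, 0)  # unseen inputs fall back to a default class

train_acc = sum(predict(x) == y for x, y in train) / len(train)
test_acc  = sum(predict(x) == y for x, y in test) / len(test)

assert train_acc == 1.0      # perfect memorization of the training set
assert test_acc < train_acc  # accuracy collapses on held-out data
```

Constraining depth, requiring a minimum number of samples per split, or pruning all limit this memorization and narrow the train/test gap.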

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How do decision trees handle feature selection during the learning process?

Features are ranked based on importance, and the tree chooses the best feature for splitting

Decision trees select features randomly at each node to promote diversity

Decision trees use all features during each split to maximize information gain

Decision trees only consider the target variable for feature selection
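The correct option above says the tree greedily chooses the single best feature at each node, rather than splitting on all features or choosing randomly. A minimal sketch with a hand-built dataset where one binary feature is perfectly informative and the other is noise:

```python
from collections import Counter

def gini(labels):
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_quality(rows, labels, feature):
    """Weighted Gini after splitting on a binary feature (lower is better)."""
    left  = [y for x, y in zip(rows, labels) if x[feature] == 0]
    right = [y for x, y in zip(rows, labels) if x[feature] == 1]
    n = len(labels)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# Feature 0 perfectly predicts the label; feature 1 is uninformative.
rows   = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 1, 1]

best = min(range(2), key=lambda f: split_quality(rows, labels, f))
assert best == 0  # the greedy criterion selects the informative feature
```

Repeating this selection recursively on each child node is how the tree effectively ranks and consumes features during learning.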

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the term "dropout" in the training process of neural networks, and how does it contribute to model generalization?

Dropout refers to removing irrelevant features during model training to reduce complexity

It is a technique for randomly deactivating neurons during training to prevent overfitting

Dropout is a regularization method specifically applied to convolutional neural networks

It denotes the gradual decrease in learning rate over epochs for stable convergence
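The correct option describes standard (inverted) dropout: during training, each unit is zeroed with probability p and the survivors are scaled by 1/(1 - p) so the expected activation is unchanged; at inference, dropout is disabled. A stdlib-only sketch of the mechanism (frameworks like PyTorch's `nn.Dropout` implement the same idea):

```python
import random
random.seed(1)

def dropout(activations, p):
    """Inverted dropout: drop each unit with prob p, rescale survivors
    by 1/(1-p) so the expected value of each activation is preserved.
    Applied only during training, never at inference."""
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0 for a in activations]

acts = [1.0] * 10000
out = dropout(acts, p=0.5)

dropped_frac = sum(1 for a in out if a == 0.0) / len(out)
mean = sum(out) / len(out)

assert 0.4 < dropped_frac < 0.6   # roughly half the units deactivated
assert abs(mean - 1.0) < 0.1      # expectation preserved by the 1/(1-p) scale
```

Because a different random subnetwork is trained at each step, no single unit can be relied on exclusively, which is what discourages overfitting and improves generalization.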

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary advantage of using Random Forest over a single Decision Tree in terms of predictive performance?

Reduced bias

Lower variance

Improved interpretability

Faster training time
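The variance reduction behind the correct answer follows from averaging: the mean of n independent unbiased estimators has variance sigma^2 / n. A toy simulation, modeling each tree as an unbiased but noisy predictor (real forests' trees are correlated via shared data, so bagging and feature subsampling achieve somewhat less than this idealized 1/n reduction):

```python
import random
from statistics import pvariance
random.seed(2)

TRUE_VALUE = 0.0  # the quantity every tree tries to predict

def noisy_tree():
    """One tree: unbiased, high-variance prediction (sigma = 1)."""
    return TRUE_VALUE + random.gauss(0, 1)

def forest(n_trees):
    """Average the trees; ideally variance shrinks toward sigma^2 / n."""
    return sum(noisy_tree() for _ in range(n_trees)) / n_trees

single   = [noisy_tree() for _ in range(2000)]
ensemble = [forest(25) for _ in range(2000)]

# The 25-tree ensemble is far less variable than any single tree:
assert pvariance(ensemble) < pvariance(single) / 5
```

This is why Random Forest predictions are more stable than a single deep tree's, at the cost of the single tree's interpretability and speed.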
