GDSC AI ML Session

University • 10 Qs

Similar activities

04 - Supervised Machine Learning - Classification (University - Professional Development, 7 Qs)

Quiz on Gradient Boosting and Ensemble Methods (University, 7 Qs)

QUIZ1 (University, 6 Qs)

Intro to Machine Learning Quiz (University, 10 Qs)

WS2324 S2 & S10 Formative Assessment (University, 15 Qs)

Predictive Analytics (University, 10 Qs)

Computer Science Fundamentals - Review And Reflect (7th Grade - Professional Development, 8 Qs)

Algorithm Quiz: Which algorithm is the best fit? (University, 8 Qs)

GDSC AI ML Session

Assessment • Quiz • Computers • University • Hard

Created by Shashank Srivastava • Used 3+ times

10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which icon represents Seaborn correctly from the options provided?

(The answer options are images and are not recoverable from this export.)

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In decision tree pruning, what is the purpose of the tuning parameter (often denoted as alpha or ccp_alpha)?

To control the learning rate of the decision tree

To balance the trade-off between tree complexity and impurity reduction

To adjust the minimum number of samples required to split a node

To set the maximum depth of the decision tree
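The ccp_alpha parameter mentioned in this question is scikit-learn's knob for minimal cost-complexity pruning: a larger alpha prunes more aggressively, trading impurity reduction for a simpler tree. A minimal sketch (the dataset and alpha value are illustrative, not from the quiz):

```python
# Sketch: cost-complexity pruning via scikit-learn's ccp_alpha.
# Larger alpha removes more subtrees, so the pruned tree can only
# have the same number of leaves or fewer than the unpruned one.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(X, y)
pruned = DecisionTreeClassifier(ccp_alpha=0.02, random_state=0).fit(X, y)

print(unpruned.get_n_leaves(), pruned.get_n_leaves())
```

Pruning never adds structure, so the second count is at most the first.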

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

When growing a decision tree, the algorithm selects the best feature to split on at each node. What criterion is commonly used to measure the "best" split?

Mean Squared Error (MSE)

Gini impurity

Accuracy

Information gain
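The Gini criterion named in the options can be computed directly: for a node with class proportions p_k, the impurity is 1 − Σ p_k². A minimal NumPy sketch (the helper name and sample labels are illustrative):

```python
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum(p_k^2) over the class proportions p_k.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

# A pure node has impurity 0; a 50/50 binary node has impurity 0.5.
print(gini([1, 1, 1, 1]))   # 0.0
print(gini([0, 1, 0, 1]))   # 0.5
```

A split is "best" when it most reduces the children's weighted impurity relative to the parent.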

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You are tasked with building a decision tree for a binary classification problem. After constructing the tree, you notice that it has a depth of 25, and the training accuracy is 100%. However, when you evaluate the model on a separate test set, the accuracy drops significantly. What is the most likely reason for this discrepancy?

The model suffered from data leakage during training.

The tree is too shallow to capture the complexity of the data.

The model has overfit the training data.

The test set is not representative of the training data.
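The overfitting pattern this question describes is easy to reproduce: an unconstrained tree typically memorizes noisy training data while a depth-limited tree generalizes better. A sketch with an illustrative synthetic dataset (flip_y injects label noise; the depth limit is arbitrary):

```python
# Sketch: deep trees memorize; a depth cap reduces overfitting.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print(deep.score(X_tr, y_tr))  # typically 1.0: memorized, noise and all
print(deep.score(X_te, y_te), shallow.score(X_te, y_te))
```

The gap between the deep tree's train and test accuracy is the signature of overfitting described in the question.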

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Explain the concept of decision trees in machine learning and how they handle feature selection during the learning process.

Features are ranked based on importance, and the tree chooses the best feature for splitting

Decision trees select features randomly at each node to promote diversity

Decision trees use all features during each split to maximize information gain

Decision trees only consider the target variable for feature selection

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the term "dropout" in the training process of neural networks, and how does it contribute to model generalization?

Dropout refers to removing irrelevant features during model training to reduce complexity

It is a technique for randomly deactivating neurons during training to prevent overfitting

Dropout is a regularization method specifically applied to convolutional neural networks

It denotes the gradual decrease in learning rate over epochs for stable convergence
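The "randomly deactivating neurons" idea can be sketched as inverted dropout in plain NumPy; the function name, rate, and array shapes below are illustrative, not from any particular framework:

```python
import numpy as np

def dropout(activations, p=0.5, rng=None, training=True):
    # Inverted dropout: zero each unit with probability p during
    # training and rescale survivors by 1/(1-p), so the expected
    # activation is unchanged. At inference time it is a no-op.
    if not training:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

a = np.ones(10_000)
out = dropout(a, p=0.5, rng=np.random.default_rng(0))
print(out.mean())  # close to 1.0 in expectation
```

Because each training pass samples a different mask, no single neuron can be relied on, which is what discourages co-adaptation and overfitting.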

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary advantage of using Random Forest over a single Decision Tree in terms of predictive performance?

Reduced bias

Lower variance

Improved interpretability

Faster training time
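The variance-reduction claim behind this question can be checked empirically: a forest averages many decorrelated trees, which tends to stabilize predictions relative to one deep tree. A sketch with an illustrative noisy dataset (sizes and estimator counts are arbitrary):

```python
# Sketch: compare a single tree and a Random Forest under 5-fold CV.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, flip_y=0.2, random_state=0)

tree_scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=5)
forest_scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0), X, y, cv=5
)

print(tree_scores.mean(), forest_scores.mean())
```

On noisy data the forest's cross-validated accuracy is usually higher and its fold-to-fold spread smaller, reflecting lower variance rather than lower bias.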
