iav ML Study Group Pop Quiz 2

1st - 2nd Grade

14 Qs

Similar activities

MTX1 • 1st Grade • 15 Qs
Pre-test Sandwiches • 1st - 12th Grade • 10 Qs
Math & Reading Night! • KG - 10th Grade • 10 Qs
BD Quiz Session 1 • 1st Grade - Professional Development • 10 Qs
MPs, Engagement Strategies, and Math Articulation PD • KG - 8th Grade • 12 Qs
RECALLING • 2nd - 3rd Grade • 16 Qs
Accuracy and Precision Science Lab • 1st - 3rd Grade • 13 Qs
VideoGames • 2nd Grade • 10 Qs

iav ML Study Group Pop Quiz 2

Assessment • Quiz
Subject: Other, Mathematics
Grade: 1st - 2nd
Difficulty: Hard
Created by Bob Balooey
Used 3+ times • FREE Resource

14 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

K-Nearest Neighbours, Random Forests and Gradient Boosting are all algorithms which...

...can only be used for Classification if there are fewer features than samples

...can only be used for Regression if there are fewer features than samples

...can and will be used for both Regression and Classification

...are only academic theories, and are not practical for real ML solutions
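A minimal sketch (assuming scikit-learn is available) illustrating the point of this question: each of these three algorithm families ships in both a classifier and a regressor variant, so the same methods serve both tasks.

```python
# Minimal sketch (assumes scikit-learn): each algorithm family has both
# a classifier and a regressor implementation.
from sklearn.datasets import make_classification, make_regression
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor
from sklearn.ensemble import (
    RandomForestClassifier, RandomForestRegressor,
    GradientBoostingClassifier, GradientBoostingRegressor,
)

X_cls, y_cls = make_classification(n_samples=200, n_features=10, random_state=0)
X_reg, y_reg = make_regression(n_samples=200, n_features=10, random_state=0)

# Classification variants
for model in (KNeighborsClassifier(), RandomForestClassifier(), GradientBoostingClassifier()):
    print(type(model).__name__, model.fit(X_cls, y_cls).score(X_cls, y_cls))

# Regression variants of the same families
for model in (KNeighborsRegressor(), RandomForestRegressor(), GradientBoostingRegressor()):
    print(type(model).__name__, model.fit(X_reg, y_reg).score(X_reg, y_reg))
```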

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

ROC stands for

Receiver Operating Characteristic

the mythical Roc, a giant bird from the legend of Sinbad the sailor

Regressive Orthogonal Curve

Random Omissions Cumulative
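A minimal sketch, assuming scikit-learn, of computing a ROC (Receiver Operating Characteristic) curve and its AUC from a classifier's scores; the curve traces the true-positive rate against the false-positive rate as the decision threshold varies.

```python
# Minimal sketch (assumes scikit-learn): ROC curve and AUC for a simple classifier.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Probability scores for the positive class
scores = LogisticRegression(max_iter=1000).fit(X_train, y_train).predict_proba(X_test)[:, 1]

fpr, tpr, thresholds = roc_curve(y_test, scores)
print("AUC:", roc_auc_score(y_test, scores))
```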

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

For a dataset with IMBALANCED classes, we would generally choose as our error metric:

Precision-Recall Curve, because we would expect a trade-off between precision and recall

ROC Curve, because it is accepted industry practice

Parabolic Curve, because the data is frequently quadratic

Straight-Line Entropy, because we want the most accurate measurement of error
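A minimal sketch, assuming scikit-learn, of building a precision-recall curve on a deliberately imbalanced dataset (the class proportions below are illustrative only):

```python
# Minimal sketch (assumes scikit-learn): on an imbalanced dataset the
# precision-recall curve is usually more informative than the ROC curve.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_curve, average_precision_score
from sklearn.model_selection import train_test_split

# ~5% positives: a heavily imbalanced binary problem
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

scores = LogisticRegression(max_iter=1000).fit(X_train, y_train).predict_proba(X_test)[:, 1]
precision, recall, thresholds = precision_recall_curve(y_test, scores)
print("Average precision:", average_precision_score(y_test, scores))
```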

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How do we determine the correct value of K for KNN?

Trick Question: KNN provides this value.

As a hyperparameter, K is extrinsic to the dataset and must be arrived at via an iterative series of experiments.

As a parameter, K is intrinsic to the dataset and can simply be read out of it.

K can be derived synthetically from other statistical measures like sigma.
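A minimal sketch, assuming scikit-learn, of that iterative series of experiments: a cross-validated search over candidate K values (the candidate grid below is arbitrary).

```python
# Minimal sketch (assumes scikit-learn): K (n_neighbors) is a hyperparameter,
# so we search candidate values with cross-validation rather than reading it
# out of the data.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, random_state=0)

search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7, 9, 15, 25]},
    cv=5,
)
search.fit(X, y)
print("Best K:", search.best_params_["n_neighbors"])
```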

5.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

We can describe upsampling and downsampling for imbalanced datasets as follows (pick all correct options):

Downsampling adds importance to the MINORITY class, sending recall up and precision down

Downsampling adds importance to the MAJORITY class, sending recall down and precision up

Upsampling will mitigate excessive weight on the MINORITY class. Recall will still be higher than precision, but the gap will lessen

Upsampling will mitigate excessive weight on the MAJORITY class. Precision will still be higher than recall, but the gap will lessen
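A minimal sketch of naive random upsampling and downsampling, assuming scikit-learn and NumPy; dedicated samplers exist in other packages, but plain resampling is enough to show the idea.

```python
# Minimal sketch (assumes scikit-learn + NumPy): random upsampling of the
# minority class and random downsampling of the majority class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.utils import resample

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_maj, y_maj = X[y == 0], y[y == 0]
X_min, y_min = X[y == 1], y[y == 1]

# Upsampling: draw minority samples with replacement until the classes match
X_min_up, y_min_up = resample(X_min, y_min, replace=True,
                              n_samples=len(y_maj), random_state=0)
X_up = np.vstack([X_maj, X_min_up])
y_up = np.concatenate([y_maj, y_min_up])

# Downsampling: keep only as many majority samples as there are minority ones
X_maj_dn, y_maj_dn = resample(X_maj, y_maj, replace=False,
                              n_samples=len(y_min), random_state=0)
X_dn = np.vstack([X_maj_dn, X_min])
y_dn = np.concatenate([y_maj_dn, y_min])

print("upsampled counts:", np.bincount(y_up), "downsampled counts:", np.bincount(y_dn))
```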

6.

FILL IN THE BLANK QUESTION

1 min • 1 pt

These are the four steps in blagging, possibly out-of-order:

1) BALANCE each sample by downsampling

2) MAJORITY vote

3) BOOTSTRAP samples from the population

4) LEARN a decision tree


To answer the question, put all 4 steps in order with no spaces in the box e.g. "6958"
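A minimal hand-rolled sketch of blagging (balanced bagging), assuming scikit-learn and NumPy; the numbered comments refer to the four steps listed above.

```python
# Minimal sketch (assumes scikit-learn + NumPy) of blagging:
# bootstrap, balance by downsampling, learn a tree, majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
rng = np.random.RandomState(0)
trees = []

for _ in range(25):
    # 3) BOOTSTRAP a sample from the population
    idx = rng.randint(0, len(y), size=len(y))
    Xb, yb = X[idx], y[idx]
    # 1) BALANCE the sample by downsampling the majority class
    Xb_maj, yb_maj = Xb[yb == 0], yb[yb == 0]
    Xb_min, yb_min = Xb[yb == 1], yb[yb == 1]
    Xb_maj, yb_maj = resample(Xb_maj, yb_maj, replace=False,
                              n_samples=len(yb_min), random_state=rng)
    X_bal = np.vstack([Xb_maj, Xb_min])
    y_bal = np.concatenate([yb_maj, yb_min])
    # 4) LEARN a decision tree on the balanced sample
    trees.append(DecisionTreeClassifier(random_state=0).fit(X_bal, y_bal))

# 2) MAJORITY vote across all trees
votes = np.stack([t.predict(X) for t in trees])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("predicted positives:", y_pred.sum(), "actual positives:", y.sum())
```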

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The broadest and most flexible ensemble method, which mixes an arbitrary number of models without any limits on their types, is called:

Piling

Stacking

Chaining

Combining
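A minimal sketch of stacking, assuming scikit-learn's StackingClassifier: heterogeneous base models are fitted and their predictions become the inputs of a final meta-estimator.

```python
# Minimal sketch (assumes scikit-learn): stacking mixes base models of
# different types and trains a meta-estimator on their outputs.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier()),
        ("rf", RandomForestClassifier(random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)
print("training accuracy (illustrative only):", stack.fit(X, y).score(X, y))
```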
