Intro to ML: The ML Revision Quiz

Quiz • Computers • University • Hard

Josiah Wang
11 questions
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
If we predict every observation to be True, what will our model precision be?
100%
0%
The proportion of True values in the dataset
Not enough information
Answer explanation
Think of all the False Positives: precision = TP / (TP + FP). If every observation is predicted True, every actual negative becomes a false positive, so precision equals the proportion of True values in the dataset.
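As a quick numerical illustration (the labels and the 30% positive rate below are invented), predicting True for everything makes precision collapse to the proportion of True values:

```python
import numpy as np

# Invented labels: 1 = True, 0 = False, with roughly 30% positives.
rng = np.random.default_rng(0)
y_true = (rng.random(1000) < 0.3).astype(int)

# Predict True for every observation.
y_pred = np.ones_like(y_true)

tp = np.sum((y_pred == 1) & (y_true == 1))   # true positives
fp = np.sum((y_pred == 1) & (y_true == 0))   # false positives

precision = tp / (tp + fp)
print(precision)        # roughly 0.3
print(y_true.mean())    # proportion of True labels: the same value
```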
2.
MULTIPLE SELECT QUESTION
1 min • 1 pt
James, Amelia, and George are participating in a machine learning competition. They have to choose an algorithm for their project. Select which of the following algorithms they should consider if they want to use eager learners:
K-nearest neighbours
Decision trees
Neural networks
Linear regression
Answer explanation
Recall that K-NN is a lazy learner. At training time the algorithm simply stores the training data, i.e. no calculations/training occur. It is not until inference time that the algorithm finds the K nearest points to the unseen datapoint in question. All calculations occur at inference time, hence it is called lazy rather than eager. Decision trees, neural networks and linear regression all build their model at training time, so they are eager learners.
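A rough sketch of the difference, assuming scikit-learn and an invented toy dataset: the decision tree does its work when fit is called, whereas K-NN's fit essentially just stores the data and defers the work to predict.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

# Invented toy data: two noisy clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Eager learner: the tree structure is built here, at training time.
tree = DecisionTreeClassifier().fit(X, y)

# Lazy learner: fit() essentially just stores X and y; the neighbour
# search only happens later, when predict() is called on unseen points.
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

X_new = np.array([[1.5, 1.5]])
print(tree.predict(X_new), knn.predict(X_new))
```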
3.
MULTIPLE SELECT QUESTION
1 min • 1 pt
Which of the following statements are True:
Performance on the validation set can be used to see if a model is overfitting to the training data
We cannot tell from the training performance alone if a model is overfitting or not
Underfitting implies better generalisation to other datasets
Answer explanation
Underfitting is when the model lacks the capacity to fit the underlying pattern/trend of the data. A model that underfits the training set will perform no better on unseen data, so underfitting does not imply better generalisation.
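One way to see the first two statements in practice, sketched here with scikit-learn on invented synthetic data: an unconstrained decision tree reaches near-perfect training accuracy, and only the gap to the validation accuracy reveals the overfitting.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, noisy binary classification data (invented for illustration).
X, y = make_classification(n_samples=500, n_features=20, flip_y=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# An unconstrained tree can memorise the training set.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

train_acc = model.score(X_train, y_train)
val_acc = model.score(X_val, y_val)

# Training accuracy alone (close to 1.0) looks fine; the gap to the
# validation accuracy is what exposes the overfitting.
print(f"train={train_acc:.2f}  val={val_acc:.2f}")
```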
4.
MULTIPLE SELECT QUESTION
1 min • 1 pt
Scarlett is working on a machine learning project and she is worried about underfitting. Which of the following actions may cause underfitting in her model?
Reducing the max. depth of a decision tree
Increasing the value of K in K-nn
Adding more layers to a neural network
Increasing the size of the training data
Increasing the value of K in K-means
Answer explanation
Underfitting occurs when the model lacks the capacity to fit the underlying trend/pattern of the data. Reducing the maximum depth of a decision tree or increasing K in K-NN both reduce the model's flexibility, which can cause underfitting.
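A small sketch of how reduced capacity shows up, assuming scikit-learn and an invented two-moons dataset: shrinking a tree's maximum depth or using a very large K leaves the model too simple to fit even the training data.

```python
from sklearn.datasets import make_moons
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

# Invented non-linear toy problem.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)

# Shrinking the tree (max_depth=1) leaves it too simple for the moons shape.
for depth in (None, 1):
    acc = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X, y).score(X, y)
    print(f"max_depth={depth}: train accuracy {acc:.2f}")

# A very large K averages over most of the dataset, also underfitting.
for k in (1, 250):
    acc = KNeighborsClassifier(n_neighbors=k).fit(X, y).score(X, y)
    print(f"K={k}: train accuracy {acc:.2f}")
```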
5.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
True or False:
If we use grid-search for testing different hyper-parameter values, we can use each of these results for finding the confidence interval of the model error.
True
False
Answer explanation
Confidence intervals should be reported for the final model architecture. They are a prediction of how the final model will perform on unseen data. If the interval were computed from the results of a range of different models (one per hyper-parameter setting), it would clearly be an inaccurate prediction for the chosen model.
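For the final model, one common way to report such an interval is the normal-approximation (binomial) confidence interval on the held-out error; the counts below are invented for illustration.

```python
import math

# Suppose the final, chosen model misclassifies 30 of 200 held-out examples
# (numbers invented for illustration).
n = 200
errors = 30
error_rate = errors / n

# 95% normal-approximation confidence interval for the error rate.
z = 1.96
half_width = z * math.sqrt(error_rate * (1 - error_rate) / n)
print(f"error = {error_rate:.3f} +/- {half_width:.3f}")
```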
6.
MULTIPLE SELECT QUESTION
1 min • 1 pt
Which of the following algorithms will change given different random seeds:
Neural networks
K-nearest neighbours (K = 1, with no ties)
Decision trees
K-means
Evolution Algorithms using simple tournament
Answer explanation
Think about which methods are deterministic. Neural networks (random weight initialisation), K-means (random initial centroids) and evolutionary algorithms (random tournament selection) all depend on the random seed, whereas 1-NN with no ties and standard decision-tree induction are deterministic.
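A minimal sketch, assuming scikit-learn and invented data: K-means depends on its random centroid initialisation, whereas 1-NN prediction is deterministic once the training data is stored.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

# Invented toy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)

# K-means: different seeds give different random initial centroids,
# so the learned centroids (and the resulting clustering) can differ.
for seed in (0, 1):
    km = KMeans(n_clusters=3, n_init=1, random_state=seed).fit(X)
    print(np.round(km.cluster_centers_, 2))

# 1-NN with no ties: fit just stores the data and predict is deterministic,
# so the random seed has no effect.
knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(knn.predict(X[:5]))
```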
7.
MULTIPLE SELECT QUESTION
1 min • 1 pt
Which of the statements below correctly describe the differences between Gradient Descent, Stochastic Gradient Descent and Mini-batch Gradient Descent:
Gradient Descent is faster to compute than Stochastic Gradient Descent
Stochastic Gradient Descent is faster to compute than Mini-batched Gradient Descent
There is less noise in the gradients when using Mini-batched Gradient Descent compared to Stochastic Gradient Descent
Answer explanation
Gradient descent: the gradients are calculated and a step is taken based on the whole training set. This is computationally heavy, as opposed to calculating the gradients and updating the parameters based on a single sample. Stochastic gradient descent, in turn, produces a very noisy learning signal, as the gradient is sensitive to the variability of each individual datapoint. The learning signal can be smoothed out by sampling groups of datapoints in mini-batch gradient descent, where the gradient is averaged over the datapoints in the batch.
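A minimal NumPy sketch of the three schemes on an invented linear-regression problem; the only difference between them is the batch size used for each parameter update.

```python
import numpy as np

# Invented data: y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(256, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.1, size=256)

def grad(w, b, Xb, yb):
    """Gradient of the mean squared error over the given batch."""
    err = Xb[:, 0] * w + b - yb
    return (2 * err * Xb[:, 0]).mean(), (2 * err).mean()

def train(batch_size, lr=0.1, epochs=50):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            gw, gb = grad(w, b, X[batch], y[batch])
            w -= lr * gw
            b -= lr * gb
    return w, b

print("full batch (GD):      ", train(batch_size=len(X)))  # one smooth update per epoch
print("single sample (SGD):  ", train(batch_size=1))       # many noisy updates
print("mini-batch (size 32): ", train(batch_size=32))      # a compromise between the two
```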