
Deep Learning: Conv Nets
Authored by Josiah Wang
Mathematics
University

10 questions
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
How is shift invariance achieved in ConvNets?
Through convolutional equivariance
Through convolutional equivariance and approximate translation invariance with pooling
Through convolutional equivariance and exact pooling invariance
They exist in a higher dimensional invariant space
Answer explanation
The convolutional layers are shift equivariant. If an input image is shifted a little bit, the convolutional filters will produce the same response at the shifted location. The pooling layers are approximately shift invariant. For example, if an input image is shifted a little bit under a max pooling layer, the maximum value will still be the same. Overall, given an input image x, a shift S, a shift equivariant convolutional layer f, and a shift invariant pooling layer g, the ConvNet g(f(x)) is shift invariant because g(f(Sx)) = g(Sf(x)) = g(f(x)). (see Note02)
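The relation above can be checked numerically. Below is a minimal sketch, assuming PyTorch; the image size, random filter, shift amount and pooling window are arbitrary illustration choices, not taken from the course.
```python
import torch
import torch.nn as nn

torch.manual_seed(0)

conv = nn.Conv2d(1, 4, kernel_size=3, padding=1, bias=False)  # f: shift-equivariant layer
pool = nn.MaxPool2d(kernel_size=8)                            # g: approximately shift-invariant layer

x = torch.randn(1, 1, 32, 32)            # input image x
Sx = torch.roll(x, shifts=2, dims=-1)    # shifted image Sx (circular shift by 2 pixels)

with torch.no_grad():
    f_x, f_Sx = conv(x), conv(Sx)
    # Equivariance: f(Sx) == S f(x), exact away from the image border.
    print(torch.allclose(f_Sx[..., 4:-4],
                         torch.roll(f_x, 2, dims=-1)[..., 4:-4], atol=1e-5))
    # Approximate invariance after pooling: g(f(Sx)) is close to g(f(x)).
    print((pool(f_Sx) - pool(f_x)).abs().max())
```
The first check should print True (the convolutional response simply moves with the input), while the second typically prints a small but non-zero difference, matching the "approximate" invariance of pooling.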
2.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Why do we include dropout in the network architecture?
Offers regularization and helps build deeper networks
Can help with uncertainty estimation through Monte-Carlo use
Increases the capacity of the model
Prevents vanishing gradients
None of these
Answer explanation
Dropout randomly drops units (and their connections) in a neural network during training. It offers regularization by preventing neurons from co-adapting, i.e. relying on particular combinations of other features, which helps reduce overfitting. In addition, Monte Carlo dropout can be used as a Bayesian approximation to estimate uncertainties in neural networks (see https://arxiv.org/abs/1506.02142).
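A minimal sketch of both uses, assuming PyTorch; the layer sizes and number of Monte Carlo samples are arbitrary.
```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny network with dropout between layers (hypothetical sizes).
net = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Dropout(p=0.5),            # randomly zeroes activations during training
    nn.Linear(64, 1),
)

x = torch.randn(1, 10)

# Standard use: dropout is disabled at evaluation time.
net.eval()
print(net(x))

# Monte Carlo dropout: keep dropout active at test time and run several
# stochastic forward passes; the spread of the outputs gives an approximate
# uncertainty estimate (Gal & Ghahramani, 2016).
net.train()                       # keeps nn.Dropout active
samples = torch.stack([net(x) for _ in range(100)])
print(samples.mean(), samples.std())
```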
3.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Model Ensembling is:
Having multiple instances of the network(s) and averaging their responses together
Having a single instance of the network and passing the input through multiple times, altered in a small way each time
The perfect string quartet
None of the above
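As a minimal sketch of the first option (averaging the responses of several network instances), assuming PyTorch; the architecture and ensemble size are arbitrary, and the models here are only randomly initialised rather than independently trained.
```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical: three instances of the same architecture (in practice,
# each would be trained independently).
ensemble = [nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
            for _ in range(3)]

x = torch.randn(4, 10)

# Ensembling: average the (softmaxed) responses of the individual networks.
with torch.no_grad():
    probs = torch.stack([torch.softmax(m(x), dim=-1) for m in ensemble])
    averaged = probs.mean(dim=0)
print(averaged)
```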
4.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Which of the following activation functions helps with the vanishing gradients problem?
Sigmoid
Tanh
ReLU
SELU
Softmax
Answer explanation
Saturating activation functions such as the sigmoid and tanh have gradients in the range 0-1, with large regions of their input space yielding very small gradients. As backpropagation applies the chain rule, repeatedly taking the product of partial derivatives, the gradients passed back become vanishingly small. This does not occur with ReLUs, for example, as the gradient is either 0 or 1.
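A minimal sketch of the effect, assuming PyTorch; the depth, width and crude weight scaling are arbitrary illustration choices.
```python
import torch

torch.manual_seed(0)

def input_grad_norm(activation, depth=20, width=64):
    """Push a random vector through `depth` random layers and return the
    norm of the gradient that reaches the input."""
    x = torch.randn(1, width, requires_grad=True)
    h = x
    for _ in range(depth):
        w = torch.randn(width, width) / width ** 0.5   # rough unit-scale random weights
        h = activation(h @ w)
    h.sum().backward()
    return x.grad.norm().item()

for name, act in [("sigmoid", torch.sigmoid),
                  ("tanh", torch.tanh),
                  ("relu", torch.relu)]:
    print(name, input_grad_norm(act))
```
In a typical run, the sigmoid stack passes back a gradient that is many orders of magnitude smaller than the ReLU stack's.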
5.
MULTIPLE CHOICE QUESTION
45 sec • 1 pt
True or False. Two 3x3 convolutional layers have the same receptive field as one 5x5 convolutional layer, result in more non-linearities and require fewer weights.
True
False
Answer explanation
True. Two stacked 3x3 convolutional layers have an effective 5x5 receptive field and introduce two non-linearities instead of one. With C input and output channels, they use 2 x (3 x 3 x C x C) = 18C² weights, versus 25C² for a single 5x5 layer.
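A quick parameter count, assuming PyTorch; the channel width C = 64 is an arbitrary example.
```python
import torch.nn as nn

C = 64  # hypothetical number of input/output channels

two_3x3 = nn.Sequential(nn.Conv2d(C, C, 3, padding=1, bias=False), nn.ReLU(),
                        nn.Conv2d(C, C, 3, padding=1, bias=False), nn.ReLU())
one_5x5 = nn.Conv2d(C, C, 5, padding=2, bias=False)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(two_3x3), count(one_5x5))   # 18*C*C = 73728  vs  25*C*C = 102400
```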
6.
MULTIPLE CHOICE QUESTION
45 sec • 1 pt
What causes vanishing gradients?
The Wizard Merlin
Large changes in X cause small changes in Y
Large changes in Y cause small changes in X
ReLU activations 'dying'
Answer explanation
Vanishing gradients occur when the derivative of a function becomes very close to zero, meaning large changes in input (X) cause only small changes in output (Y). This is a problem as backpropagation is done by calculating the derivatives of the error with respect to the weights, so if the derivatives are very small, the parameters will barely change, and the error will remain.
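A minimal numeric illustration, assuming PyTorch; the input values are arbitrary points on a sigmoid.
```python
import torch

# In the saturated region of a sigmoid, large changes in the input x barely
# change the output y, so dy/dx is nearly zero and little gradient flows back.
x = torch.tensor([0.0, 5.0, 10.0], requires_grad=True)
y = torch.sigmoid(x)
y.sum().backward()
print(y)       # approx. 0.5, 0.993, 0.99995
print(x.grad)  # approx. 0.25, 0.0066, 0.000045
```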
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
True or False. SELUs are more likely to 'die' compared to ReLUs.
True
False
Answer explanation
ReLUs can 'die': when inactive (input below 0) they yield a gradient of 0, so no learning signal propagates through the deactivated unit, and weights will not be updated by any learning signal that was intended to pass through it. SELUs combat this problem because their gradient is non-zero even for negative inputs, so they always yield a learning signal.
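A minimal sketch, assuming PyTorch; the input range is arbitrary.
```python
import torch
import torch.nn.functional as F

x = torch.linspace(-3, 3, 7, requires_grad=True)

relu_grad = torch.autograd.grad(F.relu(x).sum(), x)[0]
selu_grad = torch.autograd.grad(F.selu(x).sum(), x)[0]

print(relu_grad)  # exactly 0 for negative inputs: no learning signal ("dead")
print(selu_grad)  # small but non-zero for negative inputs: signal still flows
```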