
Deep Learning quiz 2
Authored by Joshua Igoni
Mathematics, Science, Computers
1st - 12th Grade
Used 23+ times

12 questions
1.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Batch normalization helps to prevent which of the following?
Activations becoming too high or too low
Training speed becoming too slow
Both A and B
None
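A minimal NumPy sketch (not part of the quiz; names are illustrative) of what batch normalization does: each feature is rescaled to zero mean and unit variance over the batch, which keeps activations from drifting too high or too low.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature (column) across the batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
# Deliberately large, off-center pre-activations.
acts = rng.normal(loc=50.0, scale=10.0, size=(64, 8))
normed = batch_norm(acts)
print(normed.mean(axis=0).round(6))  # ~0 per feature
print(normed.std(axis=0).round(3))   # ~1 per feature
```

A full batch-norm layer would also learn a scale and shift per feature; this sketch shows only the normalization step the question is about.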
2.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Convolutional Neural Networks are used in?
Image classification
Text Classification
Computer vision
All of the above
3.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Which of the following neural networks has memory?
1D CNN
2D CNN
LSTM
None
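A NumPy sketch (illustrative, not from the quiz) of one LSTM step: the cell state `c` is carried from step to step and updated by the gates, which is the "memory" that plain feed-forward CNN layers lack.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    # W maps [x, h] to the four gate pre-activations stacked together.
    z = np.concatenate([x, h]) @ W + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
    g = np.tanh(g)                                # candidate values
    c_new = f * c + i * g                         # update the memory
    h_new = o * np.tanh(c_new)                    # expose part of it
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(size=(n_in + n_hid, 4 * n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):  # run a short sequence; c accumulates across steps
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(c.round(3))   # cell state built up over the whole sequence
```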
4.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
In neural networks, which of the following can cause the loss to decrease slowly?
Getting stuck at a local minimum
High regularization parameter
Low learning rate
All of the above
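A toy demonstration (illustrative, not from the quiz) of one of the causes above: minimizing f(w) = w² by gradient descent, where a very small learning rate leaves the loss high after the same number of steps.

```python
def descend(lr, steps=50, w=10.0):
    # Plain gradient descent on f(w) = w**2; its gradient is 2*w.
    for _ in range(steps):
        w -= lr * 2 * w
    return w * w  # final loss

slow = descend(lr=0.001)  # tiny learning rate: loss barely moves
fast = descend(lr=0.1)    # reasonable learning rate: loss near zero
print(f"lr=0.001 -> loss {slow:.2f}, lr=0.1 -> loss {fast:.8f}")
```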
5.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Which of the following activation functions cannot be used in the output layer of an image classification model?
ReLU
Softmax
Sigmoid
None
6.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Suppose you have a dataset in which you must predict one of three classes. Which of the following configurations should you use in the output layer?
Activation function = softmax, loss function = cross-entropy
Activation function = sigmoid, loss function = cross-entropy
Activation function = softmax, loss function = mean squared error
Activation function = sigmoid, loss function = mean squared error
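A NumPy sketch (values are made up) of the softmax + cross-entropy combination for a three-class problem: softmax turns the output layer's pre-activations into a probability distribution, and cross-entropy scores it against the true class.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # shift by the max for numerical stability
    return e / e.sum()

def cross_entropy(probs, target):
    # Negative log-probability assigned to the true class.
    return -np.log(probs[target])

logits = np.array([2.0, 0.5, -1.0])  # made-up output-layer pre-activations
probs = softmax(logits)
print(probs.round(3))                # a valid distribution over 3 classes
print(round(cross_entropy(probs, target=0), 3))  # small: class 0 is likely
```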
7.
MULTIPLE CHOICE QUESTION
1 min • 1 pt
Which of the following is true about bias?
Bias is inherent in any predictive model
Bias impacts the output of the neurons
Both A and B
None
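A single-neuron sketch (illustrative values) showing the second statement above: the bias term shifts the pre-activation, so it directly changes the neuron's output for the same inputs and weights.

```python
import numpy as np

def neuron(x, w, b):
    # Weighted sum plus bias, passed through a tanh activation.
    return np.tanh(np.dot(x, w) + b)

x = np.array([1.0, 2.0])
w = np.array([0.5, -0.25])
print(neuron(x, w, b=0.0))  # pre-activation is 0.0 here, so output is 0.0
print(neuron(x, w, b=1.0))  # same inputs and weights, different output
```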