Which of the following are true? (Check all that apply.) Note that only the correct options are listed.

C1M3

Quiz • Information Technology (IT) • University • Medium
Abylai Aitzhanuly
10 questions
1.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
The tanh activation usually works better than sigmoid activation function for hidden units because the mean of its output is closer to zero, and so it centers the data better for the next layer. True/False?
TRUE
FALSE
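A minimal sketch of the point behind this question (assuming NumPy and roughly zero-centered pre-activations; the seed and sample size are made up): tanh outputs are approximately zero-mean, while sigmoid outputs cluster around 0.5.

import numpy as np

np.random.seed(0)
z = np.random.randn(10000)          # pre-activations roughly centered at 0

tanh_out = np.tanh(z)               # outputs in (-1, 1)
sigmoid_out = 1 / (1 + np.exp(-z))  # outputs in (0, 1)

print(tanh_out.mean())     # close to 0
print(sigmoid_out.mean())  # close to 0.5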
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You are building a binary classifier for recognizing cucumbers (y=1) vs. watermelons (y=0). Which one of these activation functions would you recommend using for the output layer?
ReLU
Leaky ReLU
sigmoid
tanh
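A short illustrative sketch (the logit values are hypothetical, not part of the quiz) of why sigmoid suits a binary output layer: it squashes any logit into (0, 1), so the result can be read as P(y=1) and thresholded at 0.5.

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

logits = np.array([-3.0, 0.0, 2.5])   # hypothetical output-layer pre-activations
probs = sigmoid(logits)               # roughly [0.047, 0.5, 0.924]
preds = (probs > 0.5).astype(int)     # threshold at 0.5 -> class labels

print(probs, preds)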
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Consider the following code:
A = np.random.randn(4, 3)
B = np.sum(A, axis=1, keepdims=True)
What will be B.shape?
B.shape = (1, 4)
B.shape = (3, 2)
B.shape = (4, 1)
B.shape = (4, 0)
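The snippet can be checked directly; a runnable version (only the NumPy import is added) shows that summing over axis=1 with keepdims=True keeps the reduced axis as size 1.

import numpy as np

A = np.random.randn(4, 3)
B = np.sum(A, axis=1, keepdims=True)   # sum across columns, keep the reduced axis

print(B.shape)  # (4, 1)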
6.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Suppose you have built a neural network. You decide to initialize the weights and biases to be zero. Which of the following statements are True? (Check all that apply)
Each neuron in the first hidden layer will perform the same computation. So even after multiple iterations of gradient descent each neuron in the layer will be computing the same thing as other neurons.
Each neuron in the first hidden layer will perform the same computation in the first iteration. But after one iteration of gradient descent they will learn to compute different things because we have “broken symmetry”.
Each neuron in the first hidden layer will compute the same thing, but neurons in different layers will compute different things, thus we have accomplished “symmetry breaking” as described in lecture.
The first hidden layer’s neurons will perform different computations from each other even in the first iteration; their parameters will thus keep evolving in their own way.
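A minimal sketch (a one-hidden-layer network with made-up sizes, not part of the quiz) showing that with all-zero weights every hidden unit computes the same activation, so the layer starts out perfectly symmetric.

import numpy as np

n_x, n_h = 3, 4                      # input size and hidden units, assumed for illustration
W1 = np.zeros((n_h, n_x))            # all-zero initialization
b1 = np.zeros((n_h, 1))

x = np.random.randn(n_x, 1)
a1 = np.tanh(W1 @ x + b1)            # every hidden unit outputs the same value (0 here)
print(a1.ravel())                    # [0. 0. 0. 0.] -> identical activations

# The gradient with respect to each row of W1 is also identical, so gradient
# descent updates every hidden unit the same way and symmetry is never broken.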
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Logistic regression’s weights w should be initialized randomly rather than to all zeros, because if you initialize to all zeros, then logistic regression will fail to learn a useful decision boundary because it will fail to “break symmetry”, True/False?
TRUE
FALSE
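A small sketch (the 2-feature dataset, learning rate, and iteration count are made up, not part of the quiz) of why the answer differs from the hidden-layer case: logistic regression has a single unit, there is no symmetry between neurons to break, and the gradient at w = 0 is already non-zero, so training from all zeros still learns.

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Tiny made-up dataset: 2 features, 4 examples
X = np.array([[0.5, 1.5, -1.0, -2.0],
              [1.0, 2.0, -0.5, -1.5]])
y = np.array([[1, 1, 0, 0]])

w = np.zeros((2, 1))   # all-zero initialization
b = 0.0
lr = 0.1

for _ in range(100):
    a = sigmoid(w.T @ X + b)           # predictions
    dw = X @ (a - y).T / X.shape[1]    # non-zero even at w = 0
    db = np.mean(a - y)
    w -= lr * dw
    b -= lr * db

print(w.ravel(), b)  # weights move away from zero and separate the classes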