What is the "cache" used for in our implementation of forward propagation and backward propagation?

C1M4

Quiz • Information Technology (IT) • University • Medium
Abylai Aitzhanuly
10 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the "cache" used for in our implementation of forward propagation and backward propagation?
It is used to cache the intermediate values of the cost function during training.
We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
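For concreteness, here is a minimal sketch of the cache pattern, assuming numpy; linear_forward and linear_backward are illustrative names, not necessarily the course's exact code:

import numpy as np

def linear_forward(A_prev, W, b):
    # Compute the pre-activation and stash the inputs for the backward pass.
    Z = W @ A_prev + b
    cache = (A_prev, W, b)
    return Z, cache

def linear_backward(dZ, cache):
    # Unpack the values saved during forward propagation.
    A_prev, W, b = cache
    m = A_prev.shape[1]                          # number of examples
    dW = (dZ @ A_prev.T) / m                     # gradient w.r.t. W
    db = np.sum(dZ, axis=1, keepdims=True) / m   # gradient w.r.t. b
    dA_prev = W.T @ dZ                           # gradient for the previous layer
    return dA_prev, dW, db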
2.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Among the following, which ones are "hyperparameters"? (Check all that apply.)
learning rate α
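To make the distinction concrete, a small sketch (the names and values are illustrative assumptions): hyperparameters are set by the practitioner before training, while parameters such as W and b are learned by gradient descent.

# Hyperparameters: chosen before training, then tuned by experiment.
hyperparameters = {
    'learning_rate': 0.0075,           # alpha
    'num_iterations': 2500,
    'layer_dims': [12288, 4, 3, 2, 1]  # number of units n^[l] in each layer
}

# Parameters: initialized randomly, then updated by gradient descent,
# e.g. parameters['W1'], parameters['b1'], ...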
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following statements is true?
The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?
TRUE
FALSE
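A sketch of why the answer is False: vectorization removes the loop over training examples (all columns of X are processed at once), but an explicit loop over the layers remains, since each layer's input is the previous layer's output. Using ReLU in every layer is a simplification for this sketch; the function names are assumptions.

import numpy as np

def linear_activation_forward(A_prev, W, b):
    # One layer's forward step: linear part followed by a ReLU activation.
    Z = W @ A_prev + b
    A = np.maximum(0, Z)
    return A, (A_prev, W, b, Z)

def L_model_forward(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2   # two entries (W, b) per layer
    for l in range(1, L + 1):  # this loop over layers cannot be vectorized away
        A, cache = linear_activation_forward(
            A, parameters['W' + str(l)], parameters['b' + str(l)])
        caches.append(cache)
    return A, caches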
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Assume we store the values of n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?
for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)
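A self-contained, runnable version of that loop with shape checks (n_x = 5 is an arbitrary assumption for this sketch); W^[l] should have shape (n^[l], n^[l-1]) and b^[l] shape (n^[l], 1):

import numpy as np

layer_dims = [5, 4, 3, 2, 1]   # n_x = 5 chosen only for illustration
parameter = {}
for i in range(1, len(layer_dims)):
    # Small random weights break symmetry between units in the same layer.
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)
    assert parameter['W' + str(i)].shape == (layer_dims[i], layer_dims[i - 1])
    assert parameter['b' + str(i)].shape == (layer_dims[i], 1)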
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
For the architecture layer_dims = [n_x, 4, 3, 2, 1] from the previous question: the number of layers L is 4, and the number of hidden layers is 3 (the input layer is not counted in L, and the output layer is not a hidden layer).
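A worked check under that convention (a small sketch; n_x = 5 assumed for illustration):

layer_dims = [5, 4, 3, 2, 1]   # one input layer plus four layers of parameters
L = len(layer_dims) - 1        # L = 4: the layers holding W1..W4, b1..b4
num_hidden = L - 1             # 3 hidden layers; the final layer is the output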
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
During forward propagation, the forward function for a layer l needs to know which activation function that layer uses (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know which activation function layer l used, since the gradient depends on it. True/False?
TRUE
FALSE
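A minimal sketch of why this is True, using ReLU as the example activation: the backward function must apply the derivative of the same activation the forward function used, which it recovers from the cached Z.

import numpy as np

def relu_forward(Z):
    A = np.maximum(0, Z)
    cache = Z                  # the sign of Z determines the gradient later
    return A, cache

def relu_backward(dA, cache):
    Z = cache
    dZ = dA * (Z > 0)          # ReLU derivative: 1 where Z > 0, else 0
    return dZ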