
C1M4

Quiz • Information Technology (IT) • University • Medium
Abylai Aitzhanuly
10 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the "cache" used for in our implementation of forward propagation and backward propagation?
It is used to cache the intermediate values of the cost function during training.
We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
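A minimal sketch of the idea behind the cache, assuming a single ReLU layer; the helper names linear_activation_forward and linear_activation_backward are illustrative, not the course's exact functions:

    import numpy as np

    def linear_activation_forward(A_prev, W, b):
        # Forward step for one layer: compute the linear part, apply ReLU,
        # and stash the values the backward step will need in a cache.
        Z = W @ A_prev + b
        A = np.maximum(0, Z)
        cache = (A_prev, W, Z)
        return A, cache

    def linear_activation_backward(dA, cache):
        # Backward step reads the cached forward values to compute derivatives.
        A_prev, W, Z = cache
        m = A_prev.shape[1]
        dZ = dA * (Z > 0)                           # ReLU derivative
        dW = (dZ @ A_prev.T) / m
        db = np.sum(dZ, axis=1, keepdims=True) / m
        dA_prev = W.T @ dZ
        return dA_prev, dW, db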
2.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
Among the following, which ones are "hyperparameters"? (Check all that apply.)
learning rate α
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following statements is true?
The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?
TRUE
FALSE
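A sketch of vectorized forward propagation, assuming a parameters dictionary keyed 'W1', 'b1', … as in question 5; it illustrates that vectorization removes loops within a layer, while the layers themselves are still visited in sequence:

    import numpy as np

    def L_model_forward(X, parameters, L):
        # Each layer's computation is fully vectorized (one matrix product
        # per layer, no loop over examples or units), but the layers
        # are still traversed with an explicit for-loop.
        A = X
        for l in range(1, L + 1):
            W = parameters['W' + str(l)]
            b = parameters['b' + str(l)]
            Z = W @ A + b
            A = np.tanh(Z)      # illustrative activation for every layer
        return A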
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Assume we store the values for n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?
for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)
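For reference, a self-contained run of that loop; n_x = 5 and the random seed are placeholders chosen only so the snippet executes:

    import numpy as np

    np.random.seed(0)                 # reproducibility (illustrative)
    layer_dims = [5, 4, 3, 2, 1]      # n_x = 5 is a made-up input size

    parameter = {}
    for i in range(1, len(layer_dims)):
        parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
        parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)

    print(parameter['W1'].shape)      # (4, 5): maps n_x inputs to 4 units
    print(parameter['b3'].shape)      # (2, 1)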
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
The number of layers L is 4. The number of hidden layers is 2.
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
During forward propagation, in the forward function for a layer l you need to know which activation function is used in that layer (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know which activation function was used for layer l, since the gradient depends on it. True/False?
TRUE
FALSE
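A sketch of why the backward step must know the layer's activation: dZ = dA * g'(Z), and the derivative g' differs per activation. The helper activation_backward is illustrative, not the assignment's API:

    import numpy as np

    def activation_backward(dA, Z, activation):
        # Dispatch on the activation used in the forward pass, since
        # each one has a different derivative g'(Z).
        if activation == 'relu':
            return dA * (Z > 0)
        if activation == 'sigmoid':
            s = 1 / (1 + np.exp(-Z))
            return dA * s * (1 - s)
        if activation == 'tanh':
            return dA * (1 - np.tanh(Z) ** 2)
        raise ValueError('unknown activation: ' + activation)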