
C1M4

Authored by Abylai Aitzhanuly

Information Technology (IT)

University



10 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the "cache" used for in our implementation of forward propagation and backward propagation?

It is used to cache the intermediate values of the cost function during training.

We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.

It is used to keep track of the hyperparameters that we are searching over, to speed up computation.

We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
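For intuition, here is a minimal sketch (not part of the quiz; the function names and shapes are illustrative assumptions) of a cache carrying forward-pass values to the matching backward step:

import numpy as np

def linear_forward(A_prev, W, b):
    # Forward step: compute Z and stash the inputs in a cache,
    # since the backward step will need them to compute gradients.
    Z = W @ A_prev + b
    cache = (A_prev, W, b)
    return Z, cache

def linear_backward(dZ, cache):
    # Backward step: unpack the values saved during forward
    # propagation and use them to compute the derivatives.
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db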

2.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Among the following, which ones are "hyperparameters"? (Check all that apply.)

learning rate α

[The remaining answer options were rendered as images.]
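As a reminder of the distinction this question tests, a short sketch; the specific values and dictionary keys are illustrative assumptions, not from the quiz:

import numpy as np

# Hyperparameters: chosen by the practitioner before training and
# tuned from outside the learning algorithm.
hyperparameters = {
    "learning_rate": 0.0075,             # alpha
    "num_iterations": 2500,
    "layer_dims": [12288, 20, 7, 5, 1],  # layer sizes n^[l], hence also L
}

# Parameters: learned by gradient descent during training,
# e.g. the weight matrices W^[l] and bias vectors b^[l].
parameters = {
    "W1": np.random.randn(20, 12288) * 0.01,
    "b1": np.zeros((20, 1)),
}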

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following statements is true?

The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.

The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?

TRUE

FALSE
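A minimal sketch of vectorized forward propagation, assuming ReLU activations and parameters stored under keys "W1", "b1", … (assumptions for this sketch). In a typical implementation, the matrix products are vectorized over all examples and units at once, yet an explicit loop over the layers remains, because layer l's input is layer l-1's output:

import numpy as np

def forward_propagation(X, parameters, L):
    A = X
    for l in range(1, L + 1):
        # Vectorized over all training examples and all units at once.
        Z = parameters["W" + str(l)] @ A + parameters["b" + str(l)]
        A = np.maximum(0, Z)  # ReLU
    return A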

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Assume we store the values for n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?

[Three answer options rendered as images.]

for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)
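For reference, a self-contained, runnable version of this loop; the concrete value of n_x is an illustrative assumption, since the quiz leaves it symbolic:

import numpy as np

n_x = 5  # illustrative input size
layer_dims = [n_x, 4, 3, 2, 1]

parameter = {}
for i in range(1, len(layer_dims)):
    # W^[i] has shape (n^[i], n^[i-1]); b^[i] has shape (n^[i], 1).
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1)

for name, value in sorted(parameter.items()):
    print(name, value.shape)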

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Question rendered as an image.]

The number of layers L is 4. The number of hidden layers is 2.

[The remaining answer options were rendered as images.]

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

During forward propagation, the forward function for a layer l needs to know which activation function the layer uses (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know which activation function layer l uses, since the gradient depends on it. True/False?

TRUE

FALSE
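A minimal sketch of why the backward step depends on the layer's activation; the function name and string dispatch are illustrative assumptions, not the quiz's implementation:

import numpy as np

def activation_backward(dA, Z, activation):
    # The gradient through the non-linearity depends on which
    # activation g was applied to Z in the forward pass.
    if activation == "relu":
        return dA * (Z > 0)                # g'(Z) = 1 where Z > 0, else 0
    elif activation == "sigmoid":
        s = 1 / (1 + np.exp(-Z))
        return dA * s * (1 - s)            # g'(Z) = s(Z)(1 - s(Z))
    elif activation == "tanh":
        return dA * (1 - np.tanh(Z) ** 2)  # g'(Z) = 1 - tanh(Z)^2
    raise ValueError(f"unknown activation: {activation}")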
