Week2_S2

University • 10 Qs

Similar activities

QUIZ - CHAPTER 2 • University • 15 Qs
CS10337 - Lecture # 9 • University • 10 Qs
AverageRound • University • 10 Qs
EasyRound • University • 15 Qs
Business Intelligence Quiz • University • 10 Qs
Chapter 3 : SQL Command • University • 15 Qs
LibreOffice Writer Styles Quiz • 10th Grade - University • 15 Qs
Java_MCQ_3 • University • 15 Qs

Week2_S2

Assessment • Quiz • Information Technology (IT) • University • Practice Problem • Easy

Created by Samiratu Ntohsi

Used 3+ times


10 questions


1.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Why is generalization more important than training accuracy in neural networks?

Generalization proves convergence.

Training accuracy ensures bias reduction.

It prevents vanishing gradients.

It reflects the ability to predict unseen data, the true goal.
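One way to make the distinction concrete is to score a model separately on the data it was fit on and on data held back from training. The sketch below is purely illustrative and assumes scikit-learn is available; the dataset, model, and split are arbitrary choices, not part of the quiz. A large gap between the two scores signals memorization rather than generalization.

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Arbitrary toy dataset and small network, used only to contrast the two scores.
X, y = make_moons(n_samples=500, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print("training accuracy:", model.score(X_train, y_train))  # fit quality on seen data
print("held-out accuracy:", model.score(X_test, y_test))    # ability to predict unseen data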

2.

FILL IN THE BLANK QUESTION

1 min • 1 pt

A multilayer network without nonlinearities collapses into a ______ model.
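The usual argument behind this blank, written out as a short worked equation (the symbols are chosen here for illustration): composing two layers that apply only affine maps yields another affine map, so stacking such layers adds no expressive power.

y = W_2 (W_1 x + b_1) + b_2 = (W_2 W_1) x + (W_2 b_1 + b_2)

which has the same form as a single layer, y = W' x + b'.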

3.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

When using ReLU in hidden layers instead of sigmoid, which benefit typically emerges?

Guarantees linear separability in the input space.

Prevents exploding gradients.

Ensures all neurons remain active.

Reduces vanishing gradient problems.
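A small numeric check of the contrast this question points at (a sketch, not part of the quiz): the sigmoid derivative never exceeds 0.25 and becomes tiny for inputs of large magnitude, so products of such factors across many layers shrink, while ReLU passes a gradient of 1 wherever its input is positive.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])

sigmoid_grad = sigmoid(z) * (1.0 - sigmoid(z))  # at most 0.25, near zero for |z| large
relu_grad = (z > 0).astype(float)               # 1 for positive inputs, 0 otherwise

print("sigmoid'(z):", sigmoid_grad)  # roughly [0.018 0.197 0.25 0.197 0.018]
print("relu'(z):   ", relu_grad)     # [0. 0. 0. 1. 1.]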

4.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Which of the following statements about gradient descent is correct?

It always finds the global minimum.

It updates weights by moving against the gradient of the loss function.

It requires linear separability.

It is identical to the perceptron update rule.
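As a toy illustration of the update rule w ← w − η ∇L(w) (the loss and step size below are made up, not taken from the quiz): for L(w) = (w − 3)^2 the gradient is 2(w − 3), and repeatedly stepping in the opposite direction drives w toward the minimizer at 3; on non-convex losses the same rule only reaches a local minimum in general.

# Toy gradient descent on L(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = 0.0    # arbitrary starting weight
lr = 0.1   # learning rate (step size)
for step in range(50):
    grad = 2.0 * (w - 3.0)  # dL/dw at the current weight
    w = w - lr * grad       # step against the gradient
print(w)  # approximately 3.0, the minimizer of this convex toy loss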

5.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Why is nonlinearity essential in deep neural networks?

It guarantees zero error.

It reduces training time.

It allows the composition of layers to model complex, non-linear boundaries.

It simplifies the optimization problem.
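XOR is the classic concrete case for this point: no single linear boundary separates its classes, yet one hidden layer with a nonlinearity represents it exactly. The weights below are hand-picked for illustration; removing the ReLU would collapse the same network into a linear map (see question 2), which cannot fit XOR.

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hand-picked weights: XOR(x1, x2) = relu(x1 + x2) - 2 * relu(x1 + x2 - 1)
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    h1 = relu(x1 + x2)        # hidden unit 1
    h2 = relu(x1 + x2 - 1)    # hidden unit 2
    y = h1 - 2 * h2           # linear output layer
    print((x1, x2), "->", y)  # prints 0.0, 1.0, 1.0, 0.0, the XOR truth table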

6.

FILL IN THE BLANK QUESTION

1 min • 1 pt

Backpropagation uses the ______ rule to propagate gradients backward through layers.
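Written out for a single weight w feeding a unit with pre-activation z, activation a, and loss L (symbols chosen here for illustration, not from the quiz), the rule the blank refers to composes local derivatives:

∂L/∂w = (∂L/∂a) · (∂a/∂z) · (∂z/∂w)

Repeating this factorization layer by layer is what carries the gradient from the output back to the earliest weights.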

7.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

In a neural network, forward propagation refers to:

Feeding inputs through the network to generate predictions

Updating weights using gradient descent

Reversing gradients to find errors

Adjusting biases to prevent saturation
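A minimal numeric sketch of a forward pass (the shapes and random weights are made up for illustration): the input flows through an affine map and activation in each layer until the output layer emits predictions, and no parameters are changed along the way.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                      # batch of 4 inputs with 3 features each

W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)    # hidden-layer parameters
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)    # output-layer parameters

h = np.maximum(x @ W1 + b1, 0.0)                 # hidden layer: affine map + ReLU
y_hat = h @ W2 + b2                              # output layer: the predictions

print(y_hat.shape)  # (4, 1): one prediction per input, with no weight updates involved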

Access all questions and much more by creating a free account
