Week2_S2

University

10 Qs

Similar activities

Python Loop (University, 15 Qs)

Quiz Regresi Linear dan Algoritma [Linear Regression and Algorithms Quiz] (4th Grade - University, 15 Qs)

ITEC 70 - Quiz # 2 (University, 15 Qs)

Chapter Quiz (University, 10 Qs)

The Linux Vault (University, 15 Qs)

Internet și web [Internet and the Web] (University, 10 Qs)

CLC Lesson 6 Quiz (University, 12 Qs)

Cyber-Quiz [Technical Talk] (University, 10 Qs)

Week2_S2

Assessment

Quiz

Information Technology (IT)

University

Practice Problem

Easy

Created by Samiratu Ntohsi


10 questions


1.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Why is generalization more important than training accuracy in neural networks?

Generalization proves convergence.

Training accuracy ensures bias reduction.

It prevents vanishing gradients.

It reflects the ability to predict unseen data, which is the true goal.
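
A minimal sketch of why the last option is the key point, assuming only numpy (the polynomial degree, sample size, and 80/20 split are illustrative choices): a high-capacity model can fit its training points almost perfectly yet do noticeably worse on held-out data, and only the held-out error speaks to generalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function.
x = rng.uniform(-1, 1, 40)
y = np.sin(3 * x) + rng.normal(0, 0.2, x.size)

# Hold out 20% of the points to estimate performance on unseen data.
x_train, y_train = x[:32], y[:32]
x_test, y_test = x[32:], y[32:]

# A high-degree polynomial: very low training error, typically higher test error.
coeffs = np.polyfit(x_train, y_train, deg=15)
mse = lambda a, b: np.mean((a - b) ** 2)

print("train MSE:", mse(np.polyval(coeffs, x_train), y_train))
print("test MSE: ", mse(np.polyval(coeffs, x_test), y_test))
```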

2.

FILL IN THE BLANK QUESTION

1 min • 1 pt

A multilayer network without nonlinearities collapses into a ______ model.
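
The blank is "linear": without a nonlinearity between them, stacked layers multiply out into a single affine map. A minimal numpy check, with arbitrary layer sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)  # layer 1
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)  # layer 2
x = rng.normal(size=3)

# Two stacked layers with no activation in between...
two_layer = W2 @ (W1 @ x + b1) + b2

# ...collapse into one linear (affine) layer with combined parameters.
W, b = W2 @ W1, W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layer, one_layer))  # True
```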

3.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

When using ReLU in hidden layers instead of sigmoid, which benefit typically emerges?

Guarantees linear separability in the input space.

Prevents exploding gradients.

Ensures all neurons remain active.

Reduces vanishing gradient problems.
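
A rough numeric illustration of the last option, assuming numpy (the fixed pre-activation of 0.5 and the depth of 20 layers are arbitrary): the sigmoid derivative never exceeds 0.25, so a product of many such factors shrinks toward zero, while ReLU contributes a factor of exactly 1 wherever its input is positive.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def d_sigmoid(z):
    s = sigmoid(z)
    return s * (1 - s)                # at most 0.25

def d_relu(z):
    return np.where(z > 0, 1.0, 0.0)  # 1 for positive inputs

z = 0.5      # the same positive pre-activation at every layer, for illustration
depth = 20

# Gradient factor contributed by the activation functions alone across the depth.
print("sigmoid:", d_sigmoid(z) ** depth)  # on the order of 1e-13, effectively vanished
print("relu:   ", d_relu(z) ** depth)     # 1.0
```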

4.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Which of the following statements about gradient descent is correct?

It always finds the global minimum.

It updates weights by moving against the gradient of the loss function.

It requires linear separability.

It is identical to the perceptron update rule.
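
A tiny sketch of the correct option (moving against the gradient of the loss) on a one-dimensional quadratic loss; the learning rate and starting point are arbitrary, and note that nothing here requires linear separability or guarantees a global minimum in general.

```python
# Minimize L(w) = (w - 3)^2 with plain gradient descent.
# dL/dw = 2 * (w - 3); the update steps *against* this gradient.
w = 10.0
lr = 0.1

for step in range(50):
    grad = 2 * (w - 3)
    w -= lr * grad          # w <- w - lr * dL/dw

print(w)  # approaches 3, the minimizer
```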

5.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

Why is nonlinearity essential in deep neural networks?

It guarantees zero error.

It reduces training time.

It allows the composition of layers to model complex, non-linear boundaries.

It simplifies the optimization problem.
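
To make the correct option concrete: XOR is not linearly separable, yet a two-unit ReLU hidden layer represents it exactly. The weights below are hand-picked for illustration (a minimal numpy sketch, not a trained network).

```python
import numpy as np

relu = lambda z: np.maximum(z, 0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

# Hidden units: h1 = x1 + x2 and h2 = relu(x1 + x2 - 1), active only for input (1, 1).
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([0.0, -1.0])

# Output h1 - 2*h2 reproduces XOR, a boundary no single linear layer can draw.
w2 = np.array([1.0, -2.0])

h = relu(X @ W1 + b1)
print(h @ w2)  # [0. 1. 1. 0.]
```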

6.

FILL IN THE BLANK QUESTION

1 min • 1 pt

Backpropagation uses the ______ rule to propagate gradients backward through layers.
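
The blank is the chain rule. A small numpy check for the scalar two-layer computation y = w2 * tanh(w1 * x), where the analytic chain-rule gradient matches a finite-difference estimate (all values here are arbitrary):

```python
import numpy as np

w1, w2, x = 0.7, -1.3, 2.0

def forward(w1, w2, x):
    return w2 * np.tanh(w1 * x)

# Chain rule: dy/dw1 = dy/dh * dh/da * da/dw1, with a = w1 * x and h = tanh(a).
a = w1 * x
h = np.tanh(a)
dy_dw1 = w2 * (1 - h ** 2) * x

# Finite-difference check of the same derivative.
eps = 1e-6
numeric = (forward(w1 + eps, w2, x) - forward(w1 - eps, w2, x)) / (2 * eps)

print(dy_dw1, numeric)  # the two values agree closely
```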

7.

MULTIPLE CHOICE QUESTION

45 sec • 1 pt

In a neural network, forward propagation refers to:

Feeding inputs through the network to generate predictions

Updating weights using gradient descent

Reversing gradients to find errors

Adjusting biases to prevent saturation
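
A minimal numpy sketch of the first option: forward propagation just feeds an input through the layers in turn to produce a prediction; no gradients are computed and no weights change. The layer sizes and random weights below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

# A tiny 3 -> 4 -> 2 network with arbitrary weights.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

def forward(x):
    h = np.maximum(W1 @ x + b1, 0)   # hidden layer with ReLU
    return W2 @ h + b2               # output layer: the prediction

x = rng.normal(size=3)
print(forward(x))  # predictions only; nothing is updated
```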
