Neural Networks Quiz

University

25 Qs

Similar activities

Digital Service Quality Assessment Methods (Metode Penilaian Mutu Layanan Digital)

University

20 Qs

Computer Networks and the Internet (Jaringan Komputer dan Internet)

11th Grade - University

20 Qs

Tools and Shortcut Quiz - BSIS II A

University

20 Qs

Ohm's Law and Networks (Lei de Ohm e redes)

University

20 Qs

ArangoDB Quiz

University

20 Qs

AISB223 Chapter 12: Confidentiality and Privacy Controls

University

20 Qs

Interactive Audio - Units 1 and 2 (Audio Interactivo-Unidad 1 y 2)

University

20 Qs

Unit-1 Introduction to Cloud Computing

University

20 Qs

Neural Networks Quiz

Assessment

Quiz

Information Technology (IT)

University

Practice Problem

Medium

Created by Usman Ali

25 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

A simple Multi-Layer Perceptron (MLP) without any non-linear activation functions is mathematically equivalent to:

A universal function approximator.

A single, wider linear layer.

A model incapable of learning.

A support vector machine with a linear kernel.
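
A quick aside, not part of the original quiz: a minimal NumPy sketch showing that two stacked linear layers with no activation in between collapse into a single linear layer. All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                    # batch of 4 inputs, 3 features
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=5)
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=2)

# Two linear layers, no non-linearity in between.
y_stacked = (x @ W1 + b1) @ W2 + b2

# The algebraically equivalent single linear layer:
# (x W1 + b1) W2 + b2 = x (W1 W2) + (b1 W2 + b2).
W, b = W1 @ W2, b1 @ W2 + b2
y_single = x @ W + b

print(np.allclose(y_stacked, y_single))        # True
```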

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The primary motivation for using the ReLU activation function over Sigmoid in deep hidden layers is to:

Ensure the output is always positive.

Confine the activation values between 0 and 1.

Mitigate the vanishing gradient problem.

Make the network more computationally expensive but more accurate.
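
An illustrative comparison, not from the quiz itself: the sigmoid's derivative is at most 0.25 and decays toward zero for large |z|, so multiplying such factors across many layers shrinks gradients, while ReLU's derivative is exactly 1 for any positive pre-activation.

```python
import numpy as np

z = np.linspace(-10.0, 10.0, 5)      # sample pre-activation values

sig = 1.0 / (1.0 + np.exp(-z))
d_sigmoid = sig * (1.0 - sig)        # peaks at 0.25, near 0 in the tails
d_relu = (z > 0).astype(float)       # exactly 1 wherever the unit is active

print("z:       ", z)
print("sigmoid':", d_sigmoid.round(4))
print("relu':   ", d_relu)
```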

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The "dying ReLU" problem is characterized by:

A neuron's weights being updated such that its pre-activation input is consistently negative, causing its output and gradient to be zero.

The network becoming too deep, causing all ReLU activations to eventually become zero.

The learning rate being too low, preventing weights from being updated.

A neuron's weights exploding to infinity.
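
A small sketch of the first option, using a hypothetical single neuron: once its pre-activation is negative for every input it sees, both its output and its gradient are zero, so gradient descent can never revive it.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(100, 4))   # non-negative inputs
w = np.array([-2.0, -1.5, -3.0, -0.5])     # weights pushed negative
b = -0.1

z = X @ w + b                  # pre-activation: negative for every sample
a = np.maximum(z, 0.0)         # ReLU output: all zeros
grad = (z > 0).astype(float)   # dReLU/dz: also all zeros

print(a.max(), grad.sum())     # 0.0 0.0 -> the neuron is "dead"
```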

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Why is breaking symmetry by initializing weights randomly (e.g., using Xavier or He initialization) crucial for training?

It guarantees faster convergence to the global minimum.

It prevents all neurons in a layer from learning the same features, as they would with zero initialization.

It acts as a form of L2 regularization.

It ensures the initial loss of the network is exactly 1.0.
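
A minimal sketch of the symmetry argument, together with He initialization in its standard form (weights drawn from N(0, 2 / fan_in) for ReLU layers). The numbers are illustrative only.

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])

# Zero initialization: every hidden neuron computes the same value, so all
# receive identical gradient updates and remain interchangeable forever.
W_zero = np.zeros((3, 4))
print(x @ W_zero)              # [0. 0. 0. 0.]

# He initialization breaks the symmetry and keeps activation variance stable.
rng = np.random.default_rng(42)
fan_in = 3
W_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(3, 4))
print(x @ W_he)                # four distinct pre-activations
```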

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary purpose of the bias term in a neuron?

To scale the output of the activation function.

To act as a learnable offset, allowing the activation function to be shifted left or right.

To prevent the weights from becoming zero during training.

To control the learning rate for that specific neuron.
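
A tiny illustration of the second option: with the weight held fixed, the bias moves the point where a ReLU neuron switches on, without rescaling anything.

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)
x = np.linspace(-3.0, 3.0, 7)

print(relu(1.0 * x))           # kink pinned at x = 0
print(relu(1.0 * x + 2.0))     # bias b = 2 shifts the threshold to x = -2
```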

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The Universal Approximation Theorem suggests that:

Any neural network can solve any problem.

A single-layer perceptron can approximate any linear function.

A feed-forward network with one hidden layer and a non-linear activation can approximate any continuous function to arbitrary precision.

Deeper networks are always better than wider networks.
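
A concrete, if toy, instance of the theorem's claim: a one-hidden-layer network with three ReLU units exactly represents a triangular "hat" bump, and sums of shifted hats can approximate any continuous function on an interval. This is an illustration, not a proof.

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)

def hat(x):
    # Hidden layer: ReLU(x), ReLU(x - 1), ReLU(x - 2); output weights (1, -2, 1).
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

x = np.linspace(-1.0, 3.0, 9)
print(hat(x))   # rises 0 -> 1 on [0, 1], falls back to 0 on [1, 2], 0 elsewhere
```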

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

A model is exhibiting high variance and low bias. This is a classic case of:

Underfitting, where the model is too simple for the data.

Overfitting, where the model has learned the training data too well, including its noise.

A well-generalized model.

A model trained with an incorrect loss function.
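
A standard way to see high variance in miniature, sketched here with polynomial fits rather than a neural network: a degree-9 polynomial interpolates 10 noisy training points almost exactly (near-zero training error) but typically generalizes far worse than a lower-degree fit.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, 10)   # noisy samples
x_test = np.linspace(0.0, 1.0, 100)
y_test = np.sin(2 * np.pi * x_test)                                # clean target

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.abs(np.polyval(coeffs, x_train) - y_train).mean()
    test_err = np.abs(np.polyval(coeffs, x_test) - y_test).mean()
    print(f"degree {degree}: train {train_err:.3f}, test {test_err:.3f}")
```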
