DL-Activation Functions

University • 10 Qs

Similar activities

Rounding Numbers in Python (University, 9 Qs)
Microsoft Excel Menus, Class A (7th Grade - University, 10 Qs)
Pemkom 8 (University, 10 Qs)
Utility Program (University, 10 Qs)
Digital Technology Office 365 - 29May2020 (University - Professional Development, 13 Qs)
Quiz3_DivideConquer_GreedyApproach (University, 10 Qs)
Computer and Other Human Inventions (University, 15 Qs)
Python with DataScience (7th Grade - University, 10 Qs)

DL-Activation Functions

Quiz (Assessment) • Computers • University • Hard • Practice Problem

Created by lawrance r • Used 1+ times • Free resource


10 questions


1. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is the primary purpose of an activation function in a neural network?

To initialize weights
To introduce non-linearity
To adjust learning rate
To increase the number of neurons
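The correct answer is "to introduce non-linearity". A minimal NumPy sketch of why (the shapes, seed, and weights are illustrative only): without an activation function, any stack of linear layers collapses into one linear layer, so depth adds nothing.

```python
import numpy as np

# Toy batch and weights; the specific shapes and seed are illustrative only.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                    # batch of 4 three-feature inputs
W1, W2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))

# Two stacked linear layers collapse into a single linear layer.
linear_stack = (x @ W1) @ W2
single_layer = x @ (W1 @ W2)
print(np.allclose(linear_stack, single_layer))  # True: depth added no power

# A non-linearity (here ReLU) between the layers breaks the collapse,
# which is what lets deep networks model non-linear functions.
nonlinear_stack = np.maximum(0.0, x @ W1) @ W2
print(np.allclose(nonlinear_stack, single_layer))
```

The second comparison is (almost surely) False: ReLU zeroes the negative pre-activations, so no single weight matrix reproduces the result.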

2. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

Which activation function is commonly used in output layers for binary classification?

ReLU
Tanh
Sigmoid
Softmax
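The correct answer is sigmoid: it squashes the final raw score (logit) into (0, 1), so the output reads directly as a class probability. A minimal sketch (the logit value is hypothetical):

```python
import math

def sigmoid(z):
    """Squash a real-valued logit into (0, 1) so it reads as a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# A logit of 0 maps to exactly 0.5 -- the natural decision boundary.
logit = 2.0                      # hypothetical raw score from the last layer
p = sigmoid(logit)
label = 1 if p >= 0.5 else 0
print(f"P(class=1) = {p:.3f} -> predicted label {label}")
```

Softmax plays the analogous role for multi-class outputs; for two mutually exclusive classes it reduces to the sigmoid.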

3. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is the range of the sigmoid activation function?

(-∞, ∞)
(-1, 1)
(0, 1)
(0, ∞)
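The correct answer is (0, 1): sigmoid approaches 0 and 1 asymptotically but never reaches either bound. A quick numeric check:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Even extreme inputs stay strictly inside (0, 1); the bounds are
# approached asymptotically but never attained.
for z in (-30, -5, 0, 5, 30):
    s = sigmoid(z)
    assert 0.0 < s < 1.0
    print(f"sigmoid({z:>3}) = {s:.6f}")
```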

4. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

Which activation function is prone to the vanishing gradient problem?

ReLU
Leaky ReLU
Softmax
Sigmoid
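The correct answer is sigmoid. Its derivative is σ'(z) = σ(z)(1 − σ(z)), which peaks at 0.25 (at z = 0), and backpropagation multiplies one such factor per layer. A sketch of how fast that product shrinks through a 10-layer stack, even in the best case:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    # d/dz sigmoid(z) = sigmoid(z) * (1 - sigmoid(z)); peaks at 0.25 at z = 0.
    s = sigmoid(z)
    return s * (1.0 - s)

# Backprop multiplies one such factor per layer, so through a deep stack
# the gradient shrinks geometrically -- the vanishing gradient problem.
grad = 1.0
for layer in range(10):
    grad *= sigmoid_grad(0.0)    # 0.25, the best possible case
print(grad)                      # 0.25**10, roughly 9.5e-07
```

ReLU avoids this because its derivative is exactly 1 for positive inputs.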

5. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is the output range of the Tanh activation function?

(0, 1)
(-∞, ∞)
(-1, 1)
(0, ∞)
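The correct answer is (−1, 1). Tanh is a rescaled sigmoid, tanh(z) = 2σ(2z) − 1, which shifts the range from (0, 1) to (−1, 1) and centres the output at zero. A quick check of both the range and the identity:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# tanh stays within [-1, 1] (mathematically strictly inside; in floating
# point it saturates to exactly -1.0 or 1.0 for large |z|).
for z in (-30, -1, 0, 1, 30):
    t = math.tanh(z)
    assert -1.0 <= t <= 1.0
    print(f"tanh({z:>3}) = {t:.6f}")

# tanh(z) = 2 * sigmoid(2z) - 1
print(abs(math.tanh(1.0) - (2.0 * sigmoid(2.0) - 1.0)) < 1e-12)  # True
```

Being zero-centred (tanh(0) = 0, versus sigmoid's 0.5) is why tanh was often preferred over sigmoid in hidden layers.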

6. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

Which activation function is commonly used in hidden layers of deep networks?

ReLU
Sigmoid
Tanh
Step function
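The correct answer is ReLU: it is cheap to compute and does not saturate for positive inputs, so gradients survive through deep stacks. A sketch of a two-hidden-layer forward pass (the shapes, seed, and weights are illustrative only):

```python
import numpy as np

def relu(x):
    """Element-wise max(0, x): cheap, and non-saturating for positive inputs."""
    return np.maximum(0.0, x)

# Hypothetical forward pass through two ReLU hidden layers.
rng = np.random.default_rng(1)
x = rng.normal(size=(1, 4))                    # one 4-feature input
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 8))
h1 = relu(x @ W1)
h2 = relu(h1 @ W2)
print(h1.min() >= 0 and h2.min() >= 0)   # True: hidden activations are non-negative
```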

7. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What does ReLU stand for?

Residual Learning Unit
Regularized Linear Update
Recursive Linear Unit
Rectified Linear Unit
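The correct answer is Rectified Linear Unit. "Rectified" is borrowed from electronics: like a rectifier, it passes positive values through unchanged and clips negative ones to zero, i.e. ReLU(z) = max(0, z):

```python
def relu(z):
    """Rectified Linear Unit: identity for positive inputs, zero otherwise."""
    return max(0.0, z)

print(relu(3.5))    # 3.5
print(relu(-2.0))   # 0.0
```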

Questions 8-10 are available only after creating a free account.
