
Deep Learning - Crash Course 2023 - Various Activation Functions
10 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the primary purpose of using tf.linspace in the setup?
To import necessary libraries
To generate random data points
To perform linear regression
To visualize data
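For context on the question above: `tf.linspace(start, stop, num)` generates `num` evenly spaced values over an interval, typically used to create the x-axis input points when plotting activation curves. A plain-Python sketch of the same behavior (the function name `linspace` here is our own stand-in, not TensorFlow itself):

```python
# Plain-Python equivalent of tf.linspace(start, stop, num):
# returns `num` evenly spaced values from start to stop, inclusive.
def linspace(start, stop, num):
    step = (stop - start) / (num - 1)
    return [start + i * step for i in range(num)]

# Input points for plotting an activation function over [-5, 5].
xs = linspace(-5.0, 5.0, 11)
```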
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which activation function scales the output to the range of 0 to 1?
Softmax
ReLU
Sigmoid
Tanh
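The sigmoid function referenced above is defined as σ(x) = 1 / (1 + e⁻ˣ), which squashes any real input into the range (0, 1). A minimal stdlib-only sketch:

```python
import math

def sigmoid(x):
    # Maps any real input into the open interval (0, 1);
    # sigmoid(0) is exactly 0.5.
    return 1.0 / (1.0 + math.exp(-x))
```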
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
How does the hyperbolic tangent function differ from the sigmoid function?
It does not require bias terms
It scales output to the range of -1 to 1
It is used only in convolutional networks
It is faster than sigmoid
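The distinction in the question above can be made concrete: tanh is a rescaled sigmoid with the same S-shape, but its output is zero-centered and spans (-1, 1) rather than (0, 1). A sketch showing the relationship tanh(x) = 2·σ(2x) − 1:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # tanh is a shifted, scaled sigmoid: zero-centered output
    # in (-1, 1) instead of (0, 1).
    return 2.0 * sigmoid(2.0 * x) - 1.0
```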
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the main advantage of using the ReLU activation function?
It scales output to the range of -1 to 1
It is computationally complex
It is only used in recurrent networks
It provides better accuracy and speed
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What happens to the output of ReLU when the input is less than zero?
It returns a constant value
It returns zero
It returns the input value
It returns a negative value
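Questions 4 and 5 both concern ReLU, which is simply max(0, x): positive inputs pass through unchanged and negative inputs return zero. Its speed advantage comes from needing only a comparison, rather than the exponentials that sigmoid and tanh require. A minimal sketch:

```python
def relu(x):
    # max(0, x): a single comparison, which is why ReLU is cheap
    # compared with the exponentials in sigmoid/tanh.
    # Any input below zero maps to exactly zero.
    return x if x > 0.0 else 0.0
```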
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which activation function is known for preventing neuron dying?
ReLU
ELU
Sigmoid
Softmax
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the formula for the Swish activation function?
X multiplied by sigmoid of X
X plus sigmoid of X
X divided by sigmoid of X
X minus sigmoid of X
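The Swish formula from the question above, x · σ(x), is straightforward to write out: it behaves like the identity for large positive x and decays smoothly toward zero for negative x. A stdlib-only sketch:

```python
import math

def swish(x):
    # swish(x) = x * sigmoid(x); near-identity for large positive x,
    # smoothly approaching zero for large negative x.
    return x / (1.0 + math.exp(-x))
```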