Optimizers

Professional Development

6 Qs

Similar activities

Applied Operations Research Surprise Quiz 18th March 2024 (Professional Development, 10 Qs)

การตัดรูปภาพ (Cropping Images) (Professional Development, 10 Qs)

Digital Multimeters (Fluke 87V) (Professional Development, 10 Qs)

Cùng tìm hiểu các công cụ trong Photoshop (Exploring the Tools in Photoshop) (Professional Development, 10 Qs)

Amdocs OSS - ODO Implementation for Vodafone UK (Professional Development, 10 Qs)

Quiz 14 (Professional Development, 10 Qs)

Technical Analysis Bonds (Professional Development, 8 Qs)

SWAG2.0 MR PT 1 (Professional Development, 5 Qs)

Optimizers

Assessment: Quiz
Subjects: Mathematics, Computers, Professional Development
Type: Practice Problem
Difficulty: Hard
Created by Mariam Metawe3
Used 3+ times

6 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

__________ are responsible for updating the weights during backpropagation to minimize the loss function.

Optimizers

PCA

input layer

hidden layers
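The update in question 1 can be sketched as a plain gradient-descent step on a single weight; the names (w, grad, lr) and values are illustrative, not from the quiz:

```python
# Minimal sketch of the weight update an optimizer performs during
# backpropagation: step against the gradient to reduce the loss.

def gd_step(w, grad, lr=0.1):
    """One gradient-descent update of a single weight."""
    return w - lr * grad

# Minimize loss(w) = w**2, whose gradient is 2*w:
w = 1.0
for _ in range(50):
    w = gd_step(w, grad=2 * w)
# w has moved close to the minimum at w = 0
```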

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

___________ is the newest and most efficient optimizer to date.

Gradient descent

Stochastic Gradient descent

Adam

Adagrad

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Gradient descent needs more computational power than other optimizers.

True

False

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

_________________ uses an exponential weighted average to smooth the noisy results of stochastic gradient descent.

Stochastic gradient descent with momentum

Gradient descent

Adagrad (adaptive gradient descent)

AdaDelta and RMSProp
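The exponential weighted average in question 4 can be sketched as follows: momentum keeps a running average of past gradients and steps with that smoothed value instead of the raw, noisy SGD gradient. The names and hyperparameter values here are illustrative:

```python
def momentum_step(w, v, grad, lr=0.1, beta=0.9):
    """One SGD-with-momentum update of a single weight."""
    v = beta * v + (1 - beta) * grad   # exponential weighted average of gradients
    w = w - lr * v                     # step with the smoothed gradient
    return w, v

# Minimize loss(w) = w**2 (gradient 2*w):
w, v = 1.0, 0.0
for _ in range(200):
    w, v = momentum_step(w, v, grad=2 * w)
# w approaches the minimum at w = 0
```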

5.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

___________ use a non-fixed (adaptive) learning rate.

(Choose all the possible answers.)

Adagrad

AdaDelta and RMSProp

Adam

Stochastic gradient descent

Stochastic gradient descent with momentum
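Question 5's "non-fixed learning rate" can be sketched with Adagrad: dividing by the root of the accumulated squared gradients gives each parameter its own effective learning rate that shrinks over time. Names and values are illustrative:

```python
import math

def adagrad_step(w, g2_sum, grad, lr=0.5, eps=1e-8):
    """One Adagrad update; the effective rate lr/sqrt(g2_sum) is not fixed."""
    g2_sum = g2_sum + grad ** 2                   # accumulate squared gradients
    w = w - lr * grad / (math.sqrt(g2_sum) + eps) # shrinking effective rate
    return w, g2_sum

# Minimize loss(w) = w**2 (gradient 2*w):
w, g2 = 1.0, 0.0
for _ in range(50):
    w, g2 = adagrad_step(w, g2, grad=2 * w)
# w approaches the minimum at w = 0
```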

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

__________ is a combination of SGD with momentum (for smoothing) and RMSProp (for an efficient, adaptive learning rate).

Adam

Gradient descent

AdaDelta

SGD
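The combination in question 6 can be sketched as: Adam keeps a momentum-style first moment m (smoothing) and an RMSProp-style second moment v (adaptive learning rate), plus bias correction. The hyperparameter values shown are the commonly used defaults; the single-weight setup is illustrative:

```python
import math

def adam_step(w, m, v, grad, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update of a single weight (t counts steps from 1)."""
    m = b1 * m + (1 - b1) * grad        # momentum: smoothed gradient
    v = b2 * v + (1 - b2) * grad ** 2   # RMSProp: smoothed squared gradient
    m_hat = m / (1 - b1 ** t)           # bias-corrected estimates
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# Minimize loss(w) = w**2 (gradient 2*w):
w, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, m, v, grad=2 * w, t=t)
# w settles near the minimum at w = 0
```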
