C2M1

University

10 Qs

Similar activities

Java_MCQ_3 • University • 15 Qs
PYTHON APTITUDE • University • 12 Qs
Computer Networking • University • 10 Qs
QUIZ - CHAPTER 2 • University • 15 Qs
AverageRound • University • 10 Qs
EasyRound • University • 15 Qs
Chapter 3 : SQL Command • University • 15 Qs
LibreOffice Writer Styles Quiz • 10th Grade - University • 15 Qs

C2M1

Assessment • Quiz

Information Technology (IT)

University

Practice Problem • Medium

Created by Abylai Aitzhanuly

Used 1+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

[Question and answer choices are images.]

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The dev and test sets should:

  • Come from the same distribution

  • Come from different distributions

  • Be identical to each other (same (x, y) pairs)

  • Have the same number of examples

3.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

If your neural network model seems to have high variance, which of the following would be promising things to try? (Check all that apply.)

  • [Option shown as an image]

  • [Option shown as an image]

  • Get more training data

  • Get more test data

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

You are working on an automated check-out kiosk for a supermarket, and are building a classifier for apples, bananas and oranges. Suppose your classifier obtains a training set error of 0.5%, and a dev set error of 7%. Which of the following are promising things to try to improve your classifier? (Check all that apply.)

  • Increase the regularization parameter lambda

  • Decrease the regularization parameter lambda

  • Get more training data

  • Use a bigger neural network
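
As a quick worked check of the numbers in this question (a sketch, assuming human-level/Bayes error near 0%, as in the course), the error gaps decompose as:

    # Numbers from the question; Bayes/human-level error assumed ~0%.
    train_error = 0.005   # 0.5%
    dev_error   = 0.07    # 7%

    avoidable_bias = train_error - 0.0         # ~0.5%  -> small gap to human-level
    variance_gap   = dev_error - train_error   # ~6.5%  -> large gap between train and dev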

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is weight decay?

  • [Option shown as an image]

  • [Option shown as an image]

  • A regularization technique (such as L2 regularization) that results in gradient descent shrinking the weights on every iteration.
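
A minimal Python sketch of why L2 regularization behaves as weight decay in a plain gradient-descent step (variable and parameter names here are assumed, not taken from the quiz):

    import numpy as np

    def l2_regularized_step(W, dW, lr=0.1, lam=0.7, m=1000):
        # The L2 penalty (lam / (2m)) * ||W||^2 adds (lam / m) * W to the gradient,
        # so the update is W := (1 - lr * lam / m) * W - lr * dW,
        # i.e. the weights are multiplied by a factor < 1 on every iteration.
        return W - lr * (dW + (lam / m) * W)

    W = np.array([1.0, -2.0, 3.0])
    dW = np.zeros_like(W)              # even with a zero data gradient...
    print(l2_regularized_step(W, dW))  # ...the weights still shrink toward 0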

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What happens when you increase the regularization hyperparameter lambda?

  • [Option shown as an image]

  • Weights are pushed toward becoming smaller (closer to 0)

  • Weights are pushed toward becoming bigger (further from 0)

  • [Option shown as an image]
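
Continuing the sketch above, the per-iteration shrink factor (1 - lr * lam / m) decreases as lambda grows, so a larger lambda pushes the weights harder toward 0 (the values below are illustrative only):

    lr, m = 0.1, 1000
    for lam in (0.0, 0.7, 7.0, 70.0):
        shrink = 1 - lr * lam / m
        print(f"lambda={lam:>5}: weights multiplied by {shrink:.4f} each step")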

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

With the inverted dropout technique, at test time:

  • You do not apply dropout (do not randomly eliminate units) and do not keep the 1/keep_prob factor in the calculations used in training

  • [Option shown as an image]

  • [Option shown as an image]

  • [Option shown as an image]
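
A minimal NumPy sketch of inverted dropout (layer shapes and names assumed): the 1/keep_prob scaling is applied during training, so the test-time forward pass uses no mask and no extra scaling:

    import numpy as np

    def forward_layer(A_prev, W, b, keep_prob=0.8, training=True):
        Z = W @ A_prev + b
        A = np.maximum(0, Z)                          # ReLU activation
        if training:
            mask = np.random.rand(*A.shape) < keep_prob
            A = A * mask / keep_prob                  # drop units, then rescale (inverted dropout)
        # At test time: no mask and no 1/keep_prob factor -- just the plain forward pass.
        return A

    np.random.seed(0)
    A_prev = np.random.randn(4, 3)                    # 4 features, batch of 3
    W, b = np.random.randn(5, 4), np.zeros((5, 1))
    print(forward_layer(A_prev, W, b, training=True))
    print(forward_layer(A_prev, W, b, training=False))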
