AI900-03

Professional Development

40 Qs

Similar activities

Netrust Freshers Battery Exam (Professional Development, 40 Qs)
mlarning (Professional Development, 37 Qs)
Lesson 8.1 Incident Management (Professional Development, 40 Qs)
Sec+ Study Quiz 13 (Professional Development, 42 Qs)
Lvl 1 Introduction DBMS (QB) (Professional Development, 39 Qs)
AWS Certified Cloud Practitioner (5th Grade - Professional Development, 44 Qs)
AI900-01 (Professional Development, 40 Qs)
Teacher assessment ML 3 (8th Grade - Professional Development, 40 Qs)

AI900-03

Assessment • Quiz • Computers • Professional Development • Medium

Created by Kevin Gilbert

Used 2+ times

40 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Linear Regression matches the data on a graph to the
trend line
x-axis
y-axis
plot points
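As a quick illustration of this question, a trend line can be fit to plotted points by ordinary least squares. This is a minimal sketch in plain Python (the data values are invented for illustration), not part of the quiz itself:

```python
# Minimal sketch: ordinary least-squares fit of a trend line y = m*x + b.

def fit_trend_line(xs, ys):
    """Return slope m and intercept b minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    m = cov / var
    b = mean_y - m * mean_x
    return m, b

# Points lying exactly on y = 2x + 1 recover that trend line.
m, b = fit_trend_line([1, 2, 3, 4], [3, 5, 7, 9])
```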

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You want your data to be clustered by common characteristics. What would you use?
unsupervised
supervised
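To make the "unsupervised" answer concrete, here is a toy 1-D k-means sketch: no labels are supplied, and the points group themselves by similarity alone. The data and the k=2 initialization are illustrative assumptions:

```python
# Minimal sketch of unsupervised clustering: a tiny 1-D k-means (k=2 only).

def kmeans_1d(points, k=2, iters=10):
    centers = [min(points), max(points)]  # crude initialization, k=2 only
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Two obvious groups emerge with no labels given.
centers, clusters = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.7])
```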

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You want your data to be categorized according to labels. What would you use?
supervised
unsupervised
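By contrast, supervised learning trains on data that already carries labels. A minimal sketch (the nearest-centroid rule and the toy data are illustrative choices, not the quiz's reference method):

```python
# Minimal sketch of supervised learning: labels are provided at training
# time, and new points are categorized according to those labels.

def train_centroids(points, labels):
    sums, counts = {}, {}
    for p, lab in zip(points, labels):
        sums[lab] = sums.get(lab, 0.0) + p
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: sums[lab] / counts[lab] for lab in sums}

def predict(centroids, p):
    # Assign the label whose centroid is closest.
    return min(centroids, key=lambda lab: abs(p - centroids[lab]))

model = train_centroids([1, 2, 9, 10], ["small", "small", "large", "large"])
label = predict(model, 8)
```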

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The Naïve Bayes algorithm assumes all the predictors are __
dependent on each other
independent of each other
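The independence assumption is what lets Naïve Bayes multiply per-predictor likelihoods together. A minimal sketch with made-up probabilities (the spam/ham numbers are invented for illustration):

```python
# Minimal sketch of the Naive Bayes independence assumption: the joint
# likelihood is taken as the product of per-predictor likelihoods, i.e.
# the predictors are treated as independent of each other.

def naive_bayes_score(prior, likelihoods):
    """P(class) * product of P(feature_i | class), assuming independence."""
    score = prior
    for p in likelihoods:
        score *= p
    return score

spam = naive_bayes_score(0.4, [0.8, 0.7])  # P(spam)*P(f1|spam)*P(f2|spam)
ham = naive_bayes_score(0.6, [0.1, 0.2])
prediction = "spam" if spam > ham else "ham"
```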

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You have five predictors, but one of the predictors is the most important in determining the result. What should you use?
Weighted Multiplier
Trend Line
Plot Points
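The idea behind the intended answer is feature weighting: the most important predictor gets a larger weight in the combined score. A minimal sketch (the weights and values are invented for illustration):

```python
# Minimal sketch of weighting predictors: the first of five predictors
# carries the most weight in determining the result.

def weighted_score(predictors, weights):
    return sum(p * w for p, w in zip(predictors, weights))

# Integer toy data; the first predictor is weighted 5x the others.
score = weighted_score([2, 1, 1, 1, 1], [5, 1, 1, 1, 1])
```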

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Why is k-nearest neighbors also called lazy learning?
It uses a lot of computation for every instance
It accumulates knowledge very slowly
It borrows ideas from several other models
It is very computationally efficient
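"Lazy" here means training does nothing but memorize the data; all the distance computation is deferred to prediction time. A minimal sketch with invented toy data:

```python
# Minimal sketch of why k-nearest neighbors is "lazy": training only
# stores the data, and every query recomputes distances to all of it.
from collections import Counter

def knn_train(points, labels):
    return list(zip(points, labels))  # no model is built -- just memorize

def knn_predict(memory, query, k=3):
    # Every stored point is examined for every query (the expensive part).
    nearest = sorted(memory, key=lambda pl: abs(pl[0] - query))[:k]
    return Counter(lab for _, lab in nearest).most_common(1)[0][0]

memory = knn_train([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"])
label = knn_predict(memory, 10.5)
```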

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What should you do if your decision tree has too much entropy?
Add or substitute predictors
Add more outcomes
Choose a different root
Delete the leaves
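Entropy measures how mixed the labels at a node are: a pure node scores 0 bits, a 50/50 split scores 1 bit. Choosing a different root or swapping predictors aims to lower the entropy remaining after the split. A minimal sketch (the toy labels are illustrative):

```python
# Minimal sketch of entropy for a decision-tree node.
import math

def entropy(labels):
    n = len(labels)
    ent = 0.0
    for label in set(labels):
        p = labels.count(label) / n
        ent -= p * math.log2(p)
    return ent

pure = entropy(["yes", "yes", "yes", "yes"])   # one label only
mixed = entropy(["yes", "yes", "no", "no"])    # 50/50 mix
```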
