Quiz-1(G3)

University

10 Qs

Similar activities

Data Science Quiz (University, 12 Qs)
III Year - UNIT I - Data Mining (University, 10 Qs)
OS QUIZ (UNIT-III) (University, 10 Qs)
Data Mining (University, 15 Qs)
Computer Security: Encryption (10th Grade - University, 10 Qs)
Engineering ACW Semester 2 - #5 AI Part 2 (University, 15 Qs)
Unit 2 - AI by Prof. R. Rajkumar (University, 15 Qs)
DataQuest_Quiz (University, 15 Qs)

Assessment · Quiz · Computers · University
Practice Problem · Hard
Created by Dr Kumar · Used 1+ times


10 questions

1.

FILL IN THE BLANK QUESTION

1 min • 1 pt

Let L3 = {{A, B, C}, {A, B, D}, {A, C, D}, {B, C, D}}. How many candidate 4-itemsets (C4) will be generated using the F3 × F3 method before pruning?
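To see how the F3 × F3 join operates, here is a minimal sketch (assuming the standard Apriori convention of merging two frequent (k−1)-itemsets that share their first k−2 items in lexicographic order):

```python
def candidate_gen(freq_prev):
    """F(k-1) x F(k-1) join: merge two frequent (k-1)-itemsets that
    agree on their first k-2 items (in lexicographic order)."""
    freq = sorted(tuple(sorted(s)) for s in freq_prev)
    candidates = set()
    for i in range(len(freq)):
        for j in range(i + 1, len(freq)):
            a, b = freq[i], freq[j]
            if a[:-1] == b[:-1]:  # shared (k-2)-prefix
                candidates.add(tuple(sorted(set(a) | set(b))))
    return candidates

L3 = [{"A", "B", "C"}, {"A", "B", "D"}, {"A", "C", "D"}, {"B", "C", "D"}]
print(candidate_gen(L3))
```

Running this on the L3 above shows which pairs actually share a 2-item prefix and therefore produce a 4-itemset candidate.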

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following best describes the computational cost associated with F(k-1) × F(k-1) candidate generation as k increases?

It decreases linearly

It increases exponentially due to subset checks

It remains constant for large k

It becomes negligible due to pruning

3.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

In the context of candidate generation using F(k-1) × F(k-1), pruning is performed to:

Ensure the candidate is lexicographically sorted

Eliminate candidates containing any infrequent (k–1)-subset

Improve algorithm efficiency by reducing support count operations

Guarantee that only maximal itemsets are retained
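A minimal sketch of the pruning step being asked about (assuming the standard Apriori subset check, run before any support counting):

```python
from itertools import combinations

def prune(candidates, freq_prev):
    """Apriori pruning: discard any k-candidate that has at least one
    infrequent (k-1)-subset. Candidates are sorted tuples, so each
    (k-1)-combination is also sorted and can be looked up directly."""
    freq = {tuple(sorted(s)) for s in freq_prev}
    return {c for c in candidates
            if all(sub in freq for sub in combinations(c, len(c) - 1))}

L3 = [{"A", "B", "C"}, {"A", "B", "D"}, {"A", "C", "D"}, {"B", "C", "D"}]
print(prune({("A", "B", "C", "D")}, L3))  # all four 3-subsets are frequent
```

Because an infrequent subset disqualifies a candidate immediately, pruning cuts down the set of candidates whose support must be counted in the next database pass.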

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

In rule generation from frequent itemsets in the Apriori algorithm:

All non-empty subsets of a frequent itemset are considered for rule generation

Confidence is calculated as support(X ∪ Y) / support(X)

The lift of a rule is always greater than 1 for strong rules

Rules are retained only if they meet both minimum support and minimum confidence thresholds
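The confidence and lift formulas referenced above can be sketched directly (the support values here are hypothetical, chosen only for illustration):

```python
def confidence(support, X, Y):
    """conf(X -> Y) = support(X ∪ Y) / support(X)."""
    X, Y = frozenset(X), frozenset(Y)
    return support[X | Y] / support[X]

def lift(support, X, Y):
    """lift(X -> Y) = conf(X -> Y) / support(Y)."""
    return confidence(support, X, Y) / support[frozenset(Y)]

# hypothetical support values for illustration
support = {frozenset({"A"}): 0.6,
           frozenset({"B"}): 0.5,
           frozenset({"A", "B"}): 0.3}
print(confidence(support, {"A"}, {"B"}))  # 0.3 / 0.6 = 0.5
print(lift(support, {"A"}, {"B"}))        # 0.5 / 0.5 = 1.0
```

Note that in this example the lift is exactly 1, i.e. A and B are statistically independent even though the rule may clear a modest confidence threshold.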

5.

FILL IN THE BLANK QUESTION

1 min • 1 pt

When using the Apriori algorithm, generating rules from a frequent itemset of size k can result in up to __________ rules.
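Every non-empty proper subset of a size-k frequent itemset can serve as a rule antecedent, which is where the count comes from. A short enumeration sketch:

```python
from itertools import combinations

def all_rules(itemset):
    """Enumerate X -> Y for every non-empty proper subset X of the itemset."""
    items = frozenset(itemset)
    return [(set(lhs), set(items - frozenset(lhs)))
            for r in range(1, len(items))
            for lhs in combinations(sorted(items), r)]

print(len(all_rules({"A", "B", "C"})))  # 2**3 - 2 = 6
```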

6.

FILL IN THE BLANK QUESTION

1 min • 1 pt

How many times is the original transaction database scanned in the FP-Growth algorithm?

7.

FILL IN THE BLANK QUESTION

1 min • 1 pt

The path from any node to the root in an FP-tree represents a __________ of a transaction.
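The two questions above can be illustrated with a minimal FP-tree construction sketch (nodes as nested dicts, node counts and header links omitted for brevity; the transactions and `min_sup` are hypothetical):

```python
from collections import Counter

def build_fp_tree(transactions, min_sup):
    """Sketch of FP-tree construction with exactly two database scans.
    Scan 1 counts item supports; scan 2 inserts each transaction,
    filtered to frequent items and ordered by descending support."""
    counts = Counter(item for t in transactions for item in t)   # scan 1
    frequent = {i for i, c in counts.items() if c >= min_sup}
    tree = {}                                                    # nested dicts
    for t in transactions:                                       # scan 2
        ordered = sorted((i for i in t if i in frequent),
                         key=lambda i: (-counts[i], i))
        node = tree
        for item in ordered:
            node = node.setdefault(item, {})
    # every root-to-node path is a prefix of some (reordered) transaction
    return tree

tree = build_fp_tree([{"A", "B"}, {"A", "B", "C"}, {"A", "C"}], min_sup=2)
print(tree)
```

Because shared prefixes collapse into shared branches, the tree compresses the database while preserving every transaction's frequent-item content.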
