sem6_ai_week_8

University

48 Qs

Similar activities

Equipamentos de redes (Network Equipment)

University - Professional Development

50 Qs

Data Comms Reviewer for UT2 Part 2

University

50 Qs

Web App 2

12th Grade - University

46 Qs

I/O and Operators in C

University

43 Qs

CBT303_Quiz3

University

50 Qs

An toàn Web (Web Security)

University

43 Qs

Compiler Quiz 2

University

53 Qs

Nhập môn AI (Introduction to AI)

University

43 Qs

sem6_ai_week_8

Assessment

Quiz

Computers

University

Hard

Created by

Sujan Pandey

48 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which preprocessing step converts “Running” to “run”?

Lowercasing for word simplification

Stop‑word removal of common words

Stemming using Porter method

Lemmatization with vocabulary lookup
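
The suffix-stripping idea behind stemming can be sketched in a few lines. This is a toy illustration only, not the real Porter algorithm (which applies many more, carefully measured rules); the function name `toy_stem` is hypothetical:

```python
def toy_stem(word):
    """Toy Porter-style stemmer: lowercase, strip a few common suffixes.

    Illustrative only; the real Porter algorithm has many more rules.
    """
    w = word.lower()
    for suffix in ("ing", "ed", "s"):
        # Only strip if a reasonably long stem remains
        if w.endswith(suffix) and len(w) - len(suffix) >= 3:
            w = w[: -len(suffix)]
            break
    # Porter also collapses a doubled final consonant ("runn" -> "run")
    if len(w) >= 3 and w[-1] == w[-2] and w[-1] not in "aeiou":
        w = w[:-1]
    return w

print(toy_stem("Running"))  # run
```

Note that, unlike lemmatization, no dictionary lookup is involved: the output need not be a valid word.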

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In a bag‑of‑words model, the feature vector length equals the _____?

Number of documents in the corpus

Count of sentences per corpus

Size of chosen stop‑word list

Total unique tokens in vocabulary
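
A minimal bag-of-words sketch makes the dimensionality point concrete: the vector length is the vocabulary size, regardless of how many documents there are (helper names are illustrative):

```python
def build_vocab(docs):
    """Vocabulary = all unique tokens across the corpus, sorted for stable order."""
    return sorted({tok for doc in docs for tok in doc.lower().split()})

def bow_vector(doc, vocab):
    """Count occurrences of each vocabulary token in one document."""
    tokens = doc.lower().split()
    return [tokens.count(tok) for tok in vocab]

docs = ["the cat sat", "the dog sat on the mat"]
vocab = build_vocab(docs)          # 6 unique tokens
vec = bow_vector(docs[0], vocab)
assert len(vec) == len(vocab)      # feature length == vocabulary size, not doc count
```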

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

The TF component in TF‑IDF is commonly computed as _____?

Binary presence for each term

Log of inverse term length

Raw position index in text

Term frequency within document
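
The "term frequency within document" option can be shown directly with a raw count (other common TF variants divide by document length or use 1 + log(count)):

```python
from collections import Counter

def term_frequency(doc_tokens):
    """TF: raw count of each term within a single document."""
    return Counter(doc_tokens)

tf = term_frequency("the cat sat on the mat".split())
print(tf["the"])  # 2
```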

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Given N documents, IDF = log(N / df), where df is _____?

Document format numeric code

Desired feature vector size

Distinct feature hash bucket

Number of docs containing term
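
The formula from the question translates almost line for line into code; here each document is modeled simply as a set of tokens:

```python
import math

def idf(term, docs):
    """IDF = log(N / df), where df = number of documents containing the term."""
    N = len(docs)
    df = sum(1 for d in docs if term in d)
    return math.log(N / df)

docs = [{"cat", "sat"}, {"dog", "sat"}, {"cat", "mat"}]
# "sat" appears in 2 of 3 documents, so idf = log(3/2)
print(idf("sat", docs))
```

Rare terms (small df) get large IDF weights; a term in every document gets log(N/N) = 0.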

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

One‑hot vectors are extremely _____ compared with dense embeddings.

Computationally fast for GPUs

Contextually aware of semantics

Compact due to low dimension

Sparse and memory inefficient
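
The sparsity claim is easy to quantify: a one-hot vector over a realistic vocabulary has exactly one nonzero entry, while a dense embedding (typically a few hundred dimensions) has none wasted:

```python
def one_hot(index, vocab_size):
    """One-hot encoding: a vocab_size-dim vector with a single 1."""
    vec = [0] * vocab_size
    vec[index] = 1
    return vec

v = one_hot(7, 50_000)               # 50k is a plausible vocabulary size
nonzero = sum(1 for x in v if x)
# 1 nonzero out of 50,000 entries: the vector is ~99.998% zeros,
# versus a dense 300-dim embedding where every entry carries signal.
```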

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Word2Vec’s Skip‑gram objective maximizes probability of _____?

Document class given features

Global co‑occurrence statistics

Subword n‑gram predictions

Context words given target word
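
The skip-gram training data makes the objective concrete: for each target word, the model is trained to predict the surrounding context words. A minimal pair generator (function name is illustrative):

```python
def skipgram_pairs(tokens, window=2):
    """Emit (target, context) pairs within a symmetric window.

    Skip-gram maximizes P(context | target) over exactly these pairs.
    """
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

pairs = skipgram_pairs("the quick brown fox".split(), window=1)
# e.g. ("quick", "the") and ("quick", "brown") are among the pairs
```

CBOW inverts this, predicting the target from its averaged context.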

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In negative sampling, “k” most commonly refers to _____?

Kernel width inside network

Layers stacked in the model

Context window size chosen

Randomly drawn negative samples
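
Negative sampling can be sketched as drawing k "wrong" words per positive (target, context) pair. Word2Vec weights the draw by unigram frequency raised to the 3/4 power; uniform sampling is used here to keep the sketch short:

```python
import random

def sample_negatives(vocab, positive, k=5, rng=random):
    """Draw k negative words that are not the true context word.

    Sketch only: word2vec samples from the unigram distribution ** 0.75,
    not uniformly as done here.
    """
    candidates = [w for w in vocab if w != positive]
    return [rng.choice(candidates) for _ in range(k)]

negs = sample_negatives(["cat", "dog", "mat", "sat", "the"], positive="dog", k=3)
assert len(negs) == 3 and "dog" not in negs
```

Training then only updates the positive pair plus these k negatives, instead of a full softmax over the whole vocabulary.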
