Transformer Quiz


Similar activities

Modelo OSI e TCP/IP (2) • 10th Grade - University • 10 Qs

Unit: IV MongoDB and Node.js Quiz • 12th Grade - University • 14 Qs

FHCT1012 Computing Technology Sample Midterm questions • University • 15 Qs

Cloud computing quiz • University • 10 Qs

Cloud Computing Quiz • 12th Grade - University • 15 Qs

Deep learning Batch 1 • University • 10 Qs

DPMQ2 • University • 15 Qs

EXCEL TEST • University • 10 Qs

Assessment • Quiz • Information Technology (IT) • University • Hard

Created by Elakkiya E

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the main problem with Recurrent Neural Networks (RNN)?

Fast computation for long sequences

No problems at all

Difficulty in accessing information from long ago in the sequence

No issues with gradients
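(Review note, not part of the quiz: the "long ago" problem arises because back-propagating through many time steps multiplies the gradient by the recurrent Jacobian at every step; when its magnitude is below 1, the gradient shrinks geometrically. A minimal numeric sketch, with an assumed scalar recurrent weight for illustration:)

```python
# Back-propagating through T RNN steps multiplies the gradient by the
# recurrent weight at each step. With |weight| < 1 the gradient decays
# geometrically, so distant inputs barely influence the loss.
def gradient_scale(weight: float, steps: int) -> float:
    """Magnitude of a scalar gradient after `steps` multiplications."""
    return abs(weight) ** steps

recent = gradient_scale(0.9, 5)      # a few steps back: still sizeable
distant = gradient_scale(0.9, 100)   # 100 steps back: vanishingly small
```

This is the vanishing-gradient view of why RNNs struggle with long-range dependencies, which self-attention sidesteps by connecting every pair of positions directly.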

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of positional encoding in the Transformer model?

To increase the model size

To add noise to the data

To represent the position of words in a sentence

To remove unnecessary words
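(Review note, not part of the quiz: a minimal NumPy sketch of the sinusoidal positional encoding, following the standard sin/cos formulation; dimensions here are illustrative.)

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    Each position gets a unique pattern that is added to the word
    embeddings, so the order-agnostic attention layers can see word order."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]    # (1, d_model // 2)
    angles = pos / 10000 ** (i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)             # even dimensions
    pe[:, 1::2] = np.cos(angles)             # odd dimensions
    return pe

pe = positional_encoding(50, 16)   # shape (50, 16); row 0 is [0, 1, 0, 1, ...]
```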

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the function of Self-Attention in the Transformer model?

To add random words

To remove all words except the most important one

To ignore certain words

To relate words to each other
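(Review note, not part of the quiz: a single-head scaled dot-product self-attention sketch in NumPy. The random weights are placeholders; a trained model learns them.)

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention (one head).
    Each token's output is a weighted mix of all value vectors, with
    weights from query-key similarity -- this is how words are related
    to each other."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (seq, seq) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                 # 4 tokens, model dim 8
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, wq, wk, wv)         # shape (4, 8)
```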

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of layer normalization in the Transformer model?

To speed up computation

To make the model causal

To introduce fluctuations in the data

To remove all layers except the last one
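(Review note, not part of the quiz: layer normalization standardizes each token's feature vector, keeping activations in a stable range so deep Transformer stacks train reliably. A minimal sketch with scalar gain/bias for simplicity:)

```python
import numpy as np

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each token's features to zero mean and unit variance
    over the last axis, then apply a learnable scale and shift."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.array([[1.0, 2.0, 3.0, 4.0]])
y = layer_norm(x)   # per-row mean ~0, standard deviation ~1
```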

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the goal of Masked Multi-Head Attention in the Transformer model?

To make the model causal

To remove attention from the model

To focus on future words only

To ignore the first word
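(Review note, not part of the quiz: "causal" means position i may only attend to positions ≤ i. A common way to enforce this is adding a mask of -inf to the attention scores before the softmax, so future positions get zero weight. A minimal NumPy sketch:)

```python
import numpy as np

def causal_mask(seq_len: int) -> np.ndarray:
    """Additive attention mask: 0 on and below the diagonal (visible),
    -inf strictly above it (future positions, zeroed out by the softmax)."""
    return np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

m = causal_mask(4)
# Row 0 can only see token 0; row 3 sees tokens 0..3.
```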

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which task is typically associated with an Encoder Only Transformer?

Machine translation

Text summarization

Sentiment analysis

Text generation

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which task is typically associated with a Decoder Only Transformer?

Chatbots

Question-answering

Text classification

Anomaly detection
