Recurrent Neural Networks and Transformer Models

Assessment

Flashcard

Computers

9th Grade

Practice Problem

Easy

Created by Abhishek Sharma


11 questions

1.

FLASHCARD QUESTION

Front

What are recurrent neural networks commonly used for?

Back

Sequence modeling and transduction problems such as language modeling and machine translation.
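The recurrence behind this answer can be sketched in a few lines. This is a minimal vanilla tanh RNN in NumPy; the function and variable names are illustrative, not from any particular library.

```python
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Run a simple tanh RNN over a sequence of input vectors.

    x_seq: array of shape (T, input_dim).
    Returns all hidden states, shape (T, hidden_dim).
    """
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in x_seq:  # each step reuses the same weights
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

# Toy usage with random weights (no training, just the forward pass).
rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4
x = rng.normal(size=(T, d_in))
W_xh = rng.normal(size=(d_h, d_in)) * 0.1
W_hh = rng.normal(size=(d_h, d_h)) * 0.1
b_h = np.zeros(d_h)
h_all = rnn_forward(x, W_xh, W_hh, b_h)
print(h_all.shape)  # (5, 4)
```

The hidden state `h` acts as a running summary of the sequence, which is what makes these models natural for language modeling and translation.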

2.

FLASHCARD QUESTION

Front

What are the two types of recurrent neural networks mentioned?

Back

Long short-term memory (LSTM) networks and gated recurrent neural networks (GRUs).
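An LSTM differs from a plain RNN by adding learned gates around a cell state. A sketch of one step, assuming a stacked weight matrix whose four row blocks hold the gate parameters (a common but not universal layout):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (4*hidden, input+hidden); the four
    row blocks produce the input (i), forget (f), and output (o)
    gates and the candidate cell update (g)."""
    d = h_prev.shape[0]
    z = W @ np.concatenate([x_t, h_prev]) + b
    i = sigmoid(z[0:d])        # input gate: how much new info to write
    f = sigmoid(z[d:2*d])      # forget gate: how much old state to keep
    o = sigmoid(z[2*d:3*d])    # output gate: how much state to expose
    g = np.tanh(z[3*d:4*d])    # candidate values
    c = f * c_prev + i * g     # cell state carries long-range information
    h = o * np.tanh(c)
    return h, c

# Toy usage with random weights.
rng = np.random.default_rng(1)
d_in, d_h = 3, 4
W = rng.normal(size=(4 * d_h, d_in + d_h)) * 0.1
b = np.zeros(4 * d_h)
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h), W, b)
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow across many time steps.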

3.

FLASHCARD QUESTION

Front

What is a key limitation of recurrent models in training?

Back

Their inherently sequential computation precludes parallelization within a training example: each hidden state depends on the previous one, so the time steps cannot be computed simultaneously.

4.

FLASHCARD QUESTION

Front

What recent advancements have improved computational efficiency in recurrent models?

Back

Factorization tricks and conditional computation.

5.

FLASHCARD QUESTION

Front

What role do attention mechanisms play in sequence modeling?

Back

They allow modeling of dependencies without regard to their distance in the input or output sequences.
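The distance-independence of attention can be seen directly in scaled dot-product attention, the mechanism the Transformer uses: every query is compared against every key in a single matrix product. A NumPy sketch:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    The score matrix relates every query position to every key
    position at once, so the cost of connecting two positions does
    not depend on how far apart they are in the sequence.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                            # weighted sum of values

# Toy usage: 6 positions, 8-dimensional vectors.
rng = np.random.default_rng(2)
Q = rng.normal(size=(6, 8))
K = rng.normal(size=(6, 8))
V = rng.normal(size=(6, 8))
out = scaled_dot_product_attention(Q, K, V)
```

Because the whole computation is a pair of matrix multiplications and a softmax, it also parallelizes across positions, unlike the step-by-step RNN loop.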

6.

FLASHCARD QUESTION

Front

What is the main innovation of the Transformer model?

Back

It relies entirely on an attention mechanism to draw global dependencies between input and output, eschewing recurrence.

7.

FLASHCARD QUESTION

Front

How does the Transformer model improve parallelization?

Back

Because it contains no recurrence, all positions in a sequence can be processed simultaneously, allowing significantly more parallelization during training.
