Are Bigger Language Models Better? | DeepMind Gopher and RETRO

Assessment

Interactive Video

Computers

11th Grade - University

Practice Problem

Hard

Created by

Wayground Content

7 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the parameter size of DeepMind's Gopher model?

530 billion

280 billion

175 billion

100 billion

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which model is larger than Gopher in terms of parameters?

BERT

RETRO

Megatron-Turing NLG

GPT-3

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In which areas did Gopher show improved performance?

Art and Design

STEM and Medicine

Sports and Entertainment

Finance and Economics

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What issue does Gopher still face despite increased size?

Limited data storage

Increased computational cost

Higher toxicity and bias

Decreased accuracy

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How does the CO2 emission from training Gopher compare to a transatlantic flight?

Less than a flight

No emissions

Equal to a flight

More than a flight

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the parameter size of the RETRO model?

530 billion

7 billion

175 billion

280 billion

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How does RETRO improve the quality of its output?

By increasing parameters

By using a retrieval database

By reducing training time

By using more data
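The idea behind the correct answer above can be sketched in a few lines: rather than growing the model's parameter count, a retrieval-enhanced model like RETRO looks up nearest-neighbour text chunks in an external database and conditions its output on them. The minimal sketch below uses simple word-overlap scoring and illustrative function names; it is not RETRO's actual mechanism (RETRO retrieves by embedding similarity over a roughly 2 trillion token database and attends to the retrieved chunks inside the model).

```python
def tokenize(text):
    """Split text into a set of lowercased words (toy stand-in for embeddings)."""
    return set(text.lower().split())

def retrieve(query, database, k=2):
    """Return the k database chunks with the most word overlap with the query."""
    return sorted(
        database,
        key=lambda chunk: len(tokenize(chunk) & tokenize(query)),
        reverse=True,
    )[:k]

def build_prompt(query, database):
    """Prepend retrieved chunks to the query, as a retrieval-enhanced model would."""
    context = "\n".join(retrieve(query, database))
    return f"{context}\n\nQuestion: {query}"

# Tiny illustrative database of text chunks.
database = [
    "Gopher is a 280 billion parameter language model from DeepMind.",
    "RETRO conditions its predictions on chunks retrieved from an external text database.",
    "The Eiffel Tower is located in Paris.",
]

prompt = build_prompt("How does RETRO use its retrieval database?", database)
```

Here `retrieve` surfaces the chunk about RETRO first because it shares the most words with the query, so the model sees relevant supporting text without needing more parameters.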
