Are Bigger Language Models Better? | DeepMind Gopher and RETRO

Assessment

Interactive Video

Created by

Quizizz Content

Computers

11th Grade - University

Hard

7 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the parameter size of DeepMind's Gopher model?

530 billion

280 billion

175 billion

100 billion

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which model is larger than Gopher in terms of parameters?

BERT

RETRO

Megatron-Turing

GPT-3

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In which areas did Gopher show improved performance?

Art and Design

STEM and Medicine

Sports and Entertainment

Finance and Economics

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What issue does Gopher still face despite increased size?

Limited data storage

Increased computational cost

Higher toxicity and bias

Decreased accuracy

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How does the CO2 emission from training Gopher compare to a transatlantic flight?

Less than a flight

No emissions

Equal to a flight

More than a flight

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the parameter size of the RETRO model?

530 billion

7 billion

175 billion

280 billion

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How does RETRO improve the quality of its output?

By increasing parameters

By using a retrieval database

By reducing training time

By using more data
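Question 7 points at the core idea behind RETRO: instead of storing all world knowledge in model parameters, the model looks up relevant passages in an external database and conditions its output on them. The sketch below is a toy illustration of that retrieval step only, using a simple token-overlap match; RETRO itself uses neural embeddings and approximate nearest-neighbour search over a far larger corpus, and the example database sentences here are illustrative, not DeepMind's data.

```python
import string

def tokenize(text):
    """Lowercase, strip punctuation, and split into a set of tokens."""
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())

def retrieve(query, database):
    """Return the passage sharing the most tokens with the query
    (a stand-in for RETRO's nearest-neighbour lookup)."""
    query_tokens = tokenize(query)
    return max(database, key=lambda passage: len(query_tokens & tokenize(passage)))

# Toy retrieval database (illustrative sentences, not RETRO's corpus).
database = [
    "Gopher is a 280 billion parameter language model from DeepMind.",
    "RETRO augments a 7 billion parameter model with a retrieval database.",
    "Training very large language models has a significant carbon footprint.",
]

context = retrieve("What does RETRO use a retrieval database for?", database)
print(context)
```

A generator would then receive the retrieved passage alongside the prompt, which is how a 7 billion parameter RETRO model can match much larger models: the knowledge lives in the database rather than the weights.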