tech_quiz

Professional Development

10 Qs


Similar activities

- Pehchan Kaun??? • Professional Development • 10 Qs
- SIMPLIFICATION/APPROXIMATION • 12th Grade, Professional Development • 10 Qs
- FACTORS AND MULTIPLES • 1st Grade, Professional Development • 10 Qs
- SIDP Recap • Professional Development • 10 Qs
- Numerical Methods • University, Professional Development • 15 Qs
- Clocks and Calendars • 11th Grade, Professional Development • 15 Qs
- In Center of Gravity • Professional Development • 10 Qs
- Rational Numbers - Monday • Professional Development • 10 Qs

Assessment • Quiz • Computers • Professional Development • Practice Problem • Hard

Created by Sai Akella • Used 4+ times


10 questions

1.

MULTIPLE CHOICE QUESTION

20 sec • 10 pts

[Question and answer choices A–E shown as an image]

2.

MULTIPLE CHOICE QUESTION

10 sec • 2 pts

What is the purpose of setting the model to evaluation mode with model.eval() in PyTorch?

To initialize the model parameters.

To disable gradient computation during training.

To ensure layers like dropout and batch normalization behave correctly during inference.

To load the pre-trained weights of the model.
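For context, a minimal sketch of the behavior the correct answer describes (assuming PyTorch is installed): `model.eval()` switches the module out of training mode, so layers like `Dropout` act as the identity and `BatchNorm` uses its running statistics during inference.

```python
import torch
import torch.nn as nn

# A tiny model containing a layer whose behavior differs between train and eval.
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

model.eval()  # switch dropout (and batchnorm, if present) to inference behavior
x = torch.ones(1, 4)

# In eval mode, dropout is the identity, so repeated forward passes agree.
y1 = model(x)
y2 = model(x)
print(model.training)       # False
print(torch.equal(y1, y2))  # True
```

Note that `model.eval()` does not disable gradient computation; that is the separate job of `torch.no_grad()`.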

3.

MULTIPLE SELECT QUESTION

10 sec • 2 pts

Which of the following code snippets correctly initializes a ResNet50 model without pre-trained weights in PyTorch?

[Answer choices shown as code-snippet images]

4.

MULTIPLE CHOICE QUESTION

10 sec • 5 pts

[Code snippet shown as an image]

What does the following code snippet do in the context of PyTorch model inferencing?

It disables gradient computation and performs inference.

It initializes the model parameters without gradient computation.

It enables gradient computation and performs inference.

It computes gradients and performs inference.
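A minimal sketch of the pattern the snippet presumably shows (assuming PyTorch): wrapping the forward pass in `torch.no_grad()` disables autograd tracking, so no computation graph is built and the outputs do not require gradients.

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 2)
model.eval()

x = torch.randn(1, 3)
with torch.no_grad():  # autograd is off inside this context
    y = model(x)

print(y.requires_grad)  # False: no computation graph was recorded
```

Skipping graph construction also reduces memory use during inference, which is why this pairing of `model.eval()` and `torch.no_grad()` is the standard inference idiom.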

5.

MULTIPLE CHOICE QUESTION

20 sec • 5 pts

What is the primary advantage of using model parallelism in large language model inference?

Reducing the inference time by processing multiple inputs in parallel.

Distributing the model's computations across multiple GPUs to handle large models that cannot fit into the memory of a single GPU.

Increasing the precision of the model's predictions.

Improving the training speed of the model.
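To illustrate the idea (a toy sketch, not a production setup): model parallelism places different parts of one model on different devices, so a model too large for a single GPU's memory can still run. The device names below fall back to CPU when fewer than two GPUs are available.

```python
import torch
import torch.nn as nn

# Pick two devices; fall back to CPU so the sketch runs anywhere.
dev0 = "cuda:0" if torch.cuda.device_count() >= 1 else "cpu"
dev1 = "cuda:1" if torch.cuda.device_count() >= 2 else dev0

class TwoStageModel(nn.Module):
    """Each stage lives on its own device; activations move between them."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Linear(16, 32).to(dev0)
        self.stage2 = nn.Linear(32, 8).to(dev1)

    def forward(self, x):
        x = self.stage1(x.to(dev0))
        return self.stage2(x.to(dev1))  # activation transferred to dev1

model = TwoStageModel()
with torch.no_grad():
    out = model(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 8])
```

Real deployments split the model at layer boundaries the same way, but add pipelining or tensor-level sharding to keep all devices busy.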

6.

MULTIPLE CHOICE QUESTION

10 sec • 10 pts

What is the purpose of the torch.no_grad() context in the inference of large language models?

To enable gradient computation for backpropagation.

To reduce memory usage by disabling gradient computation.

To speed up the training process.

To initialize model weights.

7.

MULTIPLE CHOICE QUESTION

20 sec • 5 pts

When using a pre-trained large language model for inference, what is the recommended way to handle tokenization?

Implement a custom tokenization algorithm.

Use the tokenization method provided by the model's pre-trained package.

Use a generic tokenization method.
