

LARGE LANGUAGE MODELS (LLMs)
Presentation • English • 5th Grade • Practice Problem • Medium • Standards-aligned
Nabilah Zin
38 Slides • 8 Questions
2
HOW LARGE LANGUAGE MODELS (LLMs) WORK
V06
HOW XIAN NENG
JANANEE A/P SUBRAMANIAM
SITI NABILAH BINTI MUHAMAD ZIN
3
ICE-BREAKING
SESSION
“GUESS THE AI”
4
Word Cloud
If you want to ask AI a question, what would it be?
5
INTRODUCTION
Large Language Models (LLMs) are a type of artificial intelligence that has significantly transformed the landscape of natural language processing (NLP). These models are trained on vast amounts of text data, allowing them to understand, generate, and manipulate human language with remarkable proficiency.
6
DESCRIBE THE USES OF LLM FOUNDATION MODELS
NLP: Text generation, translation, and sentiment analysis.
CONVERSATIONAL AI: Powering chatbots and virtual assistants for natural interactions.
TRANSFER LEARNING: Fine-tuning for specific tasks, saving time and resources.
MULTIMODAL APPLICATIONS: Processing text, images, and audio for tasks like image captioning.
7
DESCRIBE THE USES OF LLM FOUNDATION MODELS
CONTENT CREATION: Assisting in writing, generating articles, and automating reports.
PERSONALIZATION: Tailoring recommendations based on user preferences.
ACCESSIBILITY: Enhancing tools for individuals with disabilities (e.g., speech-to-text).
RESEARCH AND DEVELOPMENT: Supporting innovation in AI through versatile applications.
8
KEY COMPONENTS OF LARGE LANGUAGE MODELS (LLMs)
TOKENIZATION: the process of breaking down text into smaller units called tokens, which can be words, subwords, or characters.
EMBEDDING: the process of converting text tokens (words, subwords, or characters) into dense numerical vectors (fixed-length arrays of numbers).
TRANSFORMER: a deep learning model architecture designed to handle sequential data, particularly well-suited for natural language processing tasks.
TRADITIONAL MODEL
ATTENTION MECHANISM
9
[Diagram: TRADITIONAL MODEL vs TRANSFORMER — Tokenization, Word Embedding, Attention Mechanism]
10
WHAT IS TOKENIZATION?
Tokenization is the process of dividing text into smaller units called tokens, which can be words, sub-words, or characters.
PURPOSE?
Helps machines understand and process natural language by converting unstructured text into structured, analyzable units.
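The idea above can be tried in a few lines of Python: the simplest form of tokenization just splits a sentence on spaces. (Real tokenizers are smarter about punctuation and sub-words; this is only a first sketch.)

```python
# Simplest possible (word-level) tokenization: split on whitespace.
# Note that punctuation stays attached ("works.") -- real tokenizers
# handle punctuation and sub-words separately.
sentence = "Let's learn about tokenisation and how LLM works."
tokens = sentence.split()
print(tokens)
print(len(tokens), "tokens")
```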
11
TOKENIZATION
How many meaning units are there in the sentence?
Let's learn about tokenisation and how LLM works.
Word-Level Tokenisation
Sub-Word Level Tokenisation
12
TRY TO TOKENISE THE FOLLOWING EXERCISE:
Becky enjoyed teaching three-year-olds.
word: "Becky" "enjoyed" "teaching" "three-year-olds"
13
Multiple Choice
"Becky enjoyed teaching three-year-olds"
How many sub-words are there in the sentence?
11
12
14
TRY TO TOKENISE THE FOLLOWING EXERCISE:
Becky enjoyed teaching three-year-olds.
word: "Becky" "enjoyed" "teaching" "three-year-olds"
sub-word: "Be", "##cky", "enjoy", "##ed", "teach", "##ing", "three", "-", "year", "-", "olds"
15
TRY TO TOKENISE THE FOLLOWING EXERCISE:
Becky enjoyed teaching three-year-olds.
word: "Becky" "enjoyed" "teaching" "three-year-olds"
sub-word: "Be", "##cky", "enjoy", "##ed", "teach", "##ing", "three", "-", "year", "-", "old", "s"
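The sub-word split above can be reproduced with a small greedy longest-match tokenizer, the approach used by WordPiece-style tokenizers. The tiny vocabulary below is hand-picked for this one sentence and is purely illustrative (real models learn vocabularies of roughly 30,000 pieces); the "##" prefix marks a piece that continues the previous one, so the slide's final "s" appears here as "##s".

```python
# A minimal greedy longest-match (WordPiece-style) subword tokenizer.
# The vocabulary is hand-built for this one example sentence.
VOCAB = {"Be", "##cky", "enjoy", "##ed", "teach", "##ing",
         "three", "-", "year", "old", "##s"}

def wordpiece(word):
    """Split one word into the longest matching vocabulary pieces, left to right."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            piece = word[start:end] if start == 0 else "##" + word[start:end]
            if piece in VOCAB:
                pieces.append(piece)
                break
            end -= 1
        if end == start:          # no piece matched: give up on this word
            return ["[UNK]"]
        start = end
    return pieces

def tokenize(sentence):
    tokens = []
    # Treat hyphens as separate tokens, as in the slide's answer.
    for word in sentence.replace("-", " - ").split():
        tokens.extend(wordpiece(word))
    return tokens

print(tokenize("Becky enjoyed teaching three-year-olds"))
```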
16
Embedding - Converting meaning to numbers
How do you make a computer understand meaning?
i) Break words down into "meaning units"
ii) Assign each "meaning unit" a number
iii) Put all these numbers into a vector
King (human, male): [0.1, 0.0]
Queen (human, female): [0.1, 1.0]
Ox (animal, male): [0.2, 0.0]
Cow (animal, female): [0.2, 1.0]
This enables machines to group words based on their meaning: to find all the words that mean "female", just look at the second vector value.
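The slide's toy vectors can be checked directly: with these two-dimensional embeddings, grouping the "female" words is just a matter of reading the second value.

```python
# The toy two-dimensional embeddings from the slide:
# first value: human (0.1) vs animal (0.2); second value: male (0.0) vs female (1.0).
embeddings = {
    "King":  [0.1, 0.0],
    "Queen": [0.1, 1.0],
    "Ox":    [0.2, 0.0],
    "Cow":   [0.2, 1.0],
}

# To find every "female" word, just look at the second vector value.
female = [word for word, vec in embeddings.items() if vec[1] == 1.0]
print(female)  # ['Queen', 'Cow']
```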
17
Multiple Choice
Which of the following might represent “prince”?
[0.34, 0.12]
[0.25, 0.65]
[0.56, 0.70]
[0.78, 0.21]
18
Is there a problem with this algorithm?
19
The Problem
i) What if the same word has completely different meanings? "bank", "bank"
ii) What if they are close enough but different? "kid", "kid", "kid"?
20
Solution: Contextualisation
21
WHY CONTEXT
MATTERS:
Words can mean different things in different sentences. For
example, the word “bank” could mean:
A financial institution: “I need to go to the bank.”
The side of a river: “We sat by the river bank.”
22
Word map (embedding space)
[Diagram: words as vectors in the embedding space; "bank" sits between a money cluster ("money", "financial") and a river cluster ("river bank", "shore", "water", "side of river")]
23
Word map (embedding space)
money: [0.45, -0.87, 1.02, -1.23, 0.76]
bank: [0.46, -0.89, 1.03, -1.20, 0.74]
Since these two vectors are very close (the numbers are similar), the AI knows that "money" and "bank" are related. You can think of it like a set of coordinates that help the AI understand relationships between words.
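A common way to measure how close two vectors are is cosine similarity, where a value near 1.0 means "pointing the same way". A minimal sketch using the slide's five-dimensional vectors:

```python
import math

money = [0.45, -0.87, 1.02, -1.23, 0.76]
bank  = [0.46, -0.89, 1.03, -1.20, 0.74]

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# The two vectors from the slide are almost identical in direction.
print(round(cosine_similarity(money, bank), 3))
```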
24
How does an LLM know what to focus on?
The cat lay on the carpet because it was tired.
Are all the words equally important?
Attention Mechanism
Attention Score
25
Reorder
Order the following words according to Attention Score from high to low for the sentence:
The cat lay on the carpet because it was tired.
When the LLM is trying to figure out the meaning of "it":
cat
lay
tired
carpet
because
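The ranking can be illustrated with a toy attention calculation: dot products between a query vector for "it" and key vectors for the other words, pushed through a softmax so the weights sum to 1. All the numbers below are invented for illustration only; real models learn these vectors, and they have hundreds of dimensions.

```python
import math

# Toy attention for "The cat lay on the carpet because it was tired":
# how strongly does "it" attend to each earlier word?
words = ["cat", "lay", "carpet", "because", "tired"]
keys = {"cat": [0.9, 0.8], "lay": [0.2, 0.1], "carpet": [0.5, 0.3],
        "because": [0.1, 0.0], "tired": [0.8, 0.7]}
query_it = [1.0, 0.9]  # invented query vector for "it"

# Attention score = dot(query, key); softmax turns scores into weights.
scores = [sum(q * k for q, k in zip(query_it, keys[w])) for w in words]
exps = [math.exp(s) for s in scores]
attention = [e / sum(exps) for e in exps]

for word, weight in sorted(zip(words, attention), key=lambda p: -p[1]):
    print(f"{word:8s} {weight:.2f}")
```

With these made-up vectors, "cat" and "tired" get the highest weights, which matches the intuition that "it" refers to the cat and is explained by "tired".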
26
How does the Attention Mechanism work?
Think of two sentences that use the same word with different meanings.
27
How does the Attention Mechanism work?
Think of two sentences that use the same word with different meanings.
I go to the bank to get some money
I go to the bank to fish
28
How does the Attention Mechanism work?
Use AI to learn about AI:
I go to the bank to get some money
I go to the bank to fish
ChatGPT Prompt:
Explain how you differentiate these two sentences using the attention mechanism.
29
How do LLMs predict?
Activity:
Think of a word that doesn't make sense to complete the following sentence:
The cat lay on the carpet because it was ________.
30
Ask the AI to explain itself
Activity:
Prompt 1:
Complete the following sentence with a word that doesn't make sense.
Prompt 2:
31
Open Ended
Share your prompt!
32
Ask the AI to explain itself (Use ChatGPT)
Activity:
Prompt 1:
Complete the following sentence with a word that doesn't make sense.
Prompt 2:
How did you use the attention mechanism to determine that this word doesn't make sense?
33
LLMS (LARGE LANGUAGE MODELS)
The LLM doesn't just guess randomly; it uses everything it knows to pick the best word based on what's already been said.
34
STEP-BY-STEP: HOW THE LLM PREDICTS THE NEXT WORD
Let's say you start a sentence: "Once upon a…"
Now, the LLM needs to predict what comes next. To do that, it plays a word-guessing game!
1. Look at the context: The LLM sees "Once upon a…" and knows this usually starts a fairy tale.
2. Guess the next word: It thinks, "What word would make sense here?" Words like "time," "princess," or "story" might all be possibilities.
35
STEP-BY-STEP: HOW THE LLM PREDICTS THE NEXT WORD
3. Assign probabilities: What word should come next? For example:
"time" might have a high probability (because "Once upon a time" is a common phrase).
"princess" might have a lower probability (it could make sense but isn't as common as "time").
"banana" would have a very low probability (it doesn't really fit a fairy tale).
36
THE PROCESS CONTINUES:
4. Pick the word with the highest probability: The LLM chooses the word that has the highest score; in this case, it will likely choose "time."
So, the sentence becomes: "Once upon a time…"
This process continues, word by word, until the sentence is complete!
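Picking the highest-probability word is just an argmax over the candidates. The probabilities below are invented to mirror the slide's "Once upon a…" example:

```python
# Invented next-word probabilities for the context "Once upon a ...".
next_word_probs = {"time": 0.85, "princess": 0.10, "banana": 0.001}

# Greedy decoding: always take the most probable candidate.
best = max(next_word_probs, key=next_word_probs.get)
print("Once upon a " + best + "...")  # Once upon a time...
```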
37
HOW LLMs CALCULATE PROBABILITY
(how likely something is to happen)
Imagine LLMs (Large Language Models) read 100 sentences of:
The sky is ____.
90 sentences continue with "blue"; the other 10 continue with "grey".
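The counting idea above maps directly to code: the probability of each continuation is its count divided by the total number of sentences seen.

```python
# 100 training sentences ending "The sky is ___": 90 say "blue", 10 say "grey".
counts = {"blue": 90, "grey": 10}
total = sum(counts.values())

# Probability of each continuation = its count / total.
probs = {word: n / total for word, n in counts.items()}
print(probs)                       # {'blue': 0.9, 'grey': 0.1}
print(max(probs, key=probs.get))   # blue
```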
38
The cat, which was very fluffy, jumped over the fence.
The cat jumped over the fence.
How to use LLMs to produce a picture:
39
How to use LLMs to produce text:
40
Multiple Choice
Imagine you’re writing a magical story with an LLM. You start with:
“The wizard waved his wand and turned the…”
Which of these words do you think the LLM is most likely to choose next?
Castle
Car
41
Multiple Choice
Pretend the LLM is a word detective trying to complete the sentence:
“The dog ran quickly to the…”
The LLM looks at the sentence and guesses the next word. Which of these words would probably get the highest probability?
Park
Kitchen
42
SUMMARY
43
KEY COMPONENTS OF LARGE LANGUAGE MODELS (LLMs)
TOKENIZATION: the process of breaking down text into smaller units called tokens, which can be words, subwords, or characters.
EMBEDDING: the process of converting text tokens (words, subwords, or characters) into dense numerical vectors (fixed-length arrays of numbers).
TRANSFORMER: a deep learning model architecture designed to handle sequential data, particularly well-suited for natural language processing tasks.
TRADITIONAL MODEL
ATTENTION MECHANISM
44
LLMS (LARGE LANGUAGE MODELS)
To understand human language
To produce materials in natural human language
45
LLMS (LARGE LANGUAGE MODELS)
46
Poll
Do you now understand what Large Language Models (LLMs) are?
Yes
No