From Coin Toss to LLM — Understanding Random Variables

DEV Community · by siva1b3 · April 1, 2026 · 6 min read

A beginner-friendly guide to probability and random variables — no math background needed.

1. What is Probability?

Probability is a number that measures how likely something is to happen.

This number is always between 0 and 1:

| Value | Meaning |
| --- | --- |
| 0 | Impossible — will never happen |
| 1 | Certain — will always happen |
| 0.5 | Equal chance — may or may not happen |

Example

Flip a fair coin. Two outcomes are possible — heads or tails. Neither is more likely than the other.

So the probability of heads = 1/2 = 0.5, and the probability of tails = 1/2 = 0.5.

One important rule — the probabilities of all possible outcomes must always add up to 1.

0.5 + 0.5 = 1 ✓
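The sum-to-1 rule is easy to check in code. A minimal sketch in Python — the dictionaries here are illustrative, just mapping each outcome to its probability:

```python
# Probabilities of all possible outcomes must sum to 1.
coin = {"heads": 0.5, "tails": 0.5}
die = {face: 1 / 6 for face in range(1, 7)}  # six equally likely faces

print(sum(coin.values()))  # 1.0
print(sum(die.values()))   # 1 (up to floating-point rounding)
```

Note that with fractions like 1/6, floating-point rounding can make the sum land a hair away from exactly 1 — close enough for our purposes.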

What happens in real life?

If you flip a coin 10 times, you might get 6 heads and 4 tails. That is normal.

But if you flip 10,000 times, the result will get very close to 50% heads and 50% tails.

The more experiments you run, the closer you get to the true probability.
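You can watch this happen with a quick simulation. This is a sketch using Python's standard `random` module; the seed is fixed only so the run is reproducible:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def fraction_of_heads(num_flips):
    """Flip a fair coin num_flips times; return the fraction that came up heads."""
    heads = sum(random.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

for n in (10, 100, 10_000):
    print(n, fraction_of_heads(n))
```

With 10 flips the fraction can wander well away from 0.5; by 10,000 flips it hugs 0.5 closely.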

2. What is a Random Variable?

Think of a random variable as an empty slot.

  • Before the experiment — the slot is empty

  • You run the experiment

  • After the experiment — the slot is filled with a number

Why do we need it?

Outcomes are often text — "heads", "tails", "win", "lose". Mathematics works with numbers, not words.

So we assign a number to each possible outcome. This is what a random variable does.

Example — Coin Toss

Before the experiment: Slot is empty. Two possible outcomes exist.

| Possible outcome | Assigned number |
| --- | --- |
| Heads | 1 |
| Tails | 0 |

Experiment: Flip the coin.

After the experiment: Landed heads → Slot is filled with 1

New experiment, new slot.

Before: Fresh empty slot.

Experiment: Flip again.

After: Landed tails → Slot is filled with 0

Key point

A random variable is not a fixed number. It can be different every time you run the experiment. That is why it is called a variable.
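The slot idea maps directly to code. A minimal sketch — `coin_toss` below is an illustrative helper, not from the article:

```python
import random

def coin_toss():
    """One experiment: fill the slot with 1 (heads) or 0 (tails)."""
    return 1 if random.random() < 0.5 else 0

# Each call is a new experiment with a fresh slot.
x = coin_toss()
y = coin_toss()
print(x, y)  # each is 0 or 1, and they may differ — that is why it is a *variable*
```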

3. Discrete vs Continuous

Not all random variables behave the same way. There are two types.

Discrete Random Variable

The slot can only be filled with countable, specific values. No values exist in between.

Examples:

  • Dice roll → only 1, 2, 3, 4, 5, 6. There is no 2.5 on a die.

  • Number of goals in a football match → 0, 1, 2, 3... you cannot score 1.7 goals.

  • Number of students in a classroom → always a whole number.

Continuous Random Variable

The slot can be filled with any value in a range. There are always more precise values possible.

Examples:

  • Weight of a mango → 182g, 182.1g, 182.13g, 182.137g... it never stops.

  • Height of a person → 170.0cm, 170.01cm, 170.001cm...

  • Time taken to run 100 meters → infinite precision possible.

Quick test

| Question | Answer |
| --- | --- |
| Number of eggs in a basket | Discrete |
| Exact temperature of water | Continuous |
| Number of SMS messages sent today | Discrete |
| Height of a building | Continuous |
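The two types correspond to two different ways of drawing a random value. A sketch using Python's standard `random` module — the weight range for the mango is made up for illustration:

```python
import random

random.seed(1)

# Discrete: only the six faces are possible — nothing in between.
die_roll = random.randint(1, 6)

# Continuous: any value inside the range is possible.
mango_weight = random.uniform(150.0, 250.0)  # grams; range is illustrative

print(die_roll)      # always a whole number from 1 to 6
print(mango_weight)  # e.g. 182.137... — arbitrary precision within the range
```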

4. Real World Examples — Coin, Dice, LLM

Now let us walk through three experiments using the same structure:

  • Start point

  • What was done as the experiment

  • What filled the slot at the end

Experiment 1 — Coin Toss

Start point: Slot is empty. Two possible values — 1 or 0.

Experiment: Flip the coin.

Result: Landed heads → Slot filled with 1

Type: Discrete — only two possible values.

Experiment 2 — Dice Roll

Start point: Slot is empty. Six possible values — 1, 2, 3, 4, 5, 6.

Experiment: Roll the dice.

Result: Landed on 4 → Slot filled with 4

Type: Discrete — six countable values.

Experiment 3 — LLM picks the next word

You type: "The sky is"

Start point: Slot is empty. The LLM has a fixed list of words called a vocabulary — roughly 50,000 words. Always the same list.

Experiment: LLM runs an internal calculation. It assigns a probability to every single word in the vocabulary.

Example result of that calculation:

| Possible next word | Probability |
| --- | --- |
| blue | 0.60 |
| clear | 0.25 |
| dark | 0.10 |
| falling | 0.05 |
| ... 49,996 more words | very small values |

All probabilities add up to 1 — same rule as always.

Result: One word is picked based on these probabilities → Slot filled with "blue"

Type: Discrete — vocabulary is a fixed, countable list.
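The picking step can be sketched with a toy vocabulary — the four words and probabilities below come from the table above, standing in for the ~50,000 entries a real LLM would have:

```python
import random

random.seed(7)

# Toy vocabulary with the probabilities from the table above.
words = ["blue", "clear", "dark", "falling"]
probs = [0.60, 0.25, 0.10, 0.05]

# One experiment: pick a word according to its probability.
next_word = random.choices(words, weights=probs, k=1)[0]
print(next_word)  # usually "blue", but occasionally one of the others
```

Run it many times and "blue" shows up about 60% of the time — exactly the slot-filling experiment, just with words instead of coin faces.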

The pattern across all three

| | Coin | Dice | LLM |
| --- | --- | --- | --- |
| Slot before experiment | Empty | Empty | Empty |
| Possible outcomes | 2 | 6 | 50,000 words |
| Experiment | Flip | Roll | Internal calculation |
| Slot after experiment | 0 or 1 | 1 to 6 | One word |
| Type | Discrete | Discrete | Discrete |

5. How LLM Uses Random Variables

Why does ChatGPT give different answers every time?

Even the word "falling" has probability 0.05 — not zero. So occasionally it gets picked.

The slot can be filled by any outcome. Just some are far more likely than others.

Same prompt → same probabilities → but picking is random → different word each time.

The temperature setting

There is a setting in LLMs called temperature that controls how random the picking is.

| Temperature | Effect |
| --- | --- |
| Low (near 0) | Almost always picks the highest-probability word — predictable, repetitive |
| High (1.0+) | Picks lower-probability words more often — creative, unpredictable |

This is the same as controlling how random your experiment is.
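One common way temperature is implemented is to rescale the distribution before picking — a minimal sketch, not the exact code of any particular LLM:

```python
import math

def apply_temperature(probs, temperature):
    """Rescale a probability distribution; lower temperature -> more peaked."""
    logits = [math.log(p) for p in probs]          # back to log space
    scaled = [l / temperature for l in logits]     # the temperature step
    exps = [math.exp(s) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]               # renormalize to sum to 1

probs = [0.60, 0.25, 0.10, 0.05]
print(apply_temperature(probs, 0.5))  # top word dominates even more
print(apply_temperature(probs, 2.0))  # distribution flattens out
```

Low temperature pushes "blue" toward certainty; high temperature gives "falling" a real chance — which is why the same prompt can produce different answers.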

Conclusion

You started with a simple coin toss and ended with understanding how a large language model generates text. The same idea runs through all of it.

Three things to remember:

| Concept | Simple definition |
| --- | --- |
| Random Variable | An empty slot filled with a number after an experiment |
| Discrete | Slot can only take countable, specific values |
| Continuous | Slot can take any value in a range — infinite precision |

Every time an LLM generates a word, it is filling a random variable slot — from a vocabulary of 50,000 words, each with a probability, picked by a calculation.

That is the entire connection — from coin toss to LLM.
