Parameters are the internal settings that AI models use to make predictions. They're like the knobs and dials that control how the model behaves.
Parameters are numbers stored in the model that determine how it processes information. During training, these numbers are adjusted so the model learns patterns from data.
**Think of it like**: the weights in a neural network, which determine how strongly different connections influence the output.
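To make this concrete, here is a toy sketch (not a real language model): a single "neuron" whose two weights are its only parameters. The function names `predict` and `train_step` are illustrative, not from any library. Training nudges the weights to reduce error; inference just applies them.

```python
def predict(weights, inputs):
    # Inference: the fixed weights determine the output.
    return sum(w * x for w, x in zip(weights, inputs))

def train_step(weights, inputs, target, lr=0.1):
    # Training: nudge each weight to reduce the prediction error
    # (one step of gradient descent on squared error).
    error = predict(weights, inputs) - target
    return [w - lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]  # the model's parameters, untrained
for _ in range(50):
    weights = train_step(weights, [1.0, 2.0], target=3.0)

print(predict(weights, [1.0, 2.0]))  # close to 3.0 after training
```

A real model works the same way in principle, just with billions of weights instead of two.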
Models are often described by their parameter count: a "7B" model has about 7 billion parameters, a "70B" model about 70 billion.
**During training**: parameters are adjusted to minimize prediction errors.

**During inference**: parameters are fixed; they determine how the model responds to each input.

The model uses these fixed parameters to turn your input into a prediction, applying the patterns it learned during training.

**More parameters** generally means:

- **Capability**: more parameters can store more knowledge and patterns
- **Cost**: larger models cost more to train and run
- **Speed**: larger models are slower to respond
- **Requirements**: larger models need more powerful hardware
For a sense of scale:

- **GPT-3.5**: ~175 billion parameters
- **GPT-4**: estimated in the trillions (exact number not disclosed)
- **Claude 3**: estimated in the hundreds of billions
- **Smaller models**: 7B, 13B, or 70B parameters
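Parameter count translates directly into hardware requirements. As a rough back-of-envelope estimate (assuming 2 bytes per parameter, as with common 16-bit weights, and ignoring activation and overhead memory):

```python
# Rough rule of thumb: memory to hold a model's weights is about
# parameter count x bytes per parameter (2 bytes for 16-bit weights).

def weight_memory_gb(num_params, bytes_per_param=2):
    return num_params * bytes_per_param / 1e9

for name, params in [("7B", 7e9), ("13B", 13e9), ("70B", 70e9)]:
    print(f"{name}: ~{weight_memory_gb(params):.0f} GB just for the weights")
```

This is why a 7B model can run on a single consumer GPU while a 70B model typically cannot.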
- **Parameters**: internal model settings (fixed after training)
- **Tokens**: units of text processed (varies per request)
Don't confuse these! Parameters are about the model's size, tokens are about how much text you're processing.
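The distinction is easy to see side by side. In this sketch, a tiny weight matrix stands in for a model, and whitespace splitting is a crude stand-in for a real tokenizer:

```python
# A tiny stand-in "model": a 4x3 weight matrix -> 12 parameters, fixed.
model_weights = [[0.1] * 3 for _ in range(4)]
num_parameters = sum(len(row) for row in model_weights)

# Tokens depend on the input you send, not on the model's size.
request = "How many parameters does this model have?"
num_tokens = len(request.split())

print(num_parameters)  # 12 -- the same for every request
print(num_tokens)      # 7  -- changes with the text you send
```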
Most users don't need to worry about parameters directly. What matters is what the model can actually do, how fast it responds, and what it costs. When choosing a model, weigh those practical tradeoffs rather than raw parameter count: a smaller model that handles your task well can beat a larger, slower, pricier one.
Parameters are the technical foundation of how AI models work, but for most practical purposes, you care more about what the model can do than how many parameters it has.