Parameters
Module: fundamentals
What it is
Parameters are the learnable values (weights and biases) within a neural network that are adjusted during training. When you hear that GPT-4 reportedly has over a trillion parameters, or see a model named "Llama 70B," those numbers refer to how many adjustable values the model contains; the "70B" means 70 billion. More parameters generally mean more capacity to learn patterns.
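A minimal sketch in PyTorch (the framework choice is an assumption; any deep-learning library exposes the same idea) showing that a model's headline number is just the total element count of its learnable tensors:

```python
import torch.nn as nn

# A toy two-layer network; its "parameters" are the weight and bias tensors.
model = nn.Sequential(
    nn.Linear(784, 256),  # weights: 784*256 = 200,704, biases: 256
    nn.ReLU(),            # activations have no learnable values
    nn.Linear(256, 10),   # weights: 256*10 = 2,560, biases: 10
)

# The headline "parameter count" is the sum of elements across all tensors.
total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")  # 203,530 for this toy model
```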
Why it matters
Parameter count is a rough proxy for model capability: larger models tend to be more capable, but they need more memory and compute to run. A 7B (7 billion) parameter model can run on consumer hardware, while 70B+ models typically need server-class GPUs, which creates a trade-off between capability and accessibility.
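The hardware gap follows directly from arithmetic: every parameter must be held in memory at some numeric precision. A rough back-of-envelope sketch (standard bytes-per-dtype sizes; the hardware comments are approximations, not vendor specs):

```python
# Bytes needed to store one parameter at common precisions.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gb(num_params: float, dtype: str) -> float:
    """Approximate GB to hold the weights alone (no activations, no KV cache)."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e9

for num_params, name in [(7e9, "7B"), (70e9, "70B")]:
    for dtype in ("fp16", "int4"):
        print(f"{name} @ {dtype}: ~{weight_memory_gb(num_params, dtype):.1f} GB")

# 7B @ fp16:  ~14 GB  -> fits on a single high-end consumer GPU
# 70B @ fp16: ~140 GB -> needs multiple server-class GPUs
# 70B @ int4: ~35 GB  -> lower precision narrows the gap considerably
```

This arithmetic is also why quantization, which stores weights at lower precision, can bring otherwise server-only models within reach of consumer hardware.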