VRAM

Module: tool mastery

What it is

VRAM (Video Random Access Memory) is the dedicated memory on your graphics card. Running AI models locally requires loading the model's weights into VRAM, so larger models need more of it. A 7B model might need 4-8 GB; a 70B model might need 40 GB or more, depending on quantisation.
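The relationship between parameter count, quantisation, and VRAM can be sketched with simple arithmetic: weights take roughly (parameters × bits per weight ÷ 8) bytes, plus some runtime overhead. The 20% overhead factor below is an assumption for illustration; real usage varies with context length and inference runtime.

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight bytes plus ~20% overhead.

    The overhead factor is an assumption covering activations and
    the KV cache; actual usage depends on the runtime and context size.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model: ~16.8 GB at 16-bit, ~8.4 GB at 8-bit, ~4.2 GB at 4-bit,
# which is why 4-bit quantised 7B models fit consumer cards.
for bits in (16, 8, 4):
    print(f"7B at {bits}-bit: ~{estimate_vram_gb(7, bits):.1f} GB")
```

This is why quantisation matters so much in practice: dropping from 16-bit to 4-bit cuts the footprint to a quarter.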

Why it matters

VRAM is often the limiting factor for running models locally. Before downloading a large model, check whether it fits in your VRAM. Consumer GPUs typically have 8-24 GB; professional and datacenter cards have more. Understanding VRAM requirements tells you which models you can actually run.
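The fit check described above can be sketched as a one-line comparison. The reserve figure is an assumption: it leaves headroom for the OS/display and the KV cache, which also live in VRAM.

```python
def fits_in_vram(model_gb: float, gpu_vram_gb: float,
                 reserve_gb: float = 1.5) -> bool:
    """Check whether an estimated model footprint fits on a GPU,
    keeping reserve_gb free (a rough assumption for OS + KV cache)."""
    return model_gb + reserve_gb <= gpu_vram_gb

# A 4-bit 7B model (~4-5 GB) fits an 8 GB consumer card;
# a ~40 GB 70B model does not fit a 24 GB card.
print(fits_in_vram(4.5, 8))    # True
print(fits_in_vram(40.0, 24))  # False
```

On Linux, `nvidia-smi` reports each card's total and used VRAM, so you can plug real numbers into a check like this.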