Compatibility Check
Can I Run Mistral Small 24B on NVIDIA GeForce RTX 4090?
Yes — the NVIDIA GeForce RTX 4090 runs Mistral Small 24B fully on GPU at the Q4_K_M quantization, at an estimated ~57.6 tokens/sec.
Full GPU
Best variant: Q4_K_M
Full GPU inference — 24 GB VRAM meets the 20 GB recommendation.
- GPU VRAM: 24 GB
- Min VRAM (best fit): 16 GB
- Recommended VRAM: 20 GB
- Estimated tok/s: ~57.6
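The fit logic above can be sketched as a simple threshold check. This is an illustrative reconstruction, not the site's actual compatibility engine; the function name and the "Tight fit" middle tier are assumptions.

```python
# Illustrative sketch of the VRAM fit check (not the site's real engine):
# compare the card's VRAM against a quantization's requirements.
def verdict(gpu_vram_gb: float, min_vram_gb: float, rec_vram_gb: float) -> str:
    """Classify how a quantization fits on a given GPU."""
    if gpu_vram_gb >= rec_vram_gb:
        return "Full GPU"        # weights + context cache fit in VRAM
    if gpu_vram_gb >= min_vram_gb:
        return "Tight fit"       # loads, but little headroom for context
    return "Hybrid CPU+GPU"      # some layers spill to system RAM

print(verdict(24, 16, 20))  # RTX 4090 vs Q4_K_M -> Full GPU
print(verdict(24, 27, 32))  # RTX 4090 vs Q8_0   -> Hybrid CPU+GPU
```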
Every Mistral Small 24B quantization on NVIDIA GeForce RTX 4090
Each row runs the compatibility engine against your VRAM, RAM, and the model's requirements.
| Quantization | File Size | Min VRAM | Rec VRAM | Context | Verdict | Estimated tok/s |
|---|---|---|---|---|---|---|
| Q4_K_M (best fit) | 14 GB | 16 GB | 20 GB | 8K / 32K | Full GPU | ~57.6 |
| Q8_0 | 25 GB | 27 GB | 32 GB | 8K / 32K | Hybrid CPU+GPU | ~12 |
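The file sizes in the table follow roughly from bits per weight. A back-of-envelope estimate for a 24B-parameter GGUF model (the bpw values below are approximate averages for these quantizations, not exact figures):

```python
# Rough GGUF file-size estimate: parameters (billions) * bits per weight / 8
# gives gigabytes. Bpw values are approximate, not exact spec numbers.
def gguf_size_gb(params_b: float, bits_per_weight: float) -> float:
    return params_b * bits_per_weight / 8

print(round(gguf_size_gb(24, 4.85), 1))  # Q4_K_M: ~14.6 GB, close to the 14 GB row
print(round(gguf_size_gb(24, 8.5), 1))   # Q8_0:   ~25.5 GB, close to the 25 GB row
```

Actual file sizes vary slightly because some tensors (embeddings, output layer) are stored at higher precision.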
The NVIDIA GeForce RTX 4090 is a solid pick for Mistral Small 24B