Full GPU

Best variant: Q4_K_M

Full GPU inference — 32 GB VRAM meets the 24 GB recommendation.

GPU VRAM: 32 GB
Min VRAM (best fit): 22 GB
Recommended VRAM: 24 GB
Estimated tok/s: ~8


Every CodeLlama 34B quantization on Apple M2 Pro

Each row runs the compatibility engine against your VRAM, RAM, and the model's requirements.

| Quantization | File Size | Min VRAM | Rec VRAM | Context | Verdict | Estimated tok/s |
|---|---|---|---|---|---|---|
| Q4_K_M (best fit) | 20 GB | 22 GB | 24 GB | 4K / 16K | Full GPU | ~8 |
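The verdict in each row boils down to comparing available VRAM against the quantization's minimum and recommended thresholds. Here is a minimal sketch of that check; the function name and verdict labels are illustrative assumptions, with thresholds taken from the Q4_K_M row above.

```python
# Hypothetical sketch of the VRAM-fit check this page describes.
# The verdict labels and function name are assumptions, not the
# site's actual engine; thresholds come from the table above.

def fit_verdict(gpu_vram_gb: float, min_vram_gb: float, rec_vram_gb: float) -> str:
    """Classify how a quantization fits on a given GPU."""
    if gpu_vram_gb >= rec_vram_gb:
        return "Full GPU"          # model plus context fits comfortably
    if gpu_vram_gb >= min_vram_gb:
        return "Tight fit"         # loads, but with little context headroom
    return "Partial offload"       # some layers must spill to system RAM

# Apple M2 Pro (32 GB) vs CodeLlama 34B Q4_K_M (min 22 GB, rec 24 GB):
print(fit_verdict(32, 22, 24))  # -> Full GPU
```

With 32 GB against a 24 GB recommendation, the M2 Pro clears the top threshold, which is why every cell in the verdict column reads "Full GPU".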

Apple M2 Pro is a solid pick for CodeLlama 34B

Need a second card or a fresh build? These links help support the site at no extra cost.

- All hardware for CodeLlama 34B
- Best GPU for CodeLlama 34B
- Models that fit Apple M2 Pro
- Full model details
- Browse all models