Local LLM Hardware Checker
Find the best local AI model for your hardware
Run AI privately on your own machine. Check your hardware, pick a model, and follow a setup guide—all in minutes.
Choose a goal
What do you want AI to do?
Check hardware
Can your GPU handle it?
Pick a model
Right model for your specs
Follow a guide
Set up and start running
Check your hardware compatibility
We detect your GPU and RAM automatically. Edit values if needed, then see which models fit.
Detected hardware
GPU
Not detected
System RAM
Unknown
CPU Cores
Unknown
Popular starter models
Most beginners start with Ollama + Llama 3.1 8B.
Llama 3.1 8B
Best all-around local model for most laptops and desktops
Gemma 3n E4B
Small multimodal upgrade with strong on-device quality
DeepSeek R1 7B
Reasoning-focused pick that runs well on budget hardware
Llama 4 Scout 17B
Higher-quality mixture-of-experts option (17B active parameters) for users with high-VRAM GPUs
AI Agent
Run OpenClaw on your hardware
Check GPU, VRAM, and RAM requirements for OpenClaw and find compatible local models for your PC or Mac.