
Install LM Studio

Download LM Studio from lmstudio.ai. It's available for macOS, Windows, and Linux. The installer is self-contained; no additional dependencies are required.

Browse and Download Models

Use the built-in model browser to search for models. LM Studio shows estimated VRAM requirements and recommends a quantization suited to your hardware. Click "Download" on any model to get started.

Chat with a Model

Select a downloaded model from the sidebar, then start chatting. LM Studio provides a clean chat interface with system prompt configuration, temperature controls, and context length settings.

Start the Local Server

Click "Local Server" in the left sidebar, then "Start Server". This exposes an OpenAI-compatible API at http://localhost:1234/v1. You can use this with any tool that supports the OpenAI API — including OpenClaw, Cursor, and Continue.

# Test the LM Studio API
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
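The same request can be made from code. Below is a minimal Python sketch using only the standard library; the `build_chat_request` and `chat` function names are illustrative, and it assumes the server is running on the default port. LM Studio generally serves whichever model is currently loaded, so the `model` field need not match an exact model name.

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server address

def build_chat_request(prompt, system=None, temperature=0.7):
    """Build the JSON payload for a /chat/completions call."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return {
        "model": "local-model",  # LM Studio routes to the loaded model
        "messages": messages,
        "temperature": temperature,
    }

def chat(prompt, system=None, temperature=0.7):
    """POST the request to the local server and return the reply text."""
    payload = build_chat_request(prompt, system, temperature)
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With a model loaded and the server started, `chat("Hello!")` returns the assistant's reply as a string; the `system` and `temperature` arguments mirror the system prompt and temperature controls in the chat interface.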

Recommended Models for LM Studio
