Local & Self-Hosted Models
You can run OpenClaw against Ollama, LM Studio, or any OpenAI-compatible endpoint. Local models trade some convenience for privacy, cost control, and deployment flexibility.
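"OpenAI-compatible" here means the server accepts the same `POST /v1/chat/completions` request shape as OpenAI's API; Ollama serves it on port 11434 by default and LM Studio on port 1234. A minimal sketch of such a request using only the Python standard library (the base URL and model name are assumptions about your local setup, not OpenClaw defaults):

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the minimal chat-completion body an OpenAI-compatible server expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str,
         base_url: str = "http://localhost:11434/v1",  # Ollama's default port
         model: str = "llama3") -> str:
    """Send one chat-completion request and return the assistant's reply."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply at choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

Pointing the same function at LM Studio is just a matter of swapping the base URL, which is the whole appeal of the shared API shape.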
Core Capabilities
- Ollama and LM Studio are supported out of the box.
- Any OpenAI-compatible endpoint can be used as a provider.
- Fallback chains retry a failed request against backup models, improving reliability.
- Per-skill model routing sends cheap tasks to cheap models, helping control cost.
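The last two capabilities can be sketched together: a fallback chain is an ordered loop over providers that returns the first successful completion, and per-skill routing is a mapping from skill name to such a chain. Everything below (the provider functions, the `ROUTES` table) is a hypothetical illustration, not OpenClaw's actual configuration:

```python
from typing import Callable

# A provider takes a prompt and returns a completion, or raises on failure.
Provider = Callable[[str], str]

def complete_with_fallback(prompt: str, chain: list[Provider]) -> str:
    """Try each provider in order; return the first successful completion."""
    errors: list[Exception] = []
    for provider in chain:
        try:
            return provider(prompt)
        except Exception as exc:  # a real client would narrow this to network errors
            errors.append(exc)
    raise RuntimeError(f"all {len(chain)} providers failed: {errors}")

# Hypothetical providers: a local model that is offline and a cloud backup.
def local_model(prompt: str) -> str:
    raise TimeoutError("local server unreachable")

def cloud_model(prompt: str) -> str:
    return f"cloud answer to: {prompt}"

# Per-skill routing: cheap skills try the local model first; hard ones go
# straight to the larger model, with the local one as backup.
ROUTES: dict[str, list[Provider]] = {
    "summarize": [local_model, cloud_model],
    "code-review": [cloud_model, local_model],
}

def run_skill(skill: str, prompt: str) -> str:
    return complete_with_fallback(prompt, ROUTES[skill])

print(run_skill("summarize", "hello"))  # local fails, falls back to cloud
```

The routing table is where cost control lives: each skill names its own preference order, so a cheap local model handles the bulk of requests and the expensive one is only reached on failure or for skills that need it.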