
Ollama
Integrate Ollama's local LLM models for secure, on-premise AI and data control with MCP-compatible apps. Deploy custom models while keeping data on your own infrastructure.
Installation
Paste the snippet below into your client's config.
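The page does not include the actual snippet, so the entry below is an illustrative sketch of a typical MCP STDIO server configuration. The launch command and package name (`npx -y ollama-mcp`) are assumptions; check this server's README for the real invocation. `OLLAMA_HOST` points at Ollama's default local endpoint.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```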
Quick info
- Category: AI & Machine Learning
- Transport: STDIO
- License: AGPL-3.0
- GitHub stars: 154
- Views: 2,179
Compatible with
- Claude Code
- Claude Desktop
- codex
- Cursor
- gemini-cli
- VS Code
- Windsurf
Found in: mcp.directory
Overview
Integrates Ollama's local LLM models with MCP-compatible applications, enabling on-premise AI processing and custom model deployment while maintaining data control.
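Under the hood, a bridge like this talks to Ollama's local REST API, so no data leaves the machine. A minimal sketch of building a request for the `/api/generate` endpoint; the model name `llama3` is an assumption, and the commented-out send step requires a running Ollama daemon:

```python
import json
from urllib import request

# Ollama's default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

req = build_generate_request("llama3", "Why is the sky blue?")
# with request.urlopen(req) as resp:  # requires a local Ollama daemon
#     print(json.loads(resp.read())["response"])
```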
Related servers
- Tripo 3D (Official) by vast-ai-research · 180 · Claude Code, Claude Desktop, codex
- arXiv MCP Server by blazickjp · 2,608 · Claude Desktop, Cursor, VS Code
- Createve.AI Nexus by spgoodman · 80,527 · Claude Code, Claude Desktop, codex
- Unity by ivanmurzak · 2,393 · Claude Desktop, Cursor, VS Code
- Penpot by montevive · 228 · Claude Code, Claude Desktop, codex
- Knowledge Graph Memory (Official) by anthropic · 84,672 · Claude Code, Claude Desktop, codex