🤖 Desktop AI Assistant with Local Model Support - OpenAI SDK compatible agent with memory system, file operations, web search, and modern UI. Supports vLLM, Ollama, Qwen, Llama, and more.
✅ Request Timeouts — 5-minute timeout with auto-retry for LLM requests
✅ Session Logging — full request/response JSON logs per iteration in ~/.localdesk/logs/sessions/
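As a rough illustration of the OpenAI-SDK compatibility and the timeout/retry behavior listed above, here is a minimal sketch using the official `openai` Node package against a local backend; the base URL, model name, and API key are placeholders, not LocalDesk's actual configuration:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI SDK at a local OpenAI-compatible server.
// The URL and model below are illustrative (Ollama's default /v1 endpoint).
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "local",            // local backends typically ignore the key
  timeout: 5 * 60 * 1000,     // 5-minute request timeout
  maxRetries: 2,              // automatic retries on transient failures
});

async function ask(prompt: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "qwen2.5:14b",     // any model served by the local backend
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0].message.content ?? "";
}
```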
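The per-iteration session logs could be produced with something as simple as the following; the file naming (`<sessionId>.jsonl`) and entry shape here are assumptions for illustration, not the exact format LocalDesk writes:

```typescript
import { mkdirSync, appendFileSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

const sessionDir = join(homedir(), ".localdesk", "logs", "sessions");
mkdirSync(sessionDir, { recursive: true });

// Append one JSON line per agent iteration with the full request/response pair.
// File naming and entry fields are hypothetical.
function logIteration(sessionId: string, iteration: number, request: unknown, response: unknown): void {
  const entry = {
    timestamp: new Date().toISOString(),
    iteration,
    request,
    response,
  };
  appendFileSync(join(sessionDir, `${sessionId}.jsonl`), JSON.stringify(entry) + "\n");
}
```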
🤔 Why LocalDesk?
Open Architecture & Full Control
LocalDesk isn't just another AI assistant; it's a framework you own. Built with TypeScript and Electron, it keeps every component transparent and modifiable:
Readable codebase — well-structured, documented code you can understand
Easy customization — add new tools, modify prompts, change UI without black boxes
Your rules — adjust behavior, safety limits, and workflows to match your needs
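To make the "add new tools" point above concrete, a new capability can be described to the model in the standard OpenAI function-calling format and paired with a local handler. This is a generic sketch; LocalDesk's actual tool-registration interface may differ:

```typescript
import { readFile } from "node:fs/promises";

// Tool schema in the OpenAI function-calling format (illustrative example).
const wordCountTool = {
  type: "function" as const,
  function: {
    name: "count_words",
    description: "Count the words in a text file on disk",
    parameters: {
      type: "object",
      properties: {
        path: { type: "string", description: "Absolute path to the file" },
      },
      required: ["path"],
    },
  },
};

// Matching handler the agent loop would dispatch to when the model calls the tool.
async function countWords(args: { path: string }): Promise<string> {
  const text = await readFile(args.path, "utf8");
  return String(text.trim().split(/\s+/).filter(Boolean).length);
}
```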