
deep-research

by u14app


Use any LLM (Large Language Model) for deep research. Supports an SSE API and an MCP server.
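Since results are delivered over an SSE API, a client consumes a stream of standard server-sent-event frames. A minimal sketch of parsing such frames follows; the `progress` event name and payload here are hypothetical illustrations, not this project's actual API.

```javascript
// Parse a chunk of server-sent-events text into {event, data} objects.
// Frames are separated by blank lines; fields are "event:" and "data:".
function parseSSE(chunk) {
  const events = [];
  for (const frame of chunk.split("\n\n")) {
    const event = { event: "message", data: "" };
    for (const line of frame.split("\n")) {
      if (line.startsWith("event:")) {
        event.event = line.slice(6).trim();
      } else if (line.startsWith("data:")) {
        // Per the SSE format, multiple data lines are joined with newlines.
        event.data += (event.data ? "\n" : "") + line.slice(5).trim();
      }
    }
    if (event.data) events.push(event);
  }
  return events;
}

// Hypothetical frame as a research stream might deliver it:
const chunk = 'event: progress\ndata: {"step":"search"}\n\n';
console.log(parseSSE(chunk));
```

In a browser, the built-in `EventSource` API performs this parsing automatically; the sketch above is only useful for non-browser clients reading the raw stream.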

4,372 stars
1,055 forks
JavaScript
Added 12/27/2025
MCP Servers · anthropic · deep-research · deep-research-api · deepresearch · deepseek · gemini · grok · mcp-server · ollama · openai
Installation
```shell
# Add to your Claude Code skills
git clone https://github.com/u14app/deep-research
```
README.md
<div align="center"> <h1>Deep Research</h1>

GitHub deployments GitHub Release Docker Image Size Docker Pulls License: MIT

Gemini Next Tailwind CSS shadcn/ui

Vercel Cloudflare PWA


</div>

Lightning-Fast Deep Research Report

Deep Research uses a variety of powerful AI models to generate in-depth research reports in just a few minutes. It leverages advanced "Thinking" and "Task" models, combined with internet access, to deliver fast, insightful analysis on virtually any topic. Your privacy is paramount: all data is processed and stored locally.

✨ Features

  • Rapid Deep Research: Generates comprehensive research reports in about 2 minutes, significantly accelerating your research process.
  • Multi-platform Support: Supports rapid deployment to Vercel, Cloudflare and other platforms.
  • Powered by AI: Utilizes advanced AI models for accurate and insightful analysis.
  • Privacy-Focused: Your data remains private and secure, as all data is stored locally in your browser.
  • Multi-LLM Support: Supports a variety of mainstream large language models, including Gemini, OpenAI, Anthropic, DeepSeek, Grok, Mistral, Azure OpenAI, OpenRouter, Ollama, and any OpenAI-compatible LLM.
  • Web Search Support: Supports search engines such as SearXNG, Tavily, Firecrawl, Exa, Bocha, and Brave, letting LLMs without built-in search use the web search function conveniently.
  • Thinking & Task Models: Employs sophisticated "Thinking" and "Task" models to balance depth and speed, ensuring high-qu...