# gptr-mcp

by assafelovic

MCP server for enabling LLM applications to perform deep research via the MCP protocol.
While LLM apps can access web search tools via MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results that require manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than a standard search (about a 30-second wait), it delivers deeper, higher-quality results.
Demo video: https://github.com/user-attachments/assets/ef97eea5-a409-42b9-8f6d-b82ab16c52a8
## Quick Start

Want to use this with Claude Desktop right away? Here's the fastest path:
Install dependencies:

```bash
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp
pip install -r requirements.txt
```
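The server needs the two API keys shown in the Claude Desktop config below. A minimal sketch (not part of the repo) that checks they are set before launching, which gives a clearer error than a failed run:

```python
import os

# Keys the server expects, per the Claude Desktop config's "env" section.
REQUIRED_KEYS = ["OPENAI_API_KEY", "TAVILY_API_KEY"]

def missing_keys(env=os.environ):
    """Return the names of required API keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        raise SystemExit(f"Set these environment variables first: {', '.join(missing)}")
    print("All required API keys are set.")
```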
Set up your Claude Desktop config at `~/Library/Application Support/Claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "python",
      "args": ["/absolute/path/to/gpt-researcher/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key-here",
        "TAVILY_API_KEY": "your-tavily-key-here"
      }
    }
  }
}
```
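A JSON typo in this file is easy to miss until Claude Desktop silently fails to load the server. A minimal validation sketch, assuming the structure and key names shown above (the `gptr-mcp` server name should match yours):

```python
import json

def check_mcp_config(text, server_name="gptr-mcp"):
    """Parse a claude_desktop_config.json string and sanity-check the server entry.

    Returns a list of problem descriptions; an empty list means the entry
    looks consistent with the example config.
    """
    config = json.loads(text)  # raises ValueError on malformed JSON
    server = config["mcpServers"][server_name]
    problems = []
    if server.get("command") != "python":
        problems.append("command should be 'python'")
    args = server.get("args", [])
    if not args or not args[0].endswith("server.py"):
        problems.append("args[0] should be the absolute path to server.py")
    for key in ("OPENAI_API_KEY", "TAVILY_API_KEY"):
        if not server.get("env", {}).get(key):
            problems.append(f"env.{key} is missing or empty")
    return problems
```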
Restart Claude Desktop and start researching! 🎉
For detailed setup instructions, see the full Claude Desktop Integration section below.
## Available Tools

- `research_resource`: Get web resources related to a given task via research
- `deep_research`: Performs deep web research on a topic, finding the most reliable and relevant information
- `quick_search`: Performs a fast web search optimized for speed over quality, returning search results with snippets. Supports any GPTR-supported web retriever such as Tavily, Bing, Google, etc.
- `write_report`: Generate a report based on research results
- `get_research_sources`: Get the sources used in the research
- `get_research_context`: Get the full context of the research
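Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests. A minimal sketch of constructing such a message for `deep_research` (the `query` argument name is an assumption; check the tool schema the server advertises via `tools/list`):

```python
import json
from itertools import count

# JSON-RPC request ids must be unique per session; a simple counter suffices.
_ids = count(1)

def tool_call_request(tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message dict."""
    return {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Example: a deep research request (argument name "query" is assumed).
msg = tool_call_request("deep_research", {"query": "state of MCP adoption"})
print(json.dumps(msg, indent=2))
```

In practice your MCP client library (such as the official Python SDK) builds these messages for you; the sketch just shows what travels over the wire.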