MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
# Add to your Claude Code skills
git clone https://github.com/PleasePrompto/google-ai-mode-mcp

For: All MCP-compatible LLMs (Claude, Cursor, Cline, Windsurf, Zed, etc.)
Transform your LLM's online research capabilities by connecting it directly to Google's AI Mode—getting AI-synthesized answers from 100+ sources instead of scattered search results.
Most built-in web research is mediocre. This MCP server gives any LLM professional-grade research by tapping into Google's AI Mode—the same technology that synthesizes information from dozens of websites into one cited answer.
Example Use Cases:
"Next.js 15 App Router best practices 2026 with server components examples"
→ AI-synthesized coding guide with inline citations [1][2][3]
"Compare PostgreSQL vs MySQL JSON performance 2026, include benchmarks"
→ Technical comparison table with real-world data
"Find the latest EU AI regulations 2026 and their impact on startups"
→ Legal overview with official government sources
"Best noise-cancelling headphones under €300, compare Sony vs Bose"
→ Product comparison with reviews and specs
"Intermittent fasting protocols 2026, include recent scientific studies"
→ Health guide with medical research citations
Result: Research on ANY topic—coding, tech comparisons, regulations, product reviews, health, finance, travel. Curated answers with sources. Saves tokens. Superior to generic web search.
v2.0 - Multi-Language & Detection Overhaul
✅ 4-Stage Completion Detection - SVG thumbs-up → aria-label → text → 40s timeout
✅ Multi-Language Support - Works in DE/EN/NL/ES/FR/IT browser locales
✅ 87% Faster - Average 4s detection (was 2s fixed wait)
✅ AI Mode Availability Check - Detects region restrictions with proxy suggestion
✅ 17 Citation Selectors - Language-agnostic fallback chain
✅ 15 Cutoff Markers - Cleaner content extraction across languages
v1.5 - Persistent browser context, CAPTCHA handling improvements
v1.0 - Initial MCP server release
An MCP server that connects your code agent (Claude, Cursor, Cline, etc.) to Google AI Mode—Google's AI-powered search that synthesizes information from dozens of web sources into a single, cited answer.
Instead of your agent reading page after page, Google does the heavy lifting. Your agent gets one clean, structured response with inline citations.
The advantage: Free, token-efficient research with grounded sources.
Your agent asks a question
↓
Server launches stealth browser
↓
Google AI Mode searches & synthesizes dozens of sources
↓
Server extracts AI answer + citations
↓
Converts to clean Markdown with [1][2][3] references
↓
Your agent receives final answer (optionally saved to .md file)
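The "converts to clean Markdown" step in the flow above can be sketched as follows. This is a minimal illustration, not the server's actual internals; the function name and source shape are assumptions modeled on the tool's documented return format:

```python
def to_markdown(answer: str, sources: list[dict]) -> str:
    """Render an extracted AI answer plus its sources as Markdown.

    `answer` already contains inline [1][2] markers from extraction;
    we append a numbered source list so those references resolve.
    """
    lines = [answer, "", "Sources:"]
    for i, src in enumerate(sources, start=1):
        lines.append(f"[{i}] [{src['title']}]({src['url']})")
    return "\n".join(lines)

md = to_markdown(
    "React Server Components render on the server [1][2].",
    [
        {"title": "React Docs", "url": "https://react.dev"},
        {"title": "Next.js Docs", "url": "https://nextjs.org/docs"},
    ],
)
```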
The key difference:
Traditional web research:
With this server:
Google AI Mode (the udm=50 parameter) makes Google Search work like a research assistant: it runs your query, reads dozens of sources, and synthesizes them into a single cited answer. Your agent gets the benefits without doing the work or burning the tokens.
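Concretely, AI Mode is reached by adding udm=50 to an ordinary Google search URL. A minimal sketch of the URL construction (the helper name is illustrative, not part of the server's API):

```python
from urllib.parse import urlencode

def ai_mode_url(query: str) -> str:
    # udm=50 switches Google Search into AI Mode
    return "https://www.google.com/search?" + urlencode({"udm": "50", "q": query})

url = ai_mode_url("Next.js 15 App Router best practices")
```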
Works with any MCP-compatible code agent. Choose your setup:
Claude Code:
claude mcp add google-ai-search npx google-ai-mode-mcp@latest
Codex:
codex mcp add google-ai-search -- npx google-ai-mode-mcp@latest
Linux/WSL users on Codex: If you get a "Missing X-Server" error when trying to show the browser for CAPTCHA solving, use xvfb-run:
{
"mcpServers": {
"google-ai-search": {
"command": "xvfb-run",
"args": ["-a", "npx", "google-ai-mode-mcp@latest"]
}
}
}
Install xvfb if needed: sudo apt-get install xvfb
Cline:
cline mcp add google-ai-search -- npx google-ai-mode-mcp@latest
Gemini:
gemini mcp add google-ai-mode npx -y google-ai-mode-mcp@latest --scope user
VS Code:
code --add-mcp '{"name":"google-ai-search","command":"npx","args":["google-ai-mode-mcp@latest"]}'
Cursor, Windsurf, Zed, or other MCP clients:
Add to your MCP config file:
{
"mcpServers": {
"google-ai-search": {
"command": "npx",
"args": ["google-ai-mode-mcp@latest"]
}
}
}
Cursor uses ~/.cursor/mcp.json; Windsurf and Zed have their own settings files. Check your agent's documentation for the config location.
Ask your agent naturally:
"Search Google AI Mode for: Next.js 15 App Router best practices"
"What are the new features in Astro 4.0?"
"Research React Server Components and save the results"
The agent will automatically use the MCP server to query Google AI Mode and return a clean, cited answer.
To save results to a file:
"Search for TypeScript 5.4 features and save it"
Files are saved to platform-specific locations:
Linux: ~/.local/share/google-ai-mode-mcp/results/
macOS: ~/Library/Application Support/google-ai-mode-mcp/results/
Windows: %LOCALAPPDATA%\google-ai-mode-mcp\results\
Filenames: 2026-01-04_15-30-45_typescript_5_4.md
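The timestamped filename pattern shown above could be generated along these lines. This is a sketch; the exact slug rules are assumptions inferred from the example filename, not the server's code:

```python
import re
from datetime import datetime

def result_filename(query: str, now: datetime) -> str:
    # Produce names like 2026-01-04_15-30-45_typescript_5_4.md:
    # lowercase the query, collapse non-alphanumeric runs to "_"
    slug = re.sub(r"[^a-z0-9]+", "_", query.lower()).strip("_")
    return f"{now:%Y-%m-%d_%H-%M-%S}_{slug}.md"

name = result_filename("TypeScript 5.4", datetime(2026, 1, 4, 15, 30, 45))
```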
On your first query, Google may show a CAPTCHA to verify you're human. This is normal when the browser profile is created.
If you see a CAPTCHA error:
After the first CAPTCHA, searches typically run smoothly. The server uses stealth techniques and a persistent browser profile to minimize future CAPTCHAs.
Repeated CAPTCHAs:
If Google keeps showing CAPTCHAs:
Browser won't launch:
Clear the browser profile:
# Linux/macOS
rm -rf ~/.local/share/google-ai-mode-mcp/chrome_profile
# Windows
rmdir /s "%LOCALAPPDATA%\google-ai-mode-mcp\chrome_profile"
Wrong language results:
The server forces English results. If you still get wrong languages, clear the profile (see above).
Missing citations:
Update to the latest version:
npm update -g google-ai-mode-mcp
The server works out of the box. Advanced users can customize via environment variables:
# Browser settings
export GOOGLE_AI_HEADLESS=true # Run browser invisibly
export GOOGLE_AI_STEALTH_ENABLED=true # Use anti-detection techniques
# Timeouts
export GOOGLE_AI_RESPONSE_TIMEOUT=30000 # 30 seconds to get AI response
export GOOGLE_AI_CAPTCHA_TIMEOUT=300000 # 5 minutes to solve CAPTCHA
# CAPTCHA handling
export GOOGLE_AI_CAPTCHA_POLL_INTERVAL=3000 # Check every 3 seconds
export GOOGLE_AI_CAPTCHA_MAX_CONSECUTIVE=3 # Restart after 3 CAPTCHAs
export GOOGLE_AI_CAPTCHA_COOLDOWN_MS=30000 # 30 second cooldown
See .env.example for all options.
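A process reading these variables would typically fall back to the documented defaults when they are unset. A hedged sketch of that pattern (illustrative only, not the server's actual configuration code):

```python
import os

def env_int(name: str, default: int) -> int:
    # Read a numeric setting, falling back to the documented default
    return int(os.environ.get(name, default))

response_timeout = env_int("GOOGLE_AI_RESPONSE_TIMEOUT", 30000)   # 30s
captcha_timeout = env_int("GOOGLE_AI_CAPTCHA_TIMEOUT", 300000)    # 5min
headless = os.environ.get("GOOGLE_AI_HEADLESS", "true").lower() == "true"
```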
The server exposes one tool: search_ai
Parameters:
query (required) - Your search question
headless (optional) - Run browser invisibly (default: true)
timeout_ms (optional) - Request timeout in milliseconds (default: 120000)
save_to_file (optional) - Save result to .md file (default: false)
filename (optional) - Custom filename without .md extension

Returns:
{
"success": true,
"markdown": "# AI response with citations [1][2]\n\nSources:\n[1] [Title](url)",
"sources": [
{ "title": "Source Title", "url": "https://example.com", "domain": "example.com" }
],
"query": "Your query",
"savedTo": "/path/to/results/file.md"
}
Usage examples:
Basic search:
{ "query": "Rust async patterns 2026" }
Save to file:
{ "query": "Next.js 15 features", "save_to_file": true, "filename": "nextjs-15-guide" }
Visible browser (for CAPTCHA or debugging):
{ "query": "PostgreSQL optimization", "headless": false }
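A result in the JSON shape shown under "Returns" can be consumed on the client side like so (a sketch; the variable names are illustrative):

```python
import json

# Example payload in the tool's documented return shape
raw = """{
  "success": true,
  "markdown": "# AI response with citations [1][2]\\n\\nSources:\\n[1] [Title](url)",
  "sources": [
    {"title": "Source Title", "url": "https://example.com", "domain": "example.com"}
  ],
  "query": "Your query"
}"""

result = json.loads(raw)
domains = []
if result["success"]:
    # Collect the cited domains for a quick source overview
    domains = [s["domain"] for s in result["sources"]]
```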
You need to implement OAuth2 in a framework you've never used before.
Traditional approach: