by instructa
⚡ Stream browser logs to terminal, zero setup, perfect for AI agents
```sh
# Add to your Claude Code skills
git clone https://github.com/instructa/browser-echo
```
browser-echo makes it easy for you (and your AI coding assistant) to read client-side logs directly in the server terminal during development.
🤖 AI Coding Assistant Support - Perfect for Cursor AI, Claude Code, GitHub Copilot CLI, Gemini CLI, and other code editors that read terminal output
🚀 Framework Support - React, Vue, Nuxt 3/4, Next.js, TanStack Start, Vite-based frameworks, and custom setups
First, set up Browser Echo for your framework:
| Framework | Quick Setup |
| --- | --- |
| TanStack / Vite | Installation Guide |
| Nuxt 3/4 | Installation Guide |
| Next.js (App Router) | Installation Guide |
| Vue + Vite | Installation Guide |
| React + Vite | Installation Guide |
| Vue (non-Vite) | Installation Guide |
| React (non-Vite) | Installation Guide |
| Core | Installation Guide |
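For Vite-based setups, installation usually comes down to registering a dev-only plugin in `vite.config.ts`. The sketch below is illustrative only: the package name `@browser-echo/vite` and the `browserEcho` export are assumptions here, so follow the installation guide for your framework for the exact import and options.

```ts
// vite.config.ts — minimal sketch, assuming a `browserEcho` plugin export
// from a `@browser-echo/vite` package (check the install guide for exact names).
import { defineConfig } from 'vite';
import { browserEcho } from '@browser-echo/vite';

export default defineConfig({
  plugins: [
    // Dev-only: forwards console.* output from the browser to this terminal.
    browserEcho(),
  ],
});
```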
⚠️ IMPORTANT: You must complete step 1 (framework setup) before MCP will work. The MCP server needs your framework's dev server to forward browser logs to it.
📖 Set up Browser Echo MCP Server for AI assistant integration

Stream browser logs to your dev terminal, with optional file logging.
<details>
<summary>The MCP doesn't show any logs</summary>

Make sure you completed the framework setup (step 1) first: the MCP server only receives logs once your framework's dev server is forwarding `console.*` output from the browser.

</details>
💡 Tip: The MCP server isn't required for most use cases. AI assistants are usually smart enough to read CLI output directly, and the terminal solution is often faster and cheaper than MCP integration. The MCP was designed to avoid polluting the context window, but in most cases the terminal output is sufficient.
- Captures `console.log` / `info` / `warn` / `error` / `debug`
- Forwards logs to the dev server (`sendBeacon` when possible)
- Includes source locations `(file:line:col)` + stack traces
- No production impact

Providers enable this across frameworks by injecting a tiny client patch and exposing a dev-only HTTP endpoint.
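To make that mechanism concrete, here is a rough sketch (not browser-echo's actual code) of what such a client patch does: wrap the `console` methods, keep their normal behavior, and forward each entry to a dev-only endpoint, preferring `navigator.sendBeacon` so late logs still get delivered. The `/__browser-echo` path and the payload shape are made up for illustration.

```ts
// Illustrative only — shows the general technique, not the library's implementation.
const ENDPOINT = '/__browser-echo'; // hypothetical dev-only endpoint for this sketch

type Level = 'log' | 'info' | 'warn' | 'error' | 'debug';

function forward(level: Level, args: unknown[]): void {
  const payload = JSON.stringify({ level, args, time: Date.now() });
  if (navigator.sendBeacon) {
    // sendBeacon keeps delivering even during page unload.
    navigator.sendBeacon(ENDPOINT, payload);
  } else {
    fetch(ENDPOINT, { method: 'POST', body: payload, keepalive: true }).catch(() => {});
  }
}

for (const level of ['log', 'info', 'warn', 'error', 'debug'] as const) {
  const original = console[level].bind(console);
  console[level] = (...args: unknown[]) => {
    original(...args);    // keep the normal browser console behavior
    forward(level, args); // also stream the entry to the dev server
  };
}
```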
To strip `console.*` from prod builds, use your bundler's strip tools (e.g. a Rollup plugin) separately.
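As one example (assuming a Vite build; other bundlers have equivalent strip plugins such as `@rollup/plugin-strip`), esbuild can drop console calls from production bundles only:

```ts
// vite.config.ts — strip console/debugger calls from production builds only.
import { defineConfig } from 'vite';

export default defineConfig(({ mode }) => ({
  esbuild: mode === 'production' ? { drop: ['console', 'debugger'] } : {},
}));
```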