Full-stack AI chatbot with persistent conversations, multi-agent routing, and request-level debugging via Inspector. Built for applied AI systems requiring observability, prompt routing, and post-hoc analysis.
Live demo: chatbot-with-memory-ochre.vercel.app
- Multi-Agent Routing — Prompt routing per conversation by role (Sales/Support/Engineering/Executive)
- Commerce Agent (MCP Tools) — Shopify product search + Stripe Checkout (test mode) via MCP, with tool trace captured per message in Inspector. MCP server: https://mcp-partner-integration-demo.vercel.app
- Persistent Memory — Postgres-backed conversations and message history
- Inspector Drawer — Request ID, latency, model, and token usage per response
- Structured Logging — Request tracing with `request_id` correlation to Vercel logs
- Markdown Rendering — GFM support with HTML sanitization for assistant messages
Main interface — multi-agent chat with persistent history and one-click Inspector.
Tip: Cmd/Ctrl+K opens the command palette
- Multi-Agent Routing → Prompt routing per org role enables specialized responses (sales discovery, support troubleshooting, engineering tradeoffs)
- Inspector → Request tracing via `request_id` enables production debugging and latency analysis
- Persisted Metadata → Post-hoc analysis of token usage, response times, and model performance over time
- Select "Engineering" agent → Click "New Chat"
- Send message: "Explain tradeoffs of SSE vs WebSockets"
- Click ⓘ icon on assistant response → Inspector opens
- View metadata: duration, request ID, model, token usage
- Refresh page → conversation and metadata persist
- Press Cmd/Ctrl+K → command palette opens
```mermaid
flowchart LR
    U[User / Browser] -->|HTTPS| UI
    subgraph UI["Next.js App (Vercel)"]
        direction LR
        C["Client UI<br/>(Chat + Sidebar + Inspector)"] --> API["App Router API Routes<br/>(Node.js Serverless)"]
        MW["Middleware<br/>Request ID"] --> API
        API --> LOGS["Structured Logs<br/>(Vercel Logs)"]
    end
    API --> DB[("Neon Postgres<br/>Drizzle ORM<br/>(messages + meta)")]
    API --> LLM["OpenAI API"]
    subgraph MCPFLOW["Tool flow (MCP)"]
        direction LR
        API --> MCPCLIENT["MCP Client<br/>(JSON-RPC over HTTP/SSE)"]
        MCPCLIENT --> MCPSERVER["MCP Server<br/>(Vercel)"]
        MCPSERVER --> SHOPIFY["Shopify API"]
        MCPSERVER --> STRIPE["Stripe Checkout<br/>(Test mode)"]
        MCPSERVER -->|tool results| MCPCLIENT
    end
    DB -->|messages + meta| C
```
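The diagram compresses the per-message round trip: middleware assigns a request ID, the API route calls the LLM, and the response plus its metadata lands in Postgres. A framework-free sketch of that flow, assuming illustrative names (`handleMessage`, `Llm`, `Db`, `buildMeta`-style fields) rather than the repo's actual API, with dependencies injected so the logic stands alone:

```typescript
// Sketch of the per-message flow implied by the diagram.
// `Llm` and `Db` are stand-in interfaces, not the repo's actual types.
import { randomUUID } from "node:crypto";

type Usage = { prompt_tokens: number; completion_tokens: number };
type Meta = { requestId: string; durationMs: number; model: string; agentId: string; usage: Usage };

interface Llm { complete(prompt: string): Promise<{ text: string; model: string; usage: Usage }> }
interface Db { saveMessage(conversationId: string, role: string, content: string, meta: Meta): Promise<void> }

async function handleMessage(
  conversationId: string,
  agentId: string,
  userText: string,
  deps: { llm: Llm; db: Db },
): Promise<{ reply: string; requestId: string }> {
  const requestId = randomUUID();            // normally assigned by middleware
  const start = Date.now();
  const res = await deps.llm.complete(userText);
  const meta: Meta = {
    requestId,
    durationMs: Date.now() - start,
    model: res.model,
    agentId,
    usage: res.usage,
  };
  // Metadata is persisted alongside the message (the messages.meta JSONB column).
  await deps.db.saveMessage(conversationId, "assistant", res.text, meta);
  return { reply: res.text, requestId };     // requestId is echoed as X-Request-ID
}
```

Injecting `llm` and `db` keeps the flow unit-testable without a live OpenAI key or database.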
Tech Stack:
- Next.js App Router (TypeScript)
- Neon Postgres
- Drizzle ORM
- OpenAI API
- Vercel (deployment)
- Request ID middleware + structured logs
```shell
cp .env.example .env
# Edit .env: DATABASE_URL, OPENAI_API_KEY (never commit .env)
npm install
npm run db:push
npm run dev
```

Troubleshooting: If `db:push` fails, verify `DATABASE_URL` uses the Neon pooler URL (`?sslmode=require`).
- Env: set `COMMERCE_ENABLED=true` and `MCP_SERVER_URL` to your MCP server URL (if it lacks `/mcp`, the client appends it).
- MCP server: https://mcp-partner-integration-demo.vercel.app
- Script:
  - Select the Commerce agent.
  - Send: `search hoodies under $80`.
  - Click: `▶ Buy item 1 (qty 1)`.
  - Complete the Stripe test payment → the redirect back shows the success banner.
  - Open Inspector → Tool Trace + `request_id`.
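The "`/mcp` appended if missing" behavior from the env note above is small but easy to get subtly wrong with trailing slashes. A minimal sketch, assuming a hypothetical helper name `normalizeMcpUrl` (the repo's actual implementation may differ):

```typescript
// Append "/mcp" to the configured MCP server URL unless it is already present.
// normalizeMcpUrl is an illustrative name, not the repo's actual function.
function normalizeMcpUrl(raw: string): string {
  const url = new URL(raw);
  const trimmed = url.pathname.replace(/\/+$/, ""); // drop trailing slashes
  if (!trimmed.endsWith("/mcp")) {
    url.pathname = trimmed + "/mcp";
  }
  return url.toString();
}
```

Using `new URL` (rather than string concatenation) also validates the configured value early, so a malformed `MCP_SERVER_URL` fails at startup instead of on the first tool call.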
- Push to GitHub and import in Vercel
- Set environment variables: `DATABASE_URL`, `OPENAI_API_KEY`
- Deploy (API routes use `runtime = "nodejs"`)
- Run migrations:

```shell
vercel env pull .env.production.local --environment=production
export $(grep "^DATABASE_URL=" .env.production.local | xargs)
npm run db:push
```
- Request ID: Generated by middleware, included in the `X-Request-ID` response header
- Inspector: View `request_id`, `durationMs`, `model`, and `usage` per assistant message
- Vercel Logs: Search by `request_id` in Dashboard → Project → Logs
- Persisted Metadata: Stored in the `messages.meta` JSONB column (`durationMs`, `requestId`, `agentId`, `model`, `usage`)
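The middleware contract above (assign a `request_id`, echo it on the response) can be sketched framework-free. This is an illustrative helper, not the repo's actual middleware; in the real app the same logic would live in Next.js middleware:

```typescript
// Reuse an incoming X-Request-ID if a caller supplied one, otherwise mint a
// UUID, and echo it on the response so logs and the Inspector can correlate.
// ensureRequestId is an assumed name for illustration.
import { randomUUID } from "node:crypto";

function ensureRequestId(incoming: Headers): { requestId: string; responseHeaders: Headers } {
  const requestId = incoming.get("x-request-id") ?? randomUUID();
  const responseHeaders = new Headers();
  responseHeaders.set("X-Request-ID", requestId); // echoed on every response
  return { requestId, responseHeaders };
}
```

Honoring a caller-supplied ID lets an upstream client (or a retry loop) trace one logical request across multiple hops in Vercel Logs.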
- `GET /api/health` — Health check with DB connectivity
- `POST /api/conversations` — Create conversation (optional `agent_id`)
- `GET /api/conversations` — List all conversations
- `GET /api/conversations/:id` — Get conversation with messages
- `POST /api/conversations/:id/messages` — Send message + get AI response
- `DELETE /api/conversations/:id` — Delete conversation
- `DELETE /api/conversations` — Clear all conversations
All responses include the `X-Request-ID` header.
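A small client-side sketch of how these endpoints might be addressed and how the `X-Request-ID` header can be captured for correlation. The `apiPaths` and `callApi` names are assumptions for illustration, not the repo's actual client code:

```typescript
// Path builders matching the endpoint list above (ids are URL-encoded).
const apiPaths = {
  health: () => "/api/health",
  conversations: () => "/api/conversations",
  conversation: (id: string) => `/api/conversations/${encodeURIComponent(id)}`,
  messages: (id: string) => `/api/conversations/${encodeURIComponent(id)}/messages`,
};

// Thin fetch wrapper that surfaces the X-Request-ID for log correlation.
async function callApi(path: string, init?: RequestInit): Promise<Response> {
  const res = await fetch(path, init);
  console.log("request_id:", res.headers.get("X-Request-ID")); // search this in Vercel Logs
  return res;
}
```

Logging the header client-side means a bug report can carry the exact `request_id` to search for in Dashboard → Project → Logs.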
- `.env` files are gitignored
- Run `npm run verify:secrets` to scan for accidental secret commits
- Never commit API keys or database URLs
```shell
npm run typecheck   # TypeScript validation
npm run lint        # ESLint
npm run build       # Production build
npm run smoke       # End-to-end smoke tests
```

MIT — see LICENSE.



