Relay is a production-style, self-hostable LLM environment built on llm.rb. It is both a usable workspace and a reference implementation: a real product that shows how llm.rb can power providers, tools, MCP servers, attachments, saved contexts, and streaming conversations in one application.
Relay is meant to show what llm.rb can power in a real application:
- multi-provider conversational products
- persistent contexts and long-lived sessions
- built-in tools and MCP-backed capabilities
- streaming interfaces
- cost and context-window visibility
| llm.rb capability | How Relay uses it |
|---|---|
| `LLM::Context` | Saved conversations and long-lived chat sessions |
| Provider abstraction | Provider and model switching in the UI |
| Tools runtime | Built-in tools and local tool execution |
| MCP integration | External capability and server integration |
| Streaming/context execution | Live streamed responses over the chat UI |
| Cost and context tracking | Sidebar and status indicators for usage and context budget |
Together, these show that llm.rb can support everything from multi-provider conversational applications with persistent sessions, user-visible tools, and MCP servers through to long-context workflows and operational features like cost and context-window tracking.
Relay is a good fit if you want to:
- use a self-hosted workspace for multi-provider LLM work
- connect models to local tools and MCP servers
- keep long-lived conversations with saved contexts
- compare providers and models in one interface
- fork a real application as the base for your own llm.rb product
If you just want Relay running locally, this is the shortest path.
## Requirements
- Ruby
- Node.js
- Webpack
- SQLite
1. Install dependencies

```
bundle install
```

2. Configure secrets

Create a `.env` file:
```
OPENAI_SECRET=...
GOOGLE_SECRET=...
ANTHROPIC_SECRET=...
DEEPSEEK_SECRET=...
XAI_SECRET=...
SESSION_SECRET=...
REDIS_URL=
```

You only need provider secrets for the providers you plan to use.
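`SESSION_SECRET` can be any long random string. One common way to generate one, assuming OpenSSL is installed (any other random-string generator works just as well):

```shell
# Print a 64-character hex string suitable for SESSION_SECRET.
openssl rand -hex 32
```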
3. Set up the database

```
bundle exec rake db:setup
bundle exec rake db:seed
```

The seed creates a default local user:

- email: `0x1eef@hardenedbsd.org`
- password: `relay`

Change the seeded values in `db/seeds.rb` first if you do not want those defaults.
4. Start Relay

```
bundle exec rake dev:start
```

Then open Relay in your browser and sign in with the seeded account.
During development, Relay enables Zeitwerk reloading and refreshes
autoloaded constants between requests, so changes under app/ are
picked up without restarting the web server.
- Streaming chat over WebSockets with server-rendered updates
- Multiple provider support: OpenAI, Google, Anthropic, DeepSeek, and xAI
- Saved chat contexts with provider-aware switching and new-context creation
- Attachment support for providers that accept local files through llm.rb
- Built-in tool support plus automatic loading of custom tools from `app/tools/`
- User-managed MCP server integration from the Relay web UI
- Session-backed sign-in and per-user persistent context
- A jukebox sidebar with tool-driven playlist management
- Rack application built with Falcon, Roda, and async-websocket
- Sequel models and migrations for application state
- Sidekiq workers for background jobs
- A built-in task monitor for the local development stack: web, workers, and assets
- Session support through Roda's session plugin
- In-memory shared state via `Relay.cache`
- Automatic `.env` loading during boot
- Zeitwerk hot reloading in development
Relay is both a usable product and a reference implementation. If you want the internal layout, routing model, concerns, boot sequence, and test structure, see resources/architecture.md.
## Tools
Relay ships with built-in tools in app/tools/:
- `create_image.rb` generates images
- `relay_knowledge.rb` exposes project documentation
- `juke_box.rb` reads from the built-in playlist
- `add_song.rb` adds songs to the jukebox playlist
- `remove_song.rb` removes songs from the jukebox playlist
- `apropos.rb` searches FreeBSD man pages with `apropos`
These tools serve as examples of how to extend Relay's behavior. They show common patterns such as calling external providers, editing local application data, returning documentation-backed knowledge, invoking system commands, and rendering structured tool output in the interface.
To add your own behavior, create additional tools under app/tools/.
Relay loads registered tools automatically, so new tools become
available to the model alongside the built-in ones.
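As a rough illustration, a custom tool is a small Ruby object that names itself, describes what it does, and exposes a callable entry point. The class name, method names, and argument shape below are assumptions for the sketch; consult the built-in tools (for example `app/tools/apropos.rb`) for the exact interface Relay and llm.rb expect.

```ruby
# Hypothetical sketch of a custom tool under app/tools/.
# The interface shown here (name/description/call) is illustrative only.
class WordCount
  def name
    "word_count"
  end

  def description
    "Counts the words in a piece of text"
  end

  # Invoked with the arguments the model supplies.
  def call(text:)
    { words: text.split.size }
  end
end

tool = WordCount.new
puts tool.call(text: "hello from relay")[:words]
```

The important pattern is that the tool returns structured data (here, a hash), which the surrounding application can render in the chat interface.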
## MCP
Relay includes an MCP management page so each user can configure their own stdio MCP servers from the web UI.
This first version keeps the setup intentionally small:
- `name` and `description` for display
- `argv` for the command to launch
- optional `cwd`
- optional `env` entries as `KEY=value`
- an `enabled` flag to control whether the server is active in chat
To add a server, install the MCP binary you want to run, open the MCP management page in Relay, and save the stdio launch configuration there.
Once configured, Relay starts the MCP servers for the chat session and adds their tools to the available tool list.
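To make the fields concrete, here is what one stdio server entry might look like, shown as a Ruby hash purely for illustration (the web UI collects these values as form fields; Relay's internal representation may differ). The example command launches `@modelcontextprotocol/server-filesystem`, one of the reference MCP servers, against a hypothetical sandbox directory.

```ruby
# Illustrative only: the values a user would enter on Relay's MCP page.
# Field names follow the list above; the sandbox path is a placeholder.
server = {
  name: "filesystem",
  description: "Read and write files under a sandbox directory",
  argv: ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/tmp/sandbox"],
  cwd: nil,                   # optional working directory
  env: { "KEY" => "value" },  # optional KEY=value entries
  enabled: true               # active in chat sessions
}

# Joined together, argv is the command line Relay would launch:
puts server[:argv].join(" ")
```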
