A lightweight, self-hosted EPUB reader with integrated AI analysis capabilities.
- Clean Layout - Three-column design (TOC, Content, AI Panel)
- Sticky Navigation - Top navigation bar stays visible while scrolling
- Keyboard Shortcuts - Arrow keys for previous/next chapter, ESC to close panels
- Internal Links - Footnotes and author comments open in modal popups
- Clickable Covers - Click a book cover to start reading instantly
- AI Analysis - Right-click on text for fact-checking or discussion (Ollama local or Cloud)
- Personal Comments - Add your own notes without AI (no API cost)
- Manual Save - Choose what to save to avoid clutter
- Color-Coded Highlights - Yellow (fact check), Blue (discussion), Green (comments)
- Smart Tooltips - Hover over highlights to see their type
- Edit & Delete - Manage all your highlights and comments
- Markdown Support - AI responses render with proper formatting
- Highlights View - See all your notes and analyses for each book
- Export to Markdown - Export highlights with AI context warnings
- Web Upload - Upload EPUB files via click or drag & drop
- Cover Images - Automatic cover extraction and display
- Search - Find books by title or author
- Organized Storage - All books in the `books/` directory, data in SQLite
Edit the `.env` file:

```
# Ollama
OLLAMA_BASE_URL=http://localhost:11434/v1
OLLAMA_API_KEY=ollama
OLLAMA_MODEL=llama3
OLLAMA_CLOUD_MODEL=gpt-oss:120b-cloud
```

Then sign your Ollama daemon into Ollama Cloud once:
```
ollama signin
```

Option A: Upload via Web Interface (Recommended)

- Start the server: `uv run server.py`
- Open http://127.0.0.1:8123
- Click the "+" card OR drag & drop EPUB file
- Wait for automatic processing
Option B: Command Line
```
uv run reader3.py your_book.epub
uv run server.py
```

The server listens on 0.0.0.0:8123 by default so other devices on your LAN can reach it.
You can override that with:

```
READER_HOST=0.0.0.0 READER_PORT=8123 uv run server.py
```

- Open http://127.0.0.1:8123
- Select a book
- Right-click on text → Choose analysis type
- Review AI response in side panel
- Save if important
- Highlights appear on next visit!
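The `READER_HOST`/`READER_PORT` override mentioned above presumably boils down to reading two environment variables at startup; a minimal sketch (the actual `server.py` may differ):

```python
import os

# Defaults match the documented behavior: listen on all interfaces, port 8123.
# READER_HOST / READER_PORT are the env vars named above; the function name
# and everything else here is illustrative, not the project's actual code.
def resolve_bind_address() -> tuple[str, int]:
    host = os.environ.get("READER_HOST", "0.0.0.0")
    port = int(os.environ.get("READER_PORT", "8123"))
    return host, port

host, port = resolve_bind_address()
print(f"Serving on http://{host}:{port}")
```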
- Select text → Right-click → Choose:
  - Fact Check - Verify facts and get context
  - Discussion - Deep analysis and insights
  - Add Comment - Your personal notes (no AI)
- View response in right panel
- Click "Save" for important insights
- Yellow - Fact checks
- Blue - Discussions
- Green - Your comments
- Hover to see type, click to view/edit
- All highlights are editable and deletable
- Click the ⋮ menu on any book → "View Highlights"
- See all your notes and analyses in one page
- Filter by type (Fact Check, Discussion, Comment)
- Export to markdown for AI processing
- Context length warnings for large exports
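The export step above can be sketched as a small function that renders highlights to Markdown and prepends a warning when the result looks too large for a model's context window. Everything here (field names, the 4-chars-per-token heuristic, the 8000-token budget) is an illustrative assumption, not the project's actual exporter:

```python
CHARS_PER_TOKEN = 4          # rough heuristic for estimating token count
CONTEXT_LIMIT_TOKENS = 8000  # assumed model context budget

def export_highlights(title: str, highlights: list[dict]) -> str:
    """Render highlights as Markdown, warning if the export may be too large."""
    lines = [f"# Highlights: {title}", ""]
    for h in highlights:
        lines.append(f"## {h['kind']}")       # e.g. Fact Check / Discussion / Comment
        lines.append(f"> {h['text']}")        # the selected passage
        if h.get("note"):
            lines.append("")
            lines.append(h["note"])           # AI response or personal note
        lines.append("")
    md = "\n".join(lines)
    est_tokens = len(md) // CHARS_PER_TOKEN
    if est_tokens > CONTEXT_LIMIT_TOKENS:
        md = f"<!-- Warning: ~{est_tokens} tokens, may exceed model context -->\n" + md
    return md
```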
- Jump directly to any chapter
- ← / → - Navigate between chapters
- ESC - Close panels and modals
- Works anywhere except when typing in text fields
```
reader3/
├── reader3.py       # EPUB processor
├── server.py        # Web server
├── database.py      # SQLite operations
├── ai_service.py    # AI integration
├── books/           # All book data here
│   └── book_name_data/
│       ├── book.pkl
│       └── images/
├── templates/       # HTML templates
├── reader_data.db   # SQLite database
└── .env             # API configuration
```
- Click the ⋮ menu on any book → "View Highlights"
- See all notes, comments, and analyses in one page
- Filter by type and jump to chapters
```
uv run check_database.py
```

```
# Double-click: backup.bat
# Or manually:
copy reader_data.db backups\reader_data_backup.db
```

- `check_database.py` - View raw database contents (advanced)
- `backup.bat` - Quick database backup
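On non-Windows systems, or while the server is running, SQLite's online backup API is safer than copying the file directly; a sketch (the function name is hypothetical, not a script shipped with the project):

```python
import sqlite3

def backup_db(src_path: str, dest_path: str) -> None:
    """Copy a SQLite database safely, even if another process is writing to it."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)  # SQLite online backup API
    dest.close()
    src.close()
```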
- ✅ Uses the same Ollama workflow as local models
- ✅ Lets you use larger hosted models without a local GPU
- ✅ Keeps one provider for both local and cloud modes
- ✅ Works through Ollama's OpenAI-compatible endpoint
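Because the endpoint is OpenAI-compatible, a request is just a standard chat-completions POST. A sketch of building one from the `.env` values shown earlier (the prompt wording and function name are illustrative, not the app's actual code):

```python
import json

# Builds the URL and JSON body for Ollama's OpenAI-compatible
# /chat/completions endpoint. base_url and model come from .env
# (OLLAMA_BASE_URL, OLLAMA_MODEL); the prompt text is illustrative.
def build_chat_request(base_url: str, model: str, selected_text: str) -> tuple[str, dict]:
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a fact-checking assistant."},
            {"role": "user", "content": f"Fact-check this passage:\n\n{selected_text}"},
        ],
    }
    return url, body

url, body = build_chat_request("http://localhost:11434/v1", "llama3", "The moon is made of cheese.")
print(url)  # http://localhost:11434/v1/chat/completions
print(json.dumps(body)[:60])
```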
- Check the `.env` file exists and has the correct key
- Restart the server
- Check the browser console (F12) for errors
- Verify data exists: `uv run check_database.py`
- Hard refresh (Ctrl+Shift+R)
- Check if port 8123 is available
- Verify the `.env` configuration
This repo includes a systemd unit template and installer so the app can start on boot.
The installed service runs the app with `uv run server.py`, matching the normal development command.
```
uv sync
sudo ./scripts/install-systemd-service.sh
```

This installs `deploy/reader3.service`, enables it, and starts it immediately.

```
systemctl status reader3.service
```

If you use UFW:

```
sudo ufw allow 8123/tcp
```

Then browse to http://<your-linux-machine-ip>:8123 from another device on your home network. You can find your machine's IP with:

```
hostname -I
```

MIT
Note: This project is designed to be simple and hackable. Ask your LLM to modify it however you like!