Mero-Lagani is an automated market tracker that keeps an eye on company listings and market activity so you don’t have to. It collects and updates financial data in the background, saving you from checking multiple sites manually.
The system uses browser automation to fetch fresh data, stores it in Redis for fast access, and refreshes everything daily. It’s built to stay fast, reliable, and easy to scale as it grows.
This project can be run locally without Docker. Follow the steps below to start both backend services.
Install dependencies and start the Django server:
```bash
uv sync
uv run python manage.py runserver
```
This project is an automated system designed to scrape IPO data from MeroShare, cache it for high-performance access, and notify users of new opportunities.
- Scraper (Selenium):
  - Located in `crawler/services/meroshare.py`.
  - Uses headless Chrome to log in to MeroShare and extract the "My ASBA" list.
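The real scraper drives headless Chrome via Selenium (not shown here, since it needs a browser and live credentials). The extraction idea itself can be sketched with the stdlib `html.parser` against hypothetical "My ASBA"-style markup; the class name, column layout, and sample rows below are illustrative, not the actual MeroShare page structure:

```python
from html.parser import HTMLParser

class AsbaTableParser(HTMLParser):
    """Collects <td> cell text from a simple table into rows.
    The markup is hypothetical; the real page is rendered by
    JavaScript, which is why the project uses Selenium."""

    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = []
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

# Hypothetical sample of what a scraped listing table might look like.
html = """
<table>
  <tr><td>Sample Hydro Ltd.</td><td>Ordinary Shares</td><td>Open</td></tr>
  <tr><td>Sample Bank Ltd.</td><td>Ordinary Shares</td><td>Closed</td></tr>
</table>
"""
parser = AsbaTableParser()
parser.feed(html)
ipos = [{"company": r[0], "share_type": r[1], "status": r[2]} for r in parser.rows]
```

In the real service, Selenium would return the rendered page source after login, and a parser like this (or Selenium's own element lookups) would turn it into dicts.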
- Scheduler:
  - Located in `crawler/services/scheduler.py`.
  - Runs a background thread that triggers the synchronization command every 60 minutes.
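A background-thread scheduler like the one described above can be sketched with the stdlib `threading` module. The function and variable names here are illustrative (the actual implementation lives in `crawler/services/scheduler.py` and uses a 60-minute interval; the demo shortens it so the loop is observable):

```python
import threading
import time

def start_scheduler(job, interval_seconds, stop_event):
    """Run `job` every `interval_seconds` on a daemon thread until
    `stop_event` is set. Illustrative sketch, not the project's code."""
    def loop():
        # Event.wait returns False on timeout, so the job fires once per
        # interval and the loop exits promptly when stop_event is set.
        while not stop_event.wait(interval_seconds):
            job()
    thread = threading.Thread(target=loop, daemon=True)
    thread.start()
    return thread

# Demo: a stand-in job on a very short interval.
runs = []
stop = threading.Event()
start_scheduler(lambda: runs.append(1), 0.01, stop)
time.sleep(0.1)
stop.set()
```

Using a daemon thread means the scheduler dies with the Django process instead of blocking shutdown; the `Event` gives a clean way to stop it early.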
- Data Storage & Caching:
  - Redis (Cache): Stores the full list of IPOs as a JSON string for the external Go API to consume.
  - Redis (Deduplication): Uses a Redis Set (`seen_ipos`) to track which IPOs have already been processed to prevent duplicate emails.
  - SQLite (Database): Acts as a persistent log for the admin panel.
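The deduplication step can be sketched with `SADD`-style semantics. A plain Python set stands in for the Redis `seen_ipos` Set here (a real client would call `redis_client.sadd("seen_ipos", ipo_id)` and treat a return value of 1 as "new"); the function name is illustrative:

```python
def is_new_ipo(seen_ipos, ipo_id):
    """Mimic `SADD seen_ipos <id>`: return True exactly once per id.
    `seen_ipos` is a plain set standing in for the Redis Set."""
    if ipo_id in seen_ipos:
        return False
    seen_ipos.add(ipo_id)
    return True

seen = set()
first = is_new_ipo(seen, "IPO-101")   # first sighting: new
second = is_new_ipo(seen, "IPO-101")  # repeat: already processed
```

Because Redis persists the set across runs, an IPO that triggered an email yesterday stays "seen" today even if the scrape returns it again.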
- Asynchronous Notifications (Celery):
  - Located in `crawler/tasks.py`.
  - When a new IPO is detected, the `sync_ipos` command triggers a Celery task.
  - The Celery worker processes this task in the background, sending emails via SMTP.
- Trigger: Scheduler (or Admin API) calls `sync_ipos`.
- Scrape: Selenium fetches current data.
- Process:
  - API Cache (`ipo_list`) is overwritten with fresh data.
  - Each IPO is checked against the Redis Set `seen_ipos`.
- Notify: If an IPO is new (not in the Set), a Celery task is queued.
- Deliver: Celery worker picks up the task and sends emails.
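The trigger-to-notify portion of this flow can be sketched in a few lines. Stand-ins are used throughout: a dict for the Redis string cache, a plain set for the `seen_ipos` Set, and a callable for Celery's `task.delay(...)`; the function signature is illustrative, not the project's actual management command:

```python
import json

def sync_ipos(scraped, cache, seen_ipos, queue_task):
    """Sketch of the sync flow: overwrite the `ipo_list` cache,
    then queue one notification per IPO not yet in `seen_ipos`."""
    cache["ipo_list"] = json.dumps(scraped)   # overwrite API cache
    for ipo in scraped:
        if ipo["id"] not in seen_ipos:        # dedup check against the Set
            seen_ipos.add(ipo["id"])
            queue_task(ipo)                   # would be send_email.delay(ipo)

cache, seen, queued = {}, set(), []
sync_ipos([{"id": "A"}, {"id": "B"}], cache, seen, queued.append)
sync_ipos([{"id": "A"}, {"id": "C"}], cache, seen, queued.append)
```

Note the asymmetry: the `ipo_list` cache is always overwritten so the Go API serves fresh data, while notifications fire only for ids that clear the dedup check.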
To create an admin user for the Django admin panel:

```bash
uv run python manage.py createsuperuser
```
To manually check for new IPOs and send notifications:
```bash
uv run python manage.py sync_ipos
```
To clear the local IPO database (reset "new" detection):
```bash
uv run python manage.py clear_ipos
```
Email notifications are handled asynchronously by Celery.
To run the worker locally (without Docker):
```bash
# Make sure you have a Redis instance running on port 6379
uv run celery -A config worker --loglevel=info
```

When using Docker, the Celery container starts automatically.
Navigate to the Go API directory and run the server:
```bash
cd go-api
go run main.go
```
Both services should now be running locally.
This project can be run locally using Docker. Follow the instructions below to manage backend services.
Build all Docker images before starting the containers.
```bash
docker-compose build
```

Start all services defined in docker-compose.yml.

```bash
docker-compose up
```

Removes all IPO records stored in Redis.

```bash
docker-compose exec web uv run python manage.py clear_ipos --force
```

Fetches the latest IPOs and updates the Redis cache.

```bash
docker-compose exec web uv run python manage.py sync_ipos
```