Automated mirroring of GitHub releases and direct URLs.
Every asset is wrapped inside a `.zip` container (optional compression) and split into ≤ 99 MB volumes so they never exceed GitHub’s file size limit.
Already‑compressed archives (`.zip`, `.7z`) are stored as‑is – only split if they exceed the threshold.
All other files are compressed with Deflate (level 9) by default.
This tool is provided “as is”, without warranty of any kind.
I do not take any responsibility for the content that is downloaded, stored, or distributed by anyone using this script.
All files mirrored by this repository are the property of their respective owners. This project is intended for personal/archival use, software preservation, and fair‑use mirroring only.
If you are a copyright holder and believe your work is being mirrored without permission, please open an issue and the content will be removed immediately.
I do not endorse, verify, or guarantee the safety of any linked content.
Download and use mirrored files at your own risk.
- Reads `repo.txt` for a list of GitHub repositories (or direct URLs).
- Fetches the latest release (including pre‑releases if `[pre]` is set).
- Downloads only the assets that match your filters.
- `.zip`/`.7z` files are left untouched (store mode) – they are never re‑compressed.
- All other files are compressed into a `.zip` archive with Deflate level 9.
- If compression doesn’t reduce the size, the original file is kept as‑is.
- Any file larger than 99 MB is split into store‑mode `.zip` volumes.
- Writes per‑folder `README.md` files (shown inside `INDEX.md`).
- Pushes everything back to your repository.
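The store-vs-compress decision described above can be sketched as follows. This is a minimal illustration; `choose_treatment`, its constants, and its signature are hypothetical names, not the script’s actual internals:

```python
SPLIT_MB = 99  # default split threshold; overridable in config.toml
PRECOMPRESSED = {".zip", ".7z"}

def choose_treatment(name: str, size_mb: float) -> str:
    """Decide how an asset would be handled (hypothetical helper)."""
    ext = "." + name.rsplit(".", 1)[-1].lower() if "." in name else ""
    # Already-compressed archives: store as-is, split only when oversized.
    if ext in PRECOMPRESSED:
        return "split" if size_mb > SPLIT_MB else "store"
    # Everything else is Deflate-compressed (level 9) into a .zip container;
    # the script keeps the raw file if compression doesn't shrink it.
    return "compress"
```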
All settings can be adjusted in config.toml (compression level, split size, etc.).
- Copy the workflow file into `.github/workflows/downloader.yml`.
- Ensure the workflow has `contents: write` permission.
- The script is stored at `scripts/download_manager.py`.
- Create a `repo.txt` (see syntax below).
- (Optional) Create a `config.toml` next to the script for custom settings.
The workflow runs every 24 hours and on every push to main.
| Syntax | Explanation |
|---|---|
| `owner/repo` | Download all assets with default extensions (`.exe`, `.zip`, `.apk`) |
| `owner/repo [ext1, ext2]` | Only assets ending with those extensions |
| `owner/repo [file*name.exe]` | Globs – wildcards `*` and `?` allowed |
| `owner/repo [all]` | All assets (except checksum/signature files) |
| `owner/repo [nocompress]` | Keep files ≤ 99 MB raw, split larger ones without compression |
| `owner/repo [pre]` | Fetch the absolute latest release (including pre‑releases) |
| `owner/repo [lfs]` | Use Git LFS for the file – no compression, no splitting |
| `https://github.com/owner/repo/releases/latest [filter]` | Full release URL, automatically converted |
| `https://example.com/file.zip` | Direct download URL |
| `https://example.com/file.zip [nocompress]` | Direct download without compression |
Flags can be combined: `[pre, nocompress, all]`
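A `repo.txt` line like those above could be parsed roughly as follows. This is a sketch under stated assumptions: `parse_line` and its return shape are hypothetical, and the real script’s parsing may differ:

```python
import re

def parse_line(line: str):
    """Split a repo.txt entry into (target, flags, filters) – hypothetical helper."""
    line = line.split("#", 1)[0].strip()  # drop trailing comments
    if not line:
        return None  # blank or comment-only line
    m = re.match(r"^(\S+)(?:\s+\[(.*)\])?$", line)
    if not m:
        return None
    target, raw = m.group(1), m.group(2) or ""
    tokens = [t.strip() for t in raw.split(",") if t.strip()]
    # Known flags; anything else is treated as an extension/glob filter.
    flags = {t for t in tokens if t in {"all", "pre", "nocompress", "lfs"}}
    filters = [t for t in tokens if t not in flags]
    return target, flags, filters
```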
Push a commit containing one or more URLs – the workflow will download them immediately.
Add [nocompress] anywhere in the commit message to skip compression for all URLs in that commit.
`git commit -m "https://example.com/tool.zip [nocompress]"`
Download a specific byte range of a large file.
Commit message format: `URL [startMB-endMB]`

Example – download the first 200 MB: `https://example.com/big.iso [0-200]`
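The `[startMB-endMB]` spec maps naturally onto an HTTP `Range` request header. A minimal sketch, assuming MB here means 1024 × 1024 bytes (the actual unit the script uses is not stated):

```python
def range_header(spec: str) -> str:
    """Turn a '[startMB-endMB]' spec into an HTTP Range header value."""
    start_mb, end_mb = (int(p) for p in spec.strip("[]").split("-"))
    start = start_mb * 1024 * 1024
    end = end_mb * 1024 * 1024 - 1  # Range is inclusive on both ends
    return f"bytes={start}-{end}"
```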
Each download creates a timestamped folder:
- `downloads/` – direct URLs
- `repos/` – GitHub releases

Inside each folder:

- The mirrored file(s) – either raw or inside a `.zip` container.
- `README.md` – list of files with sizes, CRC32 hashes, and compression percentages.
- `metadata.json` – URL, method, checksums, and asset info.
All files are accessible via raw GitHub links.
Single `.zip` file: `unzip file.zip`

Split volumes (`.z01`, `.z02`, … `.zip`): place all parts in the same folder and run `zip -FF file.zip --out repaired.zip && unzip repaired.zip`, or use a tool like 7z: `7z x file.zip`

- GitHub releases – the script remembers the tag of the last mirrored release. If the tag hasn’t changed, the entire release is skipped completely – no files are downloaded, overwritten, or pushed.
- Direct downloads – once a direct URL has been downloaded, it is recorded in `state.json`. The same URL will never be downloaded again.
This keeps your repository small and the runs fast. No redundant commits are created, and the push step automatically avoids empty pushes.
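The two skip checks above amount to simple lookups against the saved state. A minimal sketch – the `state.json` key names (`downloaded_urls`, `release_tags`) are assumptions, not the script’s documented layout:

```python
def already_done(url: str, state: dict) -> bool:
    # Direct URLs are recorded once and never fetched again.
    return url in state.get("downloaded_urls", [])

def release_unchanged(repo: str, tag: str, state: dict) -> bool:
    # A release is skipped entirely when its tag matches the last mirrored one.
    return state.get("release_tags", {}).get(repo) == tag
```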
The following extensions are always ignored, even with `[all]`:
`.sha256`, `.sha256sum`, `.sha512`, `.sha512sum`, `.sha1`, `.sha1sum`, `.md5`, `.md5sum`, `.asc`, `.sig`, `.sign`, `.pgp`, `.blake2b`, `.blake2s`, `.sha3`, and various `.txt`/`.sums` variants.
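The skip check itself is just a suffix test against that list. A sketch using the extensions named above (`is_checksum_asset` is a hypothetical name):

```python
SKIP_EXTS = {
    ".sha256", ".sha256sum", ".sha512", ".sha512sum", ".sha1", ".sha1sum",
    ".md5", ".md5sum", ".asc", ".sig", ".sign", ".pgp",
    ".blake2b", ".blake2s", ".sha3",
}

def is_checksum_asset(name: str) -> bool:
    # Checksum/signature files are ignored even when [all] is given.
    lower = name.lower()
    return any(lower.endswith(ext) for ext in SKIP_EXTS)
```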
If a downloaded file has a missing or wrong extension (e.g. an executable named `zyrln-linux-amd64` or an image served without an extension), the script detects the real file type using `file --mime-type` (or the `magic` library if available) and automatically renames the file to the correct extension. This happens before compression, so the final `.zip` always contains a properly named file.
- `application/x-dosexec` → `*.exe`
- `image/png` → `*.png`
- `audio/flac` → `*.flac`
If the type cannot be determined, the file is kept as‑is.
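The rename step can be sketched like this. Assumptions: the mapping table only covers the three examples above (falling back to Python’s `mimetypes` guess), and `corrected_name` is a hypothetical helper – the actual detection shells out to `file --mime-type` or uses python-magic, as noted:

```python
import mimetypes

# Mapping shown in the README, plus a guess-based fallback.
MIME_TO_EXT = {
    "application/x-dosexec": ".exe",
    "image/png": ".png",
    "audio/flac": ".flac",
}

def corrected_name(name: str, mime: str) -> str:
    """Append the extension matching the detected MIME type;
    keep the name unchanged when the type is unknown."""
    ext = MIME_TO_EXT.get(mime) or mimetypes.guess_extension(mime)
    if not ext or name.lower().endswith(ext):
        return name
    return name + ext
```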
Inside each folder’s `README.md` (and therefore in `INDEX.md`), you’ll see:

- CRC32 checksum for every final file.
- Compression percentage (e.g. `-12.3%`) showing the space saved compared to the original release asset (only for files that were actually compressed).
- Already‑compressed archives (`.zip`/`.7z`) will not show a percentage because they are stored as‑is.
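Both values are cheap to compute. A sketch of how they might be derived (hypothetical helper names; the exact rounding and hex case in the real tables may differ):

```python
import zlib

def crc32_hex(data: bytes) -> str:
    # CRC32 as an 8-digit uppercase hex string.
    return f"{zlib.crc32(data) & 0xFFFFFFFF:08X}"

def compression_pct(original: int, compressed: int) -> str:
    # Negative values mean space was saved, e.g. "-12.3%".
    return f"{(compressed - original) / original * 100:+.1f}%"
```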
When new files are added, the workflow pushes them in commits of up to 500 MB each. If the total size exceeds that, the files are split across multiple cumulative commits (each building on the previous one) to ensure a smooth push without hitting Git size limits.
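The batching described above is a greedy bin-fill: accumulate files until the next one would push the commit past the cap, then start a new commit. A minimal sketch (hypothetical helper; an oversized single file simply gets a commit of its own):

```python
PUSH_BATCH_BYTES = 500_000_000  # default cap per commit

def batch_files(files: list[tuple[str, int]]) -> list[list[str]]:
    """Group (path, size_bytes) pairs into commits of at most PUSH_BATCH_BYTES."""
    batches, current, used = [], [], 0
    for path, size in files:
        # Flush the current batch when adding this file would exceed the cap.
        if current and used + size > PUSH_BATCH_BYTES:
            batches.append(current)
            current, used = [], 0
        current.append(path)
        used += size
    if current:
        batches.append(current)
    return batches
```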
Place a file named `config.toml` next to `download_manager.py` to override defaults:

```toml
split_mb = 99                   # file-size threshold for splitting (MiB)
push_batch_bytes = 500000000    # max bytes per git commit (500 MB)
max_parallel = 4                # simultaneous downloads
compression_level = 9           # 0 = store, 1 = fastest, 9 = best
compression_method = "Deflate"  # kept for compatibility, ignored
extract_archive_exts = [".zip", ".jar", ".war", ".ear"]
skip_asset_exts = [ … ]         # list of extensions to ignore
```

If the file is missing, defaults are used.
- Use glob filters to catch version‑independent installers (e.g. `app_*_setup.exe`).
- Use `[nocompress]` for rule sets / config files that are updated frequently.
- The workflow runs every 24 hours – no need to poll manually.
- If a push fails because of GitHub’s 100 MB limit, check the logs – the split guarantee should have kicked in; if not, temporarily lower `split_mb` to 95.
- `.zip` and `.7z` files are never re‑compressed; they are only split if they exceed the size limit.
```
# VPN apps
therealaleph/MasterHttpRelayVPN-RUST [all, pre]
ajavadinezhad/zyrln [all, pre]

# Tools
2dust/v2rayN [v2rayN-linux-64.zip, pre]
2dust/v2rayNG [v2rayNG_*_universal.apk, pre]

# Windows
imputnet/helium-windows [helium_*_x64-installer.exe, pre]

# Android
MetaCubeX/ClashMetaForAndroid [cmfa-*-meta-universal-release.apk, pre]

# Rules (raw, no compression)
Chocolate4U/Iran-v2ray-rules [all, nocompress, pre]

# Direct download
https://example.com/files/tool.bin