# Vlfs [![Download vlfs.py](https://img.shields.io/badge/Download-vlfs.py-blue?logo=python)](https://raw.githubusercontent.com/hypernewbie/vlfs/main/vlfs.py) [![CI](https://github.com/hypernewbie/vlfs/actions/workflows/ci.yml/badge.svg)](https://github.com/hypernewbie/vlfs/actions/workflows/ci.yml)

AI-slop single-script Git LFS alternative written in Python. Public files are served via HTTP, private files via Google Drive.

> Vlfs is vibe coded by AI with only minor review. It's intended to be a throwaway script. Do not use.

## Quick Start

```bash
python vlfs.py pull
```

## Architecture

Content-addressable storage: SHA-256 hashing, 2-level directory sharding (`ab/cd/hash`), zstd compression.

```
.vlfs/
  config.toml         # Repo config (public_base_url, compression)
  index.json          # File manifest (committed)
~/.config/vlfs/
  config.toml         # User secrets (Drive OAuth)
  rclone.conf         # Generated rclone config
  gdrive-token.json   # OAuth token
.vlfs-cache/
  objects/            # Local cache
```

## Git Integration

Commit `.vlfs/` to track your large files. Add `.vlfs-cache/` to your `.gitignore` to keep the data blobs out of Git.

## Usage

```bash
# Pull (no auth needed for public files)
python vlfs.py pull

# Push to R2 (requires credentials)
python vlfs.py push tools/clang.exe
python vlfs.py push tools/
python vlfs.py push --glob "**/*.dll"
python vlfs.py push --all

# Push to Drive (private)
python vlfs.py push --private assets/art.psd

# Status
python vlfs.py status
python vlfs.py verify
python vlfs.py clean
```

By default, most commands keep output intentionally terse: bracketed progress plus a final summary line. Use `-v` when you want the older step-by-step detail and transfer chatter.
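The content-addressable layout described above can be sketched in a few lines. This is a hypothetical helper for illustration, not vlfs's actual code:

```python
import hashlib

def object_path(data: bytes) -> str:
    """Compute the sharded storage path for a blob.

    vlfs names objects by their SHA-256 digest and shards them two
    directory levels deep (ab/cd/hash) so no single folder grows
    unbounded. One local file maps to exactly one object path.
    """
    digest = hashlib.sha256(data).hexdigest()
    return f"{digest[:2]}/{digest[2:4]}/{digest}"

print(object_path(b"hello"))  # -> 2c/f2/2cf24dba...
```

Because the path is derived purely from content, identical files dedupe for free, and a pull can verify integrity by re-hashing what it downloaded.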
## Configuration

`.vlfs/config.toml` (committed):

```toml
[remotes.r2]
public_base_url = "https://pub-abc123.r2.dev/vlfs"
bucket = "my-project-assets"        # Optional (default: "vlfs")

[remotes.gdrive]
bucket = "my-project-drive-root"    # Optional (default: "vlfs")

[defaults]
compression_level = 3
```

## Google Drive Setup

```bash
python vlfs.py auth gdrive
```

This opens a browser for you to authorise access. Done.

## Cloudflare R2 Setup

The CLI uses environment variables for authentication. Set these in your shell (or `.env` file):

```bash
export RCLONE_CONFIG_R2_ACCESS_KEY_ID="your-access-key"
export RCLONE_CONFIG_R2_SECRET_ACCESS_KEY="your-secret-key"
export RCLONE_CONFIG_R2_ENDPOINT="https://<account-id>.r2.cloudflarestorage.com"
```

### Persistent Config (Optional)

If you prefer not to use environment variables, you can create a persistent config file at `~/.config/vlfs/rclone.conf`:

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = your-access-key
secret_access_key = your-secret-key
endpoint = https://<account-id>.r2.cloudflarestorage.com
```

## CMake Integration

```cmake
include(VLFSSync.cmake)
set(VLFSSYNC_AUTO ON) # Auto-pull on configure
```

## How It Works

| Operation | Auth | Method |
|-----------|------|--------|
| `pull` (R2) | None | HTTP GET |
| `pull` (Drive) | Token | rclone |
| `push` (R2) | Env vars | rclone |
| `push --private` | Token | rclone |

## FAQ

**How does sharding work?** It uses **directory sharding** to keep folders clean (`vlfs/ab/cd/hash`), not file chunking. One local file equals one cloud object.

**What compression is used?** Zstandard (zstd) level 3. It's extremely fast for real-time compression and makes decompression feel instant.

**Is it multithreaded?** Mostly. Hashing and downloading (HTTP/rclone) are parallel. Uploading is currently sequential, and Google Drive transfers are single-threaded to respect API rate limits.
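The no-auth pull path in the table above is just an HTTP GET against the sharded object URL. A minimal sketch, assuming the `pub-abc123` base URL from the example config (hypothetical helpers, not vlfs's actual code):

```python
import urllib.request

# Assumed value: matches public_base_url in the example .vlfs/config.toml.
PUBLIC_BASE_URL = "https://pub-abc123.r2.dev/vlfs"

def object_url(sha256_hex: str) -> str:
    # Same ab/cd/hash sharding as the local object store.
    return f"{PUBLIC_BASE_URL}/{sha256_hex[:2]}/{sha256_hex[2:4]}/{sha256_hex}"

def pull_object(sha256_hex: str) -> bytes:
    # Public pulls need no credentials: a plain HTTP GET suffices.
    with urllib.request.urlopen(object_url(sha256_hex)) as resp:
        return resp.read()

print(object_url("ab" * 32))
```

Pushes, by contrast, go through rclone and therefore need either the environment variables or the persistent config shown above.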
## Dependencies

- Python 3.10+
- `zstandard`
- `rclone` (push only)

### Installing rclone

> [!IMPORTANT]
> **Windows Warning:** `winget` often fails to create the required symlinks when run from PowerShell. To ensure `rclone` is correctly added to your PATH, run the installation from an **Administrator Command Prompt (cmd.exe)**.

```bash
# Windows (Run as Administrator in cmd.exe)
winget install Rclone.Rclone

# macOS
brew install rclone

# Linux
curl https://rclone.org/install.sh | sudo bash
```

## Testing

```bash
pip install -e ".[dev]"
pytest
```
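To confirm the dependencies above are in place, a quick pre-flight check can be run in the REPL. This is a hypothetical sketch, not something vlfs ships:

```python
import importlib.util
import shutil

def missing_deps(for_push: bool = False) -> list[str]:
    """Report missing requirements: pulls need only Python 3.10+ and
    the zstandard package; pushes additionally need rclone on PATH."""
    missing = []
    if importlib.util.find_spec("zstandard") is None:
        missing.append("zstandard (pip install zstandard)")
    if for_push and shutil.which("rclone") is None:
        missing.append("rclone (see install notes above)")
    return missing

print(missing_deps(for_push=True))  # [] when everything is installed
```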