MAGI//ARCHIVE

// SURVEILLANCE FEED

Discovered repositories from the open source frontier

GeeeekExplorer/nano-vllm

Minimal vLLM engine for fast offline LLM inference with a clean Python codebase
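
The listing describes nano-vllm as an offline inference engine, meaning the model is loaded and queried directly from Python with no serving layer in between. The sketch below shows how such an engine is typically driven; it assumes nano-vllm mirrors vLLM's offline API (an LLM object plus SamplingParams), and the package name `nanovllm`, the model path, and the output structure are assumptions, not verified against the repository.

```python
# Minimal sketch of offline LLM inference, assuming a vLLM-style API.
from nanovllm import LLM, SamplingParams  # package name assumed

# Load the model once; everything runs in-process, no server required.
llm = LLM("/path/to/your/model")  # placeholder path

# Sampling settings applied to every prompt in the batch.
params = SamplingParams(temperature=0.6, max_tokens=256)

prompts = ["Explain offline LLM inference in one sentence."]
outputs = llm.generate(prompts, params)  # batched local generation

for out in outputs:
    # Output structure assumed: one result per prompt with a "text" field.
    print(out["text"])
```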

Related repos