
MAGI//ARCHIVE

// SURVEILLANCE FEED

Discovered repositories from the open source frontier


lyogavin/airllm

Run 70B LLMs on a single 4GB GPU using layer-wise inference and memory optimization; quantization is optional.

[screenshot: lyogavin/airllm]

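The core idea behind layer-wise inference is that only one transformer layer's weights need to sit in GPU memory at a time: each layer is loaded from disk, run on the current hidden states, and then released before the next layer is loaded. Below is a minimal PyTorch sketch of that pattern; the helper names (layer_files, build_layer) are illustrative assumptions for this sketch, not AirLLM's actual API.

```python
import torch

@torch.no_grad()
def layerwise_forward(hidden, layer_files, build_layer, device="cuda"):
    """Push hidden states through transformer layers stored as separate
    weight files, keeping only one layer's weights on the GPU at a time.

    hidden       -- input hidden states, a tensor already on `device`
    layer_files  -- list of paths, one checkpoint file per layer (assumed layout)
    build_layer  -- callable returning an uninitialized layer module (hypothetical)
    """
    for path in layer_files:
        layer = build_layer()                                     # empty layer skeleton on CPU
        layer.load_state_dict(torch.load(path, map_location="cpu"))
        layer.to(device)                                          # move just this layer to VRAM
        hidden = layer(hidden)                                    # run one layer
        del layer                                                 # drop its weights
        torch.cuda.empty_cache()                                  # return freed VRAM to the pool
    return hidden
```

The trade-off is straightforward: peak GPU memory drops to roughly one layer's weights plus activations, while latency rises because every layer's weights are re-read from disk or host memory on each forward pass.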

Related repos