Mega4alik/ollm
Run huge LLMs with a 100k-token context on an 8 GB GPU using SSD offload, without quantization
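The core idea behind this kind of setup is disk offload: full-precision weights (and, for very long contexts, presumably the KV cache) live on fast NVMe storage and are streamed into GPU memory only while the layer that needs them is running. The sketch below is not ollm's own API; it is a minimal illustration of the same weight-offload technique using Hugging Face transformers + accelerate, assuming a local NVMe folder `./offload` and an example model id.

```python
# Minimal sketch of SSD/disk offload with transformers + accelerate.
# NOT the ollm API -- only an illustration of the general technique:
# fp16 weights that do not fit in an 8 GB GPU are kept on NVMe and
# paged into VRAM on demand, with no quantization involved.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # example model id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # full fp16 weights, no quantization
    device_map="auto",           # accelerate splits layers across GPU / CPU / disk
    offload_folder="./offload",  # layers that don't fit are stored on the SSD
    offload_state_dict=True,     # stream the state dict through disk while loading
)

prompt = "Summarize the following document:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Note that this generic sketch only offloads weights; sustaining a 100k-token context additionally requires managing the KV cache, which grows far beyond 8 GB at that length and is the harder part of the problem the repo's description targets.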
