MAGI//ARCHIVE

// SURVEILLANCE FEED

Discovered repositories from the open source frontier

fla-org/flash-linear-attention

Efficient Triton-based linear attention kernels for PyTorch, supporting multiple hardware platforms
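For orientation, linear attention replaces the softmax attention matrix with a kernel feature map, so the causal output can be computed as a running recurrence in O(N) time instead of O(N^2). The sketch below is a minimal plain-PyTorch reference of that recurrence, not the library's API: the function name, the elu+1 feature map, and the per-step loop are illustrative assumptions; the repository itself ships fused Triton kernels and many gated/chunked variants.

```python
# Minimal sketch of causal linear attention in plain PyTorch (assumed reference
# implementation, not fla-org/flash-linear-attention's API or kernel code).
import torch
import torch.nn.functional as F

def causal_linear_attention(q, k, v, eps=1e-6):
    """q, k, v: (batch, heads, seq_len, dim). Returns (batch, heads, seq_len, dim_v)."""
    # Feature map phi(x) = elu(x) + 1 keeps values positive (an illustrative choice).
    phi_q = F.elu(q) + 1.0
    phi_k = F.elu(k) + 1.0

    b, h, n, d = q.shape
    dv = v.shape[-1]
    state = q.new_zeros(b, h, d, dv)   # running sum of phi(k_t) v_t^T
    norm = q.new_zeros(b, h, d)        # running sum of phi(k_t), for normalization
    outputs = []
    for t in range(n):
        # Update the recurrent state with the current key/value pair.
        state = state + phi_k[:, :, t, :, None] * v[:, :, t, None, :]
        norm = norm + phi_k[:, :, t]
        # Query the accumulated state: o_t = phi(q_t)^T S_t / (phi(q_t)^T z_t).
        num = torch.einsum('bhd,bhdv->bhv', phi_q[:, :, t], state)
        den = torch.einsum('bhd,bhd->bh', phi_q[:, :, t], norm).unsqueeze(-1) + eps
        outputs.append(num / den)
    return torch.stack(outputs, dim=2)
```

The sequential loop above is exactly the part that fused kernels avoid: chunked/parallel formulations process blocks of the sequence at once on the GPU, which is what makes a Triton implementation worthwhile.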

Related repos