MAGI//ARCHIVE

// SURVEILLANCE FEED

Discovered repositories from the open source frontier

Dao-AILab/flash-attention

Fast, memory-efficient exact attention for Transformers; speeds up training and inference while cutting memory use (see the usage sketch below)

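A minimal usage sketch, assuming the flash-attn Python package is installed, a CUDA GPU with fp16/bf16 support is available, and flash_attn_func is the package's documented entry point. The kernel computes exact attention tile by tile in on-chip SRAM and never materializes the full seqlen × seqlen attention matrix in GPU memory, which is where the speed and memory savings come from.

```python
# Hedged sketch: assumes `pip install flash-attn` and a CUDA GPU with fp16/bf16
# support; flash_attn_func is the entry point described in the repo's README.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# FlashAttention expects (batch, seqlen, nheads, headdim) tensors in fp16 or bf16 on CUDA.
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
v = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)

# Exact (not approximate) causal attention; the full attention matrix is never
# written to HBM, so memory scales linearly rather than quadratically in seqlen.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```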

Related repos