[tutorials] Added flash attention credits in tutorial
@@ -1,6 +1,7 @@
 """
 Fused Attention
 ===============
+This is a Triton implementation of the Flash Attention algorithm (Dao et al., https://arxiv.org/pdf/2205.14135v2.pdf)
 """
 
 import pytest
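For context, the credited algorithm computes exactly the same output as standard scaled-dot-product attention; what Flash Attention changes is the memory traffic, processing the score matrix in tiles so it is never fully materialized in GPU DRAM. A minimal unfused baseline (not part of this commit, names are illustrative) that the fused Triton kernel should match numerically:

    import torch

    def reference_attention(q, k, v):
        # Unfused baseline: materializes the full (seq, seq) score matrix,
        # which Flash Attention avoids by computing softmax tile-by-tile
        # (see Dao et al., arXiv:2205.14135).
        d = q.shape[-1]
        scores = q @ k.transpose(-2, -1) / d**0.5
        probs = torch.softmax(scores, dim=-1)
        return probs @ v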