[tutorials] Added flash attention credits in tutorial

Phil Tillet
2022-07-11 18:56:48 -07:00
parent d5eb9bc230
commit 971f5782b4

@@ -1,6 +1,7 @@
"""
Fused Attention
===============
This is a Triton implementation of the Flash Attention algorithm (Dao et al., https://arxiv.org/pdf/2205.14135v2.pdf)
"""
import pytest