[GH-PAGES] Updated website

This commit is contained in:
Philippe Tillet
2022-04-08 00:44:05 +00:00
parent 80b92a0d2d
commit 0c570c178d
173 changed files with 401 additions and 386 deletions


@@ -323,23 +323,23 @@ for different problem sizes.</p>
size Triton Torch
0 4096.0 9.600000 9.600000
1 8192.0 19.200000 19.200000
-2 16384.0 31.999999 38.400001
+2 16384.0 38.400001 38.400001
3 32768.0 76.800002 76.800002
4 65536.0 127.999995 127.999995
5 131072.0 219.428568 219.428568
-6 262144.0 341.333321 341.333321
+6 262144.0 341.333321 384.000001
7 524288.0 472.615390 472.615390
8 1048576.0 614.400016 614.400016
-9 2097152.0 722.823517 722.823517
+9 2097152.0 722.823517 702.171410
10 4194304.0 780.190482 780.190482
11 8388608.0 812.429770 812.429770
12 16777216.0 833.084721 833.084721
-13 33554432.0 842.004273 843.811163
+13 33554432.0 842.004273 842.004273
14 67108864.0 847.448255 848.362445
15 134217728.0 849.737435 850.656574
</pre></div>
</div>
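The Triton/Torch columns in the table above are throughput figures in GB/s. As a point of reference, here is a sketch of how such a number can be derived from a measured kernel time; the function name is mine, and the `3 * n * element_size` byte count is an assumption (vector addition reads two operands and writes one result):

```python
def vector_add_gbps(n_elements: int, ms: float, bytes_per_element: int = 4) -> float:
    """Effective memory bandwidth in GB/s for z = x + y over n_elements values.

    Assumes 3 memory transactions per element: two reads and one write.
    """
    total_bytes = 3 * n_elements * bytes_per_element
    return total_bytes * 1e-9 / (ms * 1e-3)
```

Under that accounting, the 9.6 GB/s reported at size 4096 corresponds to roughly 5 µs per kernel, so the smallest rows largely measure launch overhead rather than sustained bandwidth.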
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes 51.595 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes 44.469 seconds)</p>
<div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-01-vector-add-py">
<div class="sphx-glr-download sphx-glr-download-python docutils container">
<p><a class="reference download internal" download="" href="../../_downloads/62d97d49a32414049819dd8bb8378080/01-vector-add.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">01-vector-add.py</span></code></a></p>


@@ -375,16 +375,16 @@ We will then compare its performance against (1) <code class="code docutils lite
<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>softmax-performance:
N Triton Torch (native) Torch (jit)
0 256.0 512.000001 546.133347 188.321838
-1 384.0 585.142862 585.142862 153.600004
-2 512.0 655.360017 606.814814 154.566038
+1 384.0 585.142862 585.142862 151.703707
+2 512.0 655.360017 585.142849 156.038096
3 640.0 682.666684 640.000002 160.000000
-4 768.0 722.823517 664.216187 162.754967
+4 768.0 722.823517 646.736871 163.839992
.. ... ... ... ...
-93 12160.0 814.058574 406.179533 198.936606
-94 12288.0 814.111783 415.661740 199.298541
-95 12416.0 812.498981 412.149375 198.854847
-96 12544.0 812.566838 412.971190 199.111113
-97 12672.0 812.633240 412.097543 199.167004
+93 12160.0 812.359066 405.755985 198.328233
+94 12288.0 814.111783 415.661740 198.694297
+95 12416.0 813.330613 412.149375 198.407990
+96 12544.0 812.566838 412.971190 198.569388
+97 12672.0 812.633240 412.097543 198.776477
[98 rows x 4 columns]
</pre></div>
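The benchmark above compares Triton's fused kernel against `torch.softmax` and a jitted naive version. For orientation, a plain-Python sketch of the numerically stable row-wise softmax all three compute (my own helper, not code from the tutorial):

```python
import math

def softmax_row(row):
    # Subtract the row maximum first so exp() cannot overflow.
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [e / total for e in exps]
```

Note that this naive formulation makes several passes over the row (max, exp and sum, divide); a fused kernel can read and write each row once, which is the source of the bandwidth gap visible in the table.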
@@ -397,7 +397,7 @@ We will then compare its performance against (1) <code class="code docutils lite
Note, however, that the PyTorch <cite>softmax</cite> operation is more general and will work on tensors of any shape.</p></li>
</ul>
</div></blockquote>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes 25.280 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes 23.413 seconds)</p>
<div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-02-fused-softmax-py">
<div class="sphx-glr-download sphx-glr-download-python docutils container">
<p><a class="reference download internal" download="" href="../../_downloads/d91442ac2982c4e0cc3ab0f43534afbc/02-fused-softmax.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">02-fused-softmax.py</span></code></a></p>


@@ -569,41 +569,41 @@ torch_output=tensor([[ 1.1045, -36.9688, 31.4688, ..., -11.3906, 24.4531, -3
<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>matmul-performance:
M cuBLAS ... Triton Triton (+ LeakyReLU)
0 256.0 2.730667 ... 3.276800 2.978909
-1 384.0 7.372800 ... 8.507077 7.899428
+1 384.0 7.372800 ... 8.507077 8.507077
2 512.0 14.563555 ... 16.384000 16.384000
3 640.0 22.260869 ... 24.380953 24.380953
4 768.0 32.768000 ... 34.028308 34.028308
-5 896.0 37.971025 ... 40.140799 39.025776
-6 1024.0 51.150050 ... 53.773130 52.428801
+5 896.0 37.971025 ... 39.025776 39.025776
+6 1024.0 49.932191 ... 52.428801 52.428801
7 1152.0 45.242181 ... 46.656000 46.656000
8 1280.0 51.200001 ... 56.888887 56.888887
9 1408.0 64.138541 ... 67.305878 66.485074
10 1536.0 80.430545 ... 79.526831 79.526831
-11 1664.0 62.929456 ... 62.492442 62.061463
-12 1792.0 72.983276 ... 71.588687 71.588687
-13 1920.0 69.120002 ... 70.172588 70.172588
-14 2048.0 73.584279 ... 76.608294 76.260072
+11 1664.0 63.372618 ... 62.492442 62.061463
+12 1792.0 72.983276 ... 72.047592 71.588687
+13 1920.0 68.776119 ... 70.172588 70.172588
+14 2048.0 73.908442 ... 76.959706 76.608294
15 2176.0 83.155572 ... 85.998493 85.269692
-16 2304.0 68.446623 ... 76.809875 76.563695
-17 2432.0 71.305746 ... 82.874527 84.877538
-18 2560.0 78.019048 ... 80.709358 80.511054
-19 2688.0 83.369354 ... 89.044730 89.676257
-20 2816.0 83.233226 ... 82.916747 82.602666
-21 2944.0 81.832567 ... 80.640830 82.169877
-22 3072.0 82.181572 ... 88.473602 88.473602
-23 3200.0 84.768213 ... 95.096582 94.395283
-24 3328.0 82.939284 ... 81.346098 84.695641
-25 3456.0 81.932484 ... 90.180725 91.097818
-26 3584.0 87.085130 ... 98.268190 93.564405
-27 3712.0 83.317214 ... 88.443865 83.947349
-28 3840.0 84.874902 ... 91.322872 84.421376
-29 3968.0 90.388098 ... 84.210698 88.615785
-30 4096.0 86.480498 ... 92.948562 88.243079
+16 2304.0 68.446623 ... 76.563695 76.319081
+17 2432.0 71.125224 ... 85.134737 84.621881
+18 2560.0 77.833728 ... 81.108913 80.908642
+19 2688.0 83.737433 ... 89.888756 89.464755
+20 2816.0 83.074685 ... 82.759409 81.218262
+21 2944.0 81.698415 ... 80.510553 79.737653
+22 3072.0 82.181572 ... 88.612060 88.335577
+23 3200.0 82.156612 ... 93.841640 93.979441
+24 3328.0 81.530349 ... 84.795401 84.596116
+25 3456.0 82.141178 ... 88.595129 89.281913
+26 3584.0 87.296493 ... 91.563533 96.683219
+27 3712.0 85.675250 ... 86.191546 88.561477
+28 3840.0 79.448276 ... 84.548438 91.322872
+29 3968.0 86.973584 ... 90.791620 85.871877
+30 4096.0 93.142072 ... 82.040176 89.062862
[31 rows x 5 columns]
</pre></div>
</div>
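The matmul columns above are throughput in TFLOPS. A sketch of the conversion from a measured kernel time, assuming the usual convention of counting a multiply and an add as two floating-point operations (the helper name is mine):

```python
def matmul_tflops(m: int, n: int, k: int, ms: float) -> float:
    """TFLOPS for an M x K @ K x N matmul that took `ms` milliseconds."""
    # A dense matmul performs M*N*K multiply-add pairs = 2*M*N*K flops.
    flops = 2 * m * n * k
    return flops * 1e-12 / (ms * 1e-3)
```

Because the flop count grows as the cube of the dimension while memory traffic grows quadratically, the larger rows of the table are compute-bound and sit closer to the hardware's peak.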
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 5 minutes 51.135 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 5 minutes 44.408 seconds)</p>
<div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-03-matrix-multiplication-py">
<div class="sphx-glr-download sphx-glr-download-python docutils container">
<p><a class="reference download internal" download="" href="../../_downloads/d5fee5b55a64e47f1b5724ec39adf171/03-matrix-multiplication.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">03-matrix-multiplication.py</span></code></a></p>


@@ -194,36 +194,36 @@ to download the full example code</p>
<p class="sphx-glr-script-out">Out:</p>
<div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>layer-norm-backward:
N Triton Torch Apex
-0 1024.0 307.200008 98.303995 303.407414
-1 1536.0 351.085717 134.050910 341.333333
-2 2048.0 423.724127 161.684218 334.367350
-3 2560.0 465.454542 181.775141 328.556154
-4 3072.0 511.999982 192.501302 320.556515
-5 3584.0 554.941930 208.271186 310.527060
-6 4096.0 568.231237 220.412561 297.890900
-7 4608.0 498.162157 232.825259 287.251954
-8 5120.0 527.381977 242.845844 285.104413
-9 5632.0 540.671974 243.545956 289.438969
-10 6144.0 544.118087 248.661056 286.322318
-11 6656.0 532.479975 256.000009 285.767438
-12 7168.0 507.469040 260.260201 286.242939
-13 7680.0 479.999983 262.564106 279.272719
-14 8192.0 463.698115 267.130429 284.526763
-15 8704.0 416.958106 267.815384 284.987724
-16 9216.0 429.483477 272.394084 288.751954
-17 9728.0 437.213490 280.278512 290.027323
-18 10240.0 449.287041 286.767793 290.840246
-19 10752.0 428.651173 247.172406 290.594591
-20 11264.0 429.104745 245.536784 286.980888
-21 11776.0 423.724129 249.888595 288.981596
-22 12288.0 420.102570 254.673582 294.323369
-23 12800.0 414.574901 253.465340 288.450715
-24 13312.0 412.242569 252.759501 289.916513
-25 13824.0 405.098897 257.190689 292.056329
-26 14336.0 395.021816 254.297107 286.719986
-27 14848.0 385.662341 257.479779 289.481735
-28 15360.0 373.874218 257.970599 287.550706
-29 15872.0 369.474279 261.806182 289.899545
+0 1024.0 311.088617 99.902435 315.076934
+1 1536.0 351.085717 134.050910 344.523365
+2 2048.0 423.724127 159.067963 334.367350
+3 2560.0 461.954908 182.857144 330.322572
+4 3072.0 519.211251 191.501303 321.956335
+5 3584.0 547.872604 207.768111 309.410081
+6 4096.0 568.231237 221.405403 301.546004
+7 4608.0 500.416301 232.336141 287.251954
+8 5120.0 529.655159 243.809526 287.102804
+9 5632.0 540.671974 244.869560 291.939522
+10 6144.0 548.163546 251.202731 286.879370
+11 6656.0 536.053693 256.000009 286.793541
+12 7168.0 518.168681 253.734520 277.919225
+13 7680.0 488.912481 266.358392 280.547947
+14 8192.0 464.794337 258.354805 278.481578
+15 8704.0 416.958106 267.472468 284.987724
+16 9216.0 432.000001 272.394084 289.887291
+17 9728.0 439.683593 280.278512 288.950501
+18 10240.0 446.836366 287.102804 287.775181
+19 10752.0 430.079980 246.699797 289.941565
+20 11264.0 430.471331 245.313973 286.069848
+21 11776.0 420.571432 249.447482 288.981596
+22 12288.0 418.909088 254.673582 294.323369
+23 12800.0 414.016170 254.094291 289.538159
+24 13312.0 411.711355 252.360194 289.391298
+25 13824.0 404.604870 257.190689 291.799461
+26 14336.0 395.021816 256.000002 289.129416
+27 14848.0 385.245405 257.479779 289.012175
+28 15360.0 376.932517 258.332158 287.550706
+29 15872.0 369.832994 261.626369 290.784741
</pre></div>
</div>
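For reference alongside the layer-norm-backward numbers above, a plain-Python sketch of what per-row layer normalization computes in the forward pass (my own helper; the `eps` value is an assumption, chosen to match the common default):

```python
import math

def layer_norm_row(row, eps=1e-5):
    """Normalize one row to zero mean and (approximately) unit variance."""
    n = len(row)
    mean = sum(row) / n
    var = sum((v - mean) ** 2 for v in row) / n
    inv_std = 1.0 / math.sqrt(var + eps)  # eps guards against division by zero
    return [(v - mean) * inv_std for v in row]
```

The backward pass benchmarked here additionally reduces gradients over both the row and the learned scale/shift parameters, which is why the Torch column falls so far below the memory-bound ceiling the Triton column approaches.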
<div class="line-block">
@@ -477,7 +477,7 @@ to download the full example code</p>
<span class="n">bench_layer_norm</span><span class="o">.</span><span class="n">run</span><span class="p">(</span><span class="n">save_path</span><span class="o">=</span><span class="s1">&#39;.&#39;</span><span class="p">,</span> <span class="n">print_data</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
</pre></div>
</div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 2 minutes 13.604 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 2 minutes 13.257 seconds)</p>
<div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-05-layer-norm-py">
<div class="sphx-glr-download sphx-glr-download-python docutils container">
<p><a class="reference download internal" download="" href="../../_downloads/935c0dd0fbeb4b2e69588471cbb2d4b2/05-layer-norm.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">05-layer-norm.py</span></code></a></p>


@@ -174,7 +174,7 @@
<div class="section" id="computation-times">
<span id="sphx-glr-getting-started-tutorials-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline"></a></h1>
-<p><strong>13:21.722</strong> total execution time for <strong>getting-started_tutorials</strong> files:</p>
+<p><strong>13:05.655</strong> total execution time for <strong>getting-started_tutorials</strong> files:</p>
<table class="docutils align-default">
<colgroup>
<col style="width: 85%" />
@@ -183,19 +183,19 @@
</colgroup>
<tbody>
<tr class="row-odd"><td><p><a class="reference internal" href="03-matrix-multiplication.html#sphx-glr-getting-started-tutorials-03-matrix-multiplication-py"><span class="std std-ref">Matrix Multiplication</span></a> (<code class="docutils literal notranslate"><span class="pre">03-matrix-multiplication.py</span></code>)</p></td>
-<td><p>05:51.135</p></td>
+<td><p>05:44.408</p></td>
<td><p>0.0 MB</p></td>
</tr>
<tr class="row-even"><td><p><a class="reference internal" href="02-fused-softmax.html#sphx-glr-getting-started-tutorials-02-fused-softmax-py"><span class="std std-ref">Fused Softmax</span></a> (<code class="docutils literal notranslate"><span class="pre">02-fused-softmax.py</span></code>)</p></td>
-<td><p>03:25.280</p></td>
+<td><p>03:23.413</p></td>
<td><p>0.0 MB</p></td>
</tr>
<tr class="row-odd"><td><p><a class="reference internal" href="05-layer-norm.html#sphx-glr-getting-started-tutorials-05-layer-norm-py"><span class="std std-ref">Layer Normalization</span></a> (<code class="docutils literal notranslate"><span class="pre">05-layer-norm.py</span></code>)</p></td>
-<td><p>02:13.604</p></td>
+<td><p>02:13.257</p></td>
<td><p>0.0 MB</p></td>
</tr>
<tr class="row-even"><td><p><a class="reference internal" href="01-vector-add.html#sphx-glr-getting-started-tutorials-01-vector-add-py"><span class="std std-ref">Vector Addition</span></a> (<code class="docutils literal notranslate"><span class="pre">01-vector-add.py</span></code>)</p></td>
-<td><p>01:51.595</p></td>
+<td><p>01:44.469</p></td>
<td><p>0.0 MB</p></td>
</tr>
<tr class="row-odd"><td><p><a class="reference internal" href="04-low-memory-dropout.html#sphx-glr-getting-started-tutorials-04-low-memory-dropout-py"><span class="std std-ref">Low-Memory Dropout</span></a> (<code class="docutils literal notranslate"><span class="pre">04-low-memory-dropout.py</span></code>)</p></td>