[GH-PAGES] Updated website

commit 21613349ac (parent 1581cf9d79)
Philippe Tillet, 2022-04-25 00:41:43 +00:00

158 changed files with 316 additions and 316 deletions

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: dcb51d10379790b23b9f305eb4789868
+config: 2669b1fb9e287aa7444c99f28247942d
 tags: 645f666f9bcd5a90fca523b33c5a78b7

@@ -238,14 +238,14 @@ We can now run the decorated function above. Pass `print_data=True` to see the p
 3 32768.0 63.999998 63.999998
 4 65536.0 127.999995 127.999995
 5 131072.0 219.428568 219.428568
-6 262144.0 341.333321 341.333321
+6 262144.0 341.333321 384.000001
 7 524288.0 472.615390 472.615390
 8 1048576.0 614.400016 614.400016
-9 2097152.0 702.171410 702.171410
-10 4194304.0 780.190482 768.000002
+9 2097152.0 722.823517 722.823517
+10 4194304.0 780.190482 780.190482
 11 8388608.0 812.429770 812.429770
 12 16777216.0 833.084721 833.084721
-13 33554432.0 842.004273 842.004273
+13 33554432.0 842.004273 843.811163
 14 67108864.0 847.448255 848.362445
 15 134217728.0 849.737435 850.656574
@@ -255,7 +255,7 @@ We can now run the decorated function above. Pass `print_data=True` to see the p
 .. rst-class:: sphx-glr-timing
-**Total running time of the script:** ( 1 minutes 48.554 seconds)
+**Total running time of the script:** ( 1 minutes 39.155 seconds)
 .. _sphx_glr_download_getting-started_tutorials_01-vector-add.py:

@@ -278,17 +278,17 @@ We will then compare its performance against (1) :code:`torch.softmax` and (2) t
 softmax-performance:
 N Triton Torch (native) Torch (jit)
-0 256.0 512.000001 546.133347 186.181817
+0 256.0 512.000001 546.133347 190.511628
 1 384.0 438.857137 558.545450 151.703707
-2 512.0 468.114273 585.142849 154.566038
+2 512.0 481.882344 606.814814 154.566038
 3 640.0 465.454542 640.000002 158.759699
-4 768.0 472.615390 664.216187 163.839992
+4 768.0 463.698115 664.216187 163.839992
 .. ... ... ... ...
-93 12160.0 478.034381 405.333344 198.834951
-94 12288.0 484.256150 415.222812 199.096718
-95 12416.0 461.454135 411.296057 198.755369
-96 12544.0 458.228323 412.971190 198.815254
-97 12672.0 458.196617 411.679167 198.971549
+93 12160.0 479.211815 405.333344 199.038365
+94 12288.0 484.853264 415.222812 199.197579
+95 12416.0 460.384708 412.149375 198.954424
+96 12544.0 457.705824 412.546756 199.012395
+97 12672.0 457.679461 411.679167 199.167004
 [98 rows x 4 columns]
@@ -306,7 +306,7 @@ In the above plot, we can see that:
 .. rst-class:: sphx-glr-timing
-**Total running time of the script:** ( 3 minutes 10.849 seconds)
+**Total running time of the script:** ( 3 minutes 8.135 seconds)
 .. _sphx_glr_download_getting-started_tutorials_02-fused-softmax.py:

@@ -458,37 +458,37 @@ We can now compare the performance of our kernel against that of cuBLAS. Here we
 matmul-performance:
 M cuBLAS ... Triton Triton (+ LeakyReLU)
-0 256.0 2.730667 ... 2.978909 2.978909
-1 384.0 7.372800 ... 8.507077 8.507077
-2 512.0 14.563555 ... 15.420235 15.420235
+0 256.0 2.730667 ... 3.276800 3.276800
+1 384.0 7.372800 ... 7.899428 7.899428
+2 512.0 14.563555 ... 16.384000 16.384000
 3 640.0 22.260869 ... 24.380953 24.380953
-4 768.0 32.768000 ... 34.028308 34.028308
+4 768.0 32.768000 ... 35.389441 34.028308
 5 896.0 37.971025 ... 40.140799 40.140799
-6 1024.0 49.932191 ... 53.773130 52.428801
+6 1024.0 49.932191 ... 53.773130 53.773130
 7 1152.0 45.242181 ... 48.161033 47.396572
 8 1280.0 51.200001 ... 57.690139 57.690139
-9 1408.0 64.138541 ... 68.147202 67.305878
+9 1408.0 64.138541 ... 69.009825 68.147202
 10 1536.0 79.526831 ... 79.526831 79.526831
-11 1664.0 62.929456 ... 62.929456 62.492442
-12 1792.0 72.512412 ... 63.142831 62.790080
-13 1920.0 69.120002 ... 71.626943 70.892307
-14 2048.0 73.584279 ... 78.033565 77.672296
+11 1664.0 63.372618 ... 63.372618 62.929456
+12 1792.0 72.983276 ... 63.499573 63.142831
+13 1920.0 69.467336 ... 71.257735 71.257735
+14 2048.0 73.262953 ... 78.033565 77.672296
 15 2176.0 83.155572 ... 87.115360 86.739860
 16 2304.0 68.251065 ... 78.064941 77.558029
-17 2432.0 71.125224 ... 75.522751 75.320281
-18 2560.0 77.833728 ... 82.331658 81.920002
-19 2688.0 83.645109 ... 90.102270 90.102270
-20 2816.0 80.026067 ... 83.712490 81.981598
-21 2944.0 82.509987 ... 82.852924 82.646820
-22 3072.0 82.540970 ... 86.845249 89.593522
-23 3200.0 84.993363 ... 96.240602 96.096095
-24 3328.0 83.034941 ... 83.905938 82.843841
-25 3456.0 81.849303 ... 89.679166 90.382926
-26 3584.0 85.797134 ... 93.467144 98.483450
-27 3712.0 85.748791 ... 86.754095 88.170647
-28 3840.0 81.798814 ... 87.841141 91.473945
-29 3968.0 85.871877 ... 92.582651 84.738843
-30 4096.0 94.386588 ... 84.894196 82.722796
+17 2432.0 71.305746 ... 75.522751 75.320281
+18 2560.0 77.833728 ... 82.331658 82.125311
+19 2688.0 83.552988 ... 90.748936 90.316801
+20 2816.0 84.035084 ... 83.873477 84.035084
+21 2944.0 82.784108 ... 84.324925 84.040530
+22 3072.0 82.540970 ... 89.170242 89.030036
+23 3200.0 84.768213 ... 95.096582 95.380032
+24 3328.0 83.808259 ... 85.500351 86.424125
+25 3456.0 82.604067 ... 92.033756 91.719645
+26 3584.0 87.466332 ... 92.696281 96.372338
+27 3712.0 86.267139 ... 85.970176 88.248537
+28 3840.0 82.716526 ... 86.400002 91.322872
+29 3968.0 85.871877 ... 92.163097 87.472354
+30 4096.0 93.924229 ... 94.055868 87.097813
 [31 rows x 5 columns]
@@ -498,7 +498,7 @@ We can now compare the performance of our kernel against that of cuBLAS. Here we
 .. rst-class:: sphx-glr-timing
-**Total running time of the script:** ( 6 minutes 44.900 seconds)
+**Total running time of the script:** ( 6 minutes 0.518 seconds)
 .. _sphx_glr_download_getting-started_tutorials_03-matrix-multiplication.py:

@@ -240,7 +240,7 @@ References
 .. rst-class:: sphx-glr-timing
-**Total running time of the script:** ( 0 minutes 0.325 seconds)
+**Total running time of the script:** ( 0 minutes 0.012 seconds)
 .. _sphx_glr_download_getting-started_tutorials_04-low-memory-dropout.py:

@@ -38,36 +38,36 @@ Layer Normalization
 layer-norm-backward:
 N Triton Torch Apex
-0 1024.0 114.306981 98.698793 315.076934
-1 1536.0 118.153850 132.604320 341.333333
-2 2048.0 125.068704 159.584422 332.108094
-3 2560.0 119.300966 180.705883 332.108113
-4 3072.0 123.912607 191.005181 320.556515
-5 3584.0 127.809806 207.267476 310.527060
-6 4096.0 130.723400 220.907859 296.990947
-7 4608.0 105.325718 232.336141 287.999990
-8 5120.0 108.647215 241.414550 283.787523
-9 5632.0 109.714290 241.371422 288.204696
-10 6144.0 112.133840 249.502530 287.438593
-11 6656.0 112.733948 254.775119 285.257135
-12 7168.0 114.994654 257.147993 281.098038
-13 7680.0 115.128047 264.068761 281.834874
-14 8192.0 115.856217 267.493874 282.889211
-15 8704.0 94.054934 264.091015 282.673891
-16 9216.0 96.755908 271.391419 287.625496
-17 9728.0 97.768843 280.615388 288.950501
-18 10240.0 100.105904 285.104413 289.129408
-19 10752.0 101.036809 246.464170 290.594591
-20 11264.0 102.945926 245.536784 286.676558
-21 11776.0 103.601171 249.447482 288.981596
-22 12288.0 106.159829 254.673582 294.911986
-23 12800.0 106.187352 253.884294 289.811310
-24 13312.0 107.463167 251.962147 289.129403
-25 13824.0 107.544896 256.792581 292.313649
-26 14336.0 109.435116 254.673567 287.198654
-27 14848.0 109.276912 255.266469 289.481735
-28 15360.0 110.669469 260.338991 290.496454
-29 15872.0 110.863792 263.071829 291.898841
+0 1024.0 114.306981 97.912354 303.407414
+1 1536.0 118.153850 134.540150 341.333333
+2 2048.0 125.068704 161.154101 334.367350
+3 2560.0 119.766080 181.238943 330.322572
+4 3072.0 124.121216 192.501302 323.368415
+5 3584.0 127.242599 208.271186 311.652167
+6 4096.0 130.549806 220.907859 296.990947
+7 4608.0 105.526723 232.825259 287.251954
+8 5120.0 108.743364 242.366855 284.444444
+9 5632.0 109.625308 243.107920 290.060087
+10 6144.0 112.133840 248.661056 286.879370
+11 6656.0 112.733948 256.000009 285.767438
+12 7168.0 114.917836 260.260201 284.821192
+13 7680.0 115.128047 262.938666 280.121579
+14 8192.0 115.856217 266.767970 284.526763
+15 8704.0 94.182151 267.815384 285.377055
+16 9216.0 96.713601 271.724806 287.999990
+17 9728.0 97.687028 280.615388 290.027323
+18 10240.0 100.065144 286.433562 290.153487
+19 10752.0 101.115983 246.935876 290.594591
+20 11264.0 103.024392 245.536784 286.980888
+21 11776.0 103.373808 249.667843 288.981596
+22 12288.0 106.083454 254.673582 294.617366
+23 12800.0 106.004143 254.094291 288.180121
+24 13312.0 107.067028 253.260416 290.443638
+25 13824.0 107.440415 257.390218 292.056329
+26 14336.0 109.296061 255.051144 286.959121
+27 14848.0 109.143034 257.665934 289.952797
+28 15360.0 110.802526 257.970599 288.000007
+29 15872.0 110.702706 261.806182 290.341468
@@ -339,7 +339,7 @@ Layer Normalization
 .. rst-class:: sphx-glr-timing
-**Total running time of the script:** ( 2 minutes 2.489 seconds)
+**Total running time of the script:** ( 2 minutes 0.332 seconds)
 .. _sphx_glr_download_getting-started_tutorials_05-layer-norm.py:

@@ -5,16 +5,16 @@
 Computation times
 =================
-**13:47.116** total execution time for **getting-started_tutorials** files:
+**12:48.152** total execution time for **getting-started_tutorials** files:
 +---------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_getting-started_tutorials_03-matrix-multiplication.py` (``03-matrix-multiplication.py``) | 06:44.900 | 0.0 MB |
+| :ref:`sphx_glr_getting-started_tutorials_03-matrix-multiplication.py` (``03-matrix-multiplication.py``) | 06:00.518 | 0.0 MB |
 +---------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_getting-started_tutorials_02-fused-softmax.py` (``02-fused-softmax.py``) | 03:10.849 | 0.0 MB |
+| :ref:`sphx_glr_getting-started_tutorials_02-fused-softmax.py` (``02-fused-softmax.py``) | 03:08.135 | 0.0 MB |
 +---------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_getting-started_tutorials_05-layer-norm.py` (``05-layer-norm.py``) | 02:02.489 | 0.0 MB |
+| :ref:`sphx_glr_getting-started_tutorials_05-layer-norm.py` (``05-layer-norm.py``) | 02:00.332 | 0.0 MB |
 +---------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_getting-started_tutorials_01-vector-add.py` (``01-vector-add.py``) | 01:48.554 | 0.0 MB |
+| :ref:`sphx_glr_getting-started_tutorials_01-vector-add.py` (``01-vector-add.py``) | 01:39.155 | 0.0 MB |
 +---------------------------------------------------------------------------------------------------------+-----------+--------+
-| :ref:`sphx_glr_getting-started_tutorials_04-low-memory-dropout.py` (``04-low-memory-dropout.py``) | 00:00.325 | 0.0 MB |
+| :ref:`sphx_glr_getting-started_tutorials_04-low-memory-dropout.py` (``04-low-memory-dropout.py``) | 00:00.012 | 0.0 MB |
 +---------------------------------------------------------------------------------------------------------+-----------+--------+

@@ -328,19 +328,19 @@ for different problem sizes.</p>
 3 32768.0 63.999998 63.999998
 4 65536.0 127.999995 127.999995
 5 131072.0 219.428568 219.428568
-6 262144.0 341.333321 341.333321
+6 262144.0 341.333321 384.000001
 7 524288.0 472.615390 472.615390
 8 1048576.0 614.400016 614.400016
-9 2097152.0 702.171410 702.171410
-10 4194304.0 780.190482 768.000002
+9 2097152.0 722.823517 722.823517
+10 4194304.0 780.190482 780.190482
 11 8388608.0 812.429770 812.429770
 12 16777216.0 833.084721 833.084721
-13 33554432.0 842.004273 842.004273
+13 33554432.0 842.004273 843.811163
 14 67108864.0 847.448255 848.362445
 15 134217728.0 849.737435 850.656574
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes 48.554 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 1 minutes 39.155 seconds)</p>
 <div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-01-vector-add-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/62d97d49a32414049819dd8bb8378080/01-vector-add.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">01-vector-add.py</span></code></a></p>

@@ -369,17 +369,17 @@ We will then compare its performance against (1) <code class="code docutils lite
 <p class="sphx-glr-script-out">Out:</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>softmax-performance:
 N Triton Torch (native) Torch (jit)
-0 256.0 512.000001 546.133347 186.181817
+0 256.0 512.000001 546.133347 190.511628
 1 384.0 438.857137 558.545450 151.703707
-2 512.0 468.114273 585.142849 154.566038
+2 512.0 481.882344 606.814814 154.566038
 3 640.0 465.454542 640.000002 158.759699
-4 768.0 472.615390 664.216187 163.839992
+4 768.0 463.698115 664.216187 163.839992
 .. ... ... ... ...
-93 12160.0 478.034381 405.333344 198.834951
-94 12288.0 484.256150 415.222812 199.096718
-95 12416.0 461.454135 411.296057 198.755369
-96 12544.0 458.228323 412.971190 198.815254
-97 12672.0 458.196617 411.679167 198.971549
+93 12160.0 479.211815 405.333344 199.038365
+94 12288.0 484.853264 415.222812 199.197579
+95 12416.0 460.384708 412.149375 198.954424
+96 12544.0 457.705824 412.546756 199.012395
+97 12672.0 457.679461 411.679167 199.167004
 [98 rows x 4 columns]
 </pre></div>
@@ -392,7 +392,7 @@ We will then compare its performance against (1) <code class="code docutils lite
 Note however that the PyTorch <cite>softmax</cite> operation is more general and will work on tensors of any shape.</p></li>
 </ul>
 </div></blockquote>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes 10.849 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 3 minutes 8.135 seconds)</p>
 <div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-02-fused-softmax-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/d91442ac2982c4e0cc3ab0f43534afbc/02-fused-softmax.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">02-fused-softmax.py</span></code></a></p>

@@ -564,42 +564,42 @@ torch_output=tensor([[ 1.1045, -36.9688, 31.4688, ..., -11.3906, 24.4531, -3
 <p class="sphx-glr-script-out">Out:</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>matmul-performance:
 M cuBLAS ... Triton Triton (+ LeakyReLU)
-0 256.0 2.730667 ... 2.978909 2.978909
-1 384.0 7.372800 ... 8.507077 8.507077
-2 512.0 14.563555 ... 15.420235 15.420235
+0 256.0 2.730667 ... 3.276800 3.276800
+1 384.0 7.372800 ... 7.899428 7.899428
+2 512.0 14.563555 ... 16.384000 16.384000
 3 640.0 22.260869 ... 24.380953 24.380953
-4 768.0 32.768000 ... 34.028308 34.028308
+4 768.0 32.768000 ... 35.389441 34.028308
 5 896.0 37.971025 ... 40.140799 40.140799
-6 1024.0 49.932191 ... 53.773130 52.428801
+6 1024.0 49.932191 ... 53.773130 53.773130
 7 1152.0 45.242181 ... 48.161033 47.396572
 8 1280.0 51.200001 ... 57.690139 57.690139
-9 1408.0 64.138541 ... 68.147202 67.305878
+9 1408.0 64.138541 ... 69.009825 68.147202
 10 1536.0 79.526831 ... 79.526831 79.526831
-11 1664.0 62.929456 ... 62.929456 62.492442
-12 1792.0 72.512412 ... 63.142831 62.790080
-13 1920.0 69.120002 ... 71.626943 70.892307
-14 2048.0 73.584279 ... 78.033565 77.672296
+11 1664.0 63.372618 ... 63.372618 62.929456
+12 1792.0 72.983276 ... 63.499573 63.142831
+13 1920.0 69.467336 ... 71.257735 71.257735
+14 2048.0 73.262953 ... 78.033565 77.672296
 15 2176.0 83.155572 ... 87.115360 86.739860
 16 2304.0 68.251065 ... 78.064941 77.558029
-17 2432.0 71.125224 ... 75.522751 75.320281
-18 2560.0 77.833728 ... 82.331658 81.920002
-19 2688.0 83.645109 ... 90.102270 90.102270
-20 2816.0 80.026067 ... 83.712490 81.981598
-21 2944.0 82.509987 ... 82.852924 82.646820
-22 3072.0 82.540970 ... 86.845249 89.593522
-23 3200.0 84.993363 ... 96.240602 96.096095
-24 3328.0 83.034941 ... 83.905938 82.843841
-25 3456.0 81.849303 ... 89.679166 90.382926
-26 3584.0 85.797134 ... 93.467144 98.483450
-27 3712.0 85.748791 ... 86.754095 88.170647
-28 3840.0 81.798814 ... 87.841141 91.473945
-29 3968.0 85.871877 ... 92.582651 84.738843
-30 4096.0 94.386588 ... 84.894196 82.722796
+17 2432.0 71.305746 ... 75.522751 75.320281
+18 2560.0 77.833728 ... 82.331658 82.125311
+19 2688.0 83.552988 ... 90.748936 90.316801
+20 2816.0 84.035084 ... 83.873477 84.035084
+21 2944.0 82.784108 ... 84.324925 84.040530
+22 3072.0 82.540970 ... 89.170242 89.030036
+23 3200.0 84.768213 ... 95.096582 95.380032
+24 3328.0 83.808259 ... 85.500351 86.424125
+25 3456.0 82.604067 ... 92.033756 91.719645
+26 3584.0 87.466332 ... 92.696281 96.372338
+27 3712.0 86.267139 ... 85.970176 88.248537
+28 3840.0 82.716526 ... 86.400002 91.322872
+29 3968.0 85.871877 ... 92.163097 87.472354
+30 4096.0 93.924229 ... 94.055868 87.097813
 [31 rows x 5 columns]
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 6 minutes 44.900 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 6 minutes 0.518 seconds)</p>
 <div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-03-matrix-multiplication-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/d5fee5b55a64e47f1b5724ec39adf171/03-matrix-multiplication.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">03-matrix-multiplication.py</span></code></a></p>

@@ -372,7 +372,7 @@ to explore the <cite>triton/language/random</cite> folder!</p>
 <dd><p>Nitish Srivastava and Geoffrey Hinton and Alex Krizhevsky and Ilya Sutskever and Ruslan Salakhutdinov, “Dropout: A Simple Way to Prevent Neural Networks from Overfitting”, JMLR 2014</p>
 </dd>
 </dl>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 0 minutes 0.325 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 0 minutes 0.012 seconds)</p>
 <div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-04-low-memory-dropout-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/c9aed78977a4c05741d675a38dde3d7d/04-low-memory-dropout.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">04-low-memory-dropout.py</span></code></a></p>

@@ -194,36 +194,36 @@ to download the full example code</p>
 <p class="sphx-glr-script-out">Out:</p>
 <div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>layer-norm-backward:
 N Triton Torch Apex
-0 1024.0 114.306981 98.698793 315.076934
-1 1536.0 118.153850 132.604320 341.333333
-2 2048.0 125.068704 159.584422 332.108094
-3 2560.0 119.300966 180.705883 332.108113
-4 3072.0 123.912607 191.005181 320.556515
-5 3584.0 127.809806 207.267476 310.527060
-6 4096.0 130.723400 220.907859 296.990947
-7 4608.0 105.325718 232.336141 287.999990
-8 5120.0 108.647215 241.414550 283.787523
-9 5632.0 109.714290 241.371422 288.204696
-10 6144.0 112.133840 249.502530 287.438593
-11 6656.0 112.733948 254.775119 285.257135
-12 7168.0 114.994654 257.147993 281.098038
-13 7680.0 115.128047 264.068761 281.834874
-14 8192.0 115.856217 267.493874 282.889211
-15 8704.0 94.054934 264.091015 282.673891
-16 9216.0 96.755908 271.391419 287.625496
-17 9728.0 97.768843 280.615388 288.950501
-18 10240.0 100.105904 285.104413 289.129408
-19 10752.0 101.036809 246.464170 290.594591
-20 11264.0 102.945926 245.536784 286.676558
-21 11776.0 103.601171 249.447482 288.981596
-22 12288.0 106.159829 254.673582 294.911986
-23 12800.0 106.187352 253.884294 289.811310
-24 13312.0 107.463167 251.962147 289.129403
-25 13824.0 107.544896 256.792581 292.313649
-26 14336.0 109.435116 254.673567 287.198654
-27 14848.0 109.276912 255.266469 289.481735
-28 15360.0 110.669469 260.338991 290.496454
-29 15872.0 110.863792 263.071829 291.898841
+0 1024.0 114.306981 97.912354 303.407414
+1 1536.0 118.153850 134.540150 341.333333
+2 2048.0 125.068704 161.154101 334.367350
+3 2560.0 119.766080 181.238943 330.322572
+4 3072.0 124.121216 192.501302 323.368415
+5 3584.0 127.242599 208.271186 311.652167
+6 4096.0 130.549806 220.907859 296.990947
+7 4608.0 105.526723 232.825259 287.251954
+8 5120.0 108.743364 242.366855 284.444444
+9 5632.0 109.625308 243.107920 290.060087
+10 6144.0 112.133840 248.661056 286.879370
+11 6656.0 112.733948 256.000009 285.767438
+12 7168.0 114.917836 260.260201 284.821192
+13 7680.0 115.128047 262.938666 280.121579
+14 8192.0 115.856217 266.767970 284.526763
+15 8704.0 94.182151 267.815384 285.377055
+16 9216.0 96.713601 271.724806 287.999990
+17 9728.0 97.687028 280.615388 290.027323
+18 10240.0 100.065144 286.433562 290.153487
+19 10752.0 101.115983 246.935876 290.594591
+20 11264.0 103.024392 245.536784 286.980888
+21 11776.0 103.373808 249.667843 288.981596
+22 12288.0 106.083454 254.673582 294.617366
+23 12800.0 106.004143 254.094291 288.180121
+24 13312.0 107.067028 253.260416 290.443638
+25 13824.0 107.440415 257.390218 292.056329
+26 14336.0 109.296061 255.051144 286.959121
+27 14848.0 109.143034 257.665934 289.952797
+28 15360.0 110.802526 257.970599 288.000007
+29 15872.0 110.702706 261.806182 290.341468
 </pre></div>
 </div>
 <div class="line-block">
@@ -487,7 +487,7 @@ to download the full example code</p>
 <span class="n">bench_layer_norm</span><span class="o">.</span><span class="n">run</span><span class="p">(</span><span class="n">save_path</span><span class="o">=</span><span class="s1">&#39;.&#39;</span><span class="p">,</span> <span class="n">print_data</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 2 minutes 2.489 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> ( 2 minutes 0.332 seconds)</p>
 <div class="sphx-glr-footer class sphx-glr-footer-example docutils container" id="sphx-glr-download-getting-started-tutorials-05-layer-norm-py">
 <div class="sphx-glr-download sphx-glr-download-python docutils container">
 <p><a class="reference download internal" download="" href="../../_downloads/935c0dd0fbeb4b2e69588471cbb2d4b2/05-layer-norm.py"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Python</span> <span class="pre">source</span> <span class="pre">code:</span> <span class="pre">05-layer-norm.py</span></code></a></p>

@@ -174,7 +174,7 @@
 <div class="section" id="computation-times">
 <span id="sphx-glr-getting-started-tutorials-sg-execution-times"></span><h1>Computation times<a class="headerlink" href="#computation-times" title="Permalink to this headline"></a></h1>
-<p><strong>13:47.116</strong> total execution time for <strong>getting-started_tutorials</strong> files:</p>
+<p><strong>12:48.152</strong> total execution time for <strong>getting-started_tutorials</strong> files:</p>
 <table class="docutils align-default">
 <colgroup>
 <col style="width: 85%" />
@@ -183,23 +183,23 @@
 </colgroup>
 <tbody>
 <tr class="row-odd"><td><p><a class="reference internal" href="03-matrix-multiplication.html#sphx-glr-getting-started-tutorials-03-matrix-multiplication-py"><span class="std std-ref">Matrix Multiplication</span></a> (<code class="docutils literal notranslate"><span class="pre">03-matrix-multiplication.py</span></code>)</p></td>
-<td><p>06:44.900</p></td>
+<td><p>06:00.518</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="02-fused-softmax.html#sphx-glr-getting-started-tutorials-02-fused-softmax-py"><span class="std std-ref">Fused Softmax</span></a> (<code class="docutils literal notranslate"><span class="pre">02-fused-softmax.py</span></code>)</p></td>
-<td><p>03:10.849</p></td>
+<td><p>03:08.135</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="05-layer-norm.html#sphx-glr-getting-started-tutorials-05-layer-norm-py"><span class="std std-ref">Layer Normalization</span></a> (<code class="docutils literal notranslate"><span class="pre">05-layer-norm.py</span></code>)</p></td>
-<td><p>02:02.489</p></td>
+<td><p>02:00.332</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-even"><td><p><a class="reference internal" href="01-vector-add.html#sphx-glr-getting-started-tutorials-01-vector-add-py"><span class="std std-ref">Vector Addition</span></a> (<code class="docutils literal notranslate"><span class="pre">01-vector-add.py</span></code>)</p></td>
-<td><p>01:48.554</p></td>
+<td><p>01:39.155</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 <tr class="row-odd"><td><p><a class="reference internal" href="04-low-memory-dropout.html#sphx-glr-getting-started-tutorials-04-low-memory-dropout-py"><span class="std std-ref">Low-Memory Dropout</span></a> (<code class="docutils literal notranslate"><span class="pre">04-low-memory-dropout.py</span></code>)</p></td>
-<td><p>00:00.325</p></td>
+<td><p>00:00.012</p></td>
 <td><p>0.0 MB</p></td>
 </tr>
 </tbody>

File diff suppressed because one or more lines are too long

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 859316008f2f6719727d4fb3b970c7e2
+config: 0f666e711caacc26933f91d5b974ab34
 tags: 645f666f9bcd5a90fca523b33c5a78b7

Some files were not shown because too many files have changed in this diff.