fix(learn): use more precise abbreviation for ReLU (#39065)
parent b4926052f4
commit c385c70423
@@ -17,7 +17,7 @@ videoId: S45tqW6BqRs
 question:
   text: Which activation function switches values between -1 and 1?
   answers:
-    - Relu (Rectified Linear Unit)
+    - ReLU (Rectified Linear Unit)
     - Tanh (Hyperbolic Tangent)
     - Sigmoid
   solution: 2
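For context, the intended answer (solution 2) is Tanh: the hyperbolic tangent squashes any real input into the open interval (-1, 1), whereas ReLU outputs max(0, x) and is unbounded above. A minimal illustrative sketch, assuming NumPy is available (not part of the commit):

import numpy as np

def relu(x):
    # ReLU (Rectified Linear Unit): negative inputs become 0, positives pass through
    return np.maximum(0.0, x)

def tanh(x):
    # Tanh (Hyperbolic Tangent): output always lies strictly between -1 and 1
    return np.tanh(x)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print("relu:", relu(x))   # [0. 0. 0. 1. 5.]  -- clipped at 0, unbounded above
print("tanh:", tanh(x))   # values in (-1, 1), e.g. tanh(5) is roughly 0.9999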