fix(learn): use more precise abbreviation for ReLU (#39065)
@@ -17,7 +17,7 @@ videoId: S45tqW6BqRs
 question:
   text: Which activation function switches values between -1 and 1?
   answers:
-    - Relu (Rectified Linear Unit)
+    - ReLU (Rectified Linear Unit)
     - Tanh (Hyperbolic Tangent)
     - Sigmoid
   solution: 2
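For context, the corrected answer list still points to Tanh (solution: 2), the only option whose output spans -1 to 1; ReLU outputs values in [0, +inf) and Sigmoid in (0, 1). A minimal sketch illustrating those ranges, assuming NumPy is available (the curriculum file itself contains no code):

```python
# Illustrative only: output ranges of the three activations named in the question.
import numpy as np

x = np.linspace(-10, 10, 1001)

relu = np.maximum(0, x)          # ReLU:    [0, +inf) -- never negative
tanh = np.tanh(x)                # Tanh:    (-1, 1)   -- switches between -1 and 1
sigmoid = 1 / (1 + np.exp(-x))   # Sigmoid: (0, 1)    -- never negative

print("ReLU range:    [%.3f, %.3f]" % (relu.min(), relu.max()))
print("Tanh range:    [%.3f, %.3f]" % (tanh.min(), tanh.max()))
print("Sigmoid range: [%.3f, %.3f]" % (sigmoid.min(), sigmoid.max()))
```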