fix(learn): use more precise abbreviation for ReLU (#39065)

Eric Leung
2020-06-13 02:41:00 -07:00
committed by GitHub
parent b4926052f4
commit c385c70423


@@ -17,7 +17,7 @@ videoId: S45tqW6BqRs
 question:
   text: Which activation function switches values between -1 and 1?
   answers:
-    - Relu (Rectified Linear Unit)
+    - ReLU (Rectified Linear Unit)
     - Tanh (Hyperbolic Tangent)
     - Sigmoid
   solution: 2
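
For context on why Tanh is the correct answer (solution 2), here is a minimal Python sketch, not part of the commit itself, illustrating the output ranges of the three listed activation functions: Tanh squashes inputs into (-1, 1), Sigmoid into (0, 1), and ReLU clamps to [0, ∞). The function names and NumPy usage are illustrative assumptions, not code from this repository.

import numpy as np

def relu(x):
    # ReLU (Rectified Linear Unit): negatives become 0, so range is [0, inf)
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes any input into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Tanh (Hyperbolic Tangent): squashes any input into (-1, 1),
    # which is what the quiz question asks about
    return np.tanh(x)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print("relu:   ", relu(x))     # [0. 0. 0. 1. 5.]
print("sigmoid:", sigmoid(x))  # ~[0.007 0.269 0.5   0.731 0.993]
print("tanh:   ", tanh(x))     # ~[-1.0  -0.762 0.   0.762 1.0  ]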