From c385c704235b49abaf4df0b9829abccd2264ece9 Mon Sep 17 00:00:00 2001
From: Eric Leung
Date: Sat, 13 Jun 2020 02:41:00 -0700
Subject: [PATCH] fix(learn): use more precise abbreviation for ReLU (#39065)

---
 .../tensorflow/neural-networks-activation-functions.english.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/curriculum/challenges/english/11-machine-learning-with-python/tensorflow/neural-networks-activation-functions.english.md b/curriculum/challenges/english/11-machine-learning-with-python/tensorflow/neural-networks-activation-functions.english.md
index 9639147a8c..2b18031b3b 100644
--- a/curriculum/challenges/english/11-machine-learning-with-python/tensorflow/neural-networks-activation-functions.english.md
+++ b/curriculum/challenges/english/11-machine-learning-with-python/tensorflow/neural-networks-activation-functions.english.md
@@ -17,7 +17,7 @@ videoId: S45tqW6BqRs
 question:
   text: Which activation function switches values between -1 and 1?
   answers:
-    - Relu (Rectified Linear Unit)
+    - ReLU (Rectified Linear Unit)
     - Tanh (Hyperbolic Tangent)
     - Sigmoid
   solution: 2
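
For context, a minimal Python sketch (not part of the patch) checking the quiz's answer key numerically: the solution is 2 because Tanh is the only listed activation bounded between -1 and 1, while ReLU is non-negative and Sigmoid stays within (0, 1). The NumPy calls below are standard; the ReLU and sigmoid expressions are written out inline for illustration.

```python
import numpy as np

# Sample a range of inputs to probe each activation's output range.
x = np.linspace(-5.0, 5.0, 1001)

relu = np.maximum(0.0, x)           # ReLU: outputs in [0, inf)
tanh = np.tanh(x)                   # Tanh: outputs in (-1, 1)
sigmoid = 1.0 / (1.0 + np.exp(-x))  # Sigmoid: outputs in (0, 1)

print("ReLU range:   ", relu.min(), "to", relu.max())        # 0.0 to 5.0
print("Tanh range:   ", tanh.min(), "to", tanh.max())        # ~-1.0 to ~1.0
print("Sigmoid range:", sigmoid.min(), "to", sigmoid.max())  # ~0.007 to ~0.993
```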