From 83cfea8952a1e537fb958f893399be4d8f1f5b46 Mon Sep 17 00:00:00 2001
From: MatejStraka
Date: Wed, 14 Nov 2018 05:18:40 +0100
Subject: [PATCH] typo fix (#25627)

---
 .../machine-learning/deep-learning/gradient-descent/index.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/guide/english/machine-learning/deep-learning/gradient-descent/index.md b/guide/english/machine-learning/deep-learning/gradient-descent/index.md
index 862822e563..c2dff760ad 100644
--- a/guide/english/machine-learning/deep-learning/gradient-descent/index.md
+++ b/guide/english/machine-learning/deep-learning/gradient-descent/index.md
@@ -20,10 +20,10 @@ This is where feature scaling, also called normalization, comes in handy, to mak
 Machine learning problems usually requires computations over a sample size in the millions, and that could be very computationally intensive.
 
-In stochastic gradient descent you update the the parameter for the cost gradient of each example rather that the sum of the cost gradient of all the examples. You could arrive at a set of good parameters faster after only a few passes through the training examples, thus the learning is faster as well.
+In stochastic gradient descent you update the parameter for the cost gradient of each example rather than the sum of the cost gradient of all the examples. You could arrive at a set of good parameters faster after only a few passes through the training examples, thus the learning is faster as well.
 
 ### Further Reading
 * [A guide to Neural Networks and Deep Learning](http://neuralnetworksanddeeplearning.com/)
 * [Gradient Descent For Machine Learning](https://machinelearningmastery.com/gradient-descent-for-machine-learning/)
-* [Difference between Batch Gradient Descent and Stochastic Gradient Descent](https://towardsdatascience.com/difference-between-batch-gradient-descent-and-stochastic-gradient-descent-1187f1291aa1)
\ No newline at end of file
+* [Difference between Batch Gradient Descent and Stochastic Gradient Descent](https://towardsdatascience.com/difference-between-batch-gradient-descent-and-stochastic-gradient-descent-1187f1291aa1)
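
As an illustrative aside (not taken from the patched guide), the contrast drawn by the corrected sentence, one parameter update from each example's gradient versus one update from the sum of all examples' gradients, can be sketched in a few lines of Python for a toy linear model with squared-error loss. Every name and value here (`X`, `y`, `w`, `b`, `lr`) is hypothetical.

```python
# Minimal sketch: batch vs. stochastic updates for a toy linear model
# y ~ w*x + b with squared-error loss. All data and hyperparameters are invented.
import random

X = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.1, 6.2, 7.9]
lr = 0.01

# Batch gradient descent: one update per pass over the data, using the
# mean of the per-example gradients.
w, b = 0.0, 0.0
for epoch in range(100):
    grad_w = sum(2 * (w * xi + b - yi) * xi for xi, yi in zip(X, y)) / len(X)
    grad_b = sum(2 * (w * xi + b - yi) for xi, yi in zip(X, y)) / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

# Stochastic gradient descent: one update per example, so the parameters
# move many times within a single pass over the (shuffled) data.
w, b = 0.0, 0.0
for epoch in range(100):
    for xi, yi in random.sample(list(zip(X, y)), len(X)):
        err = w * xi + b - yi
        w -= lr * 2 * err * xi
        b -= lr * 2 * err
```

Because the stochastic loop applies `len(X)` updates per pass instead of one, the parameters can reach a good region after far fewer passes, which is the speed-up the corrected paragraph describes.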