edit(guide): simplify cost function (#29668)

Simplify the cost function for logistic regression
This commit is contained in:
Varian Caesar
2019-02-15 03:45:14 +07:00
committed by Randell Dawson
parent a83143edb7
commit c900cec85e


@ -31,10 +31,13 @@ J(θ) = (1/m) ∑ Cost(hθ(x(i)), y(i)), where the summation runs from i = 1 to m.
Where hθ(x) is the hypothesis value, calculated from the attributes and weights, which are computed and balanced via an algorithm such as gradient descent,
and y is the corresponding value from the observation data set.
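The hypothesis hθ(x) described above can be sketched as a minimal Python function. This is an illustrative sketch, not code from the guide; the names `sigmoid` and `hypothesis` are assumptions:

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = sigmoid(theta . x): the predicted probability that y = 1."""
    return sigmoid(sum(t * xi for t, xi in zip(theta, x)))
```

For example, with all weights zero the dot product is 0 and the hypothesis returns 0.5, i.e. maximum uncertainty.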
Here the cost function is not the sigmoid function itself; instead, two log functions, which perform with greater efficiency without
penalizing the learning algorithm, are used.
Cost(hθ(x), y) = −log(hθ(x)) if y = 1
Cost(hθ(x), y) = −log(1 − hθ(x)) if y = 0
Here the cost function is not the sigmoid function itself; instead, two log functions, which perform with greater efficiency without penalizing the learning algorithm, are used:
* Cost(hθ(x), y) = −log(hθ(x)) if y = 1
* Cost(hθ(x), y) = −log(1 − hθ(x)) if y = 0
Which we can simplify into a single expression:
Cost(hθ(x), y) = −y·log(hθ(x)) − (1 − y)·log(1 − hθ(x))
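The simplified cost above, and the averaged cost J(θ), can be sketched in Python. This is a minimal illustration under assumed names (`cost`, `total_cost`), not code from the guide; it takes predicted probabilities directly rather than computing them from θ:

```python
import math

def cost(h, y):
    """Per-example logistic cost: -y*log(h) - (1-y)*log(1-h).

    h is the predicted probability h_theta(x) in (0, 1); y is the 0/1 label.
    """
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

def total_cost(hs, ys):
    """J(theta) = (1/m) * sum of per-example costs over the data set."""
    m = len(ys)
    return sum(cost(h, y) for h, y in zip(hs, ys)) / m
```

Note how the single expression recovers the two cases: when y = 1 the second term vanishes, leaving −log(h); when y = 0 the first term vanishes, leaving −log(1 − h). Confident wrong predictions (h near 0 with y = 1) are penalized heavily, since −log(h) grows without bound.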
Refer to this article to clear up the basics: https://www.analyticsvidhya.com/blog/2017/06/a-comprehensive-guide-for-linear-ridge-and-lasso-regression/