Logistic regression cost function.
I am going over the lectures on Machine Learning at Coursera. I am struggling with the following: how can the partial derivative of

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^m \left[y^i\log\big(h_\theta(x^i)\big)+(1-y^i)\log\big(1-h_\theta(x^i)\big)\right],$$

where $h_\theta(x)$ is defined as

$$h_\theta(x)=\frac{1}{1+e^{-\theta x}},$$

turn out to be

$$\frac{\partial}{\partial\theta_j}J(\theta)=\frac{1}{m}\sum_{i=1}^m\big(h_\theta(x^i)-y^i\big)x^i_j?$$

In other words, how would we go about calculating the partial derivative with respect to $\theta$ of the cost function (the logs are natural logarithms)?

The reason is the following. We use the notation

$$\theta x^i:=\theta_0+\theta_1 x^i_1+\dots+\theta_p x^i_p.$$

Then

$$\log h_\theta(x^i)=\log\frac{1}{1+e^{-\theta x^i}}=-\log\big(1+e^{-\theta x^i}\big),$$

$$\log\big(1-h_\theta(x^i)\big)=\log\frac{e^{-\theta x^i}}{1+e^{-\theta x^i}}=-\theta x^i-\log\big(1+e^{-\theta x^i}\big).$$

Since our original cost function is of the form

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^m \left[y^i\log\big(h_\theta(x^i)\big)+(1-y^i)\log\big(1-h_\theta(x^i)\big)\right],$$

plugging in the two simplified expressions above, we obtain

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^m \left[-y^i\log\big(1+e^{-\theta x^i}\big)+(1-y^i)\big(-\theta x^i-\log(1+e^{-\theta x^i})\big)\right],$$

which can be simplified to

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^m \left[y^i\theta x^i-\theta x^i-\log\big(1+e^{-\theta x^i}\big)\right]=-\frac{1}{m}\sum_{i=1}^m \left[y^i\theta x^i-\log\big(1+e^{\theta x^i}\big)\right],$$

where the second equality uses $-\theta x^i-\log\big(1+e^{-\theta x^i}\big)=-\log\big(e^{\theta x^i}(1+e^{-\theta x^i})\big)=-\log\big(1+e^{\theta x^i}\big)$. All that remains is to differentiate with respect to $\theta_j$:

$$\frac{\partial}{\partial\theta_j}J(\theta)=-\frac{1}{m}\sum_{i=1}^m\left[y^i x^i_j-\frac{x^i_j\,e^{\theta x^i}}{1+e^{\theta x^i}}\right]=\frac{1}{m}\sum_{i=1}^m\big(h_\theta(x^i)-y^i\big)x^i_j,$$

since $\dfrac{e^{\theta x^i}}{1+e^{\theta x^i}}=\dfrac{1}{1+e^{-\theta x^i}}=h_\theta(x^i)$.
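The algebra above can be sanity-checked numerically. The sketch below (my own illustration, not from the lectures; the dataset and function names are made up) verifies on random data that the simplified cost equals the original cross-entropy form, and that the analytic gradient $\frac{1}{m}\sum_i (h_\theta(x^i)-y^i)x^i_j$ matches a central-difference approximation:

```python
import numpy as np

# Hypothetical small dataset just for checking the derivation.
rng = np.random.default_rng(0)
m, p = 50, 3
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, p))])  # prepend x_0 = 1
y = rng.integers(0, 2, size=m).astype(float)
theta = rng.normal(size=p + 1)

def h(theta, X):
    """Sigmoid hypothesis h_theta(x) = 1 / (1 + e^{-theta x})."""
    return 1.0 / (1.0 + np.exp(-X @ theta))

def J_original(theta, X, y):
    """Cross-entropy form: -(1/m) sum [y log h + (1-y) log(1-h)]."""
    ht = h(theta, X)
    return -np.mean(y * np.log(ht) + (1 - y) * np.log(1 - ht))

def J_simplified(theta, X, y):
    """Simplified form: -(1/m) sum [y^i theta x^i - log(1 + e^{theta x^i})]."""
    z = X @ theta
    return -np.mean(y * z - np.log(1 + np.exp(z)))

def grad_analytic(theta, X, y):
    """(1/m) sum (h_theta(x^i) - y^i) x^i_j, for each component j."""
    return X.T @ (h(theta, X) - y) / m

# Numerical gradient of the original cost via central differences.
eps = 1e-6
grad_num = np.array([
    (J_original(theta + eps * e, X, y) - J_original(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(p + 1)
])

assert np.isclose(J_original(theta, X, y), J_simplified(theta, X, y))
assert np.allclose(grad_analytic(theta, X, y), grad_num, atol=1e-5)
```

In practice `np.log(1 + np.exp(z))` can overflow for large `z`; a production implementation would use a numerically stable form such as `np.logaddexp(0, z)`.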