Wednesday, August 7, 2019

How is the cost function from Logistic Regression derived? - Cross Validated

Logistic regression cost function. I am doing the Stanford Machine Learning course on Coursera. In the chapter on Logistic Regression, the cost function is

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log\left(h_\theta(x^{(i)})\right) + \left(1-y^{(i)}\right)\log\left(1-h_\theta(x^{(i)})\right)\right],$$

and its derivative is then given as

$$\frac{\partial}{\partial\theta_j}J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}.$$

I tried taking the derivative of the cost function myself, but I got something completely different. How is the derivative obtained? What are the intermediate steps?

Adapted from the notes in the course, which I don't see available (including this derivation) outside the notes contributed by students within the page of Andrew Ng's Coursera Machine Learning course. In what follows, the superscript $(i)$ denotes individual measurements or training "examples." The derivative of the sigmoid function is $\sigma'(x) = \sigma(x)\left(1-\sigma(x)\right)$.

To avoid the impression of excessive complexity of the matter, let us just look at the structure of the solution. With some simplification and abuse of notation, let $G(\theta)$ be one term in the sum of $J(\theta)$, and let $h = 1/(1+e^{-z})$ be a function of $z(\theta) = x\theta$:

$$G = y \cdot \log(h) + (1-y)\cdot \log(1-h).$$

We may use the chain rule,

$$\frac{dG}{d\theta} = \frac{dG}{dh}\,\frac{dh}{dz}\,\frac{dz}{d\theta},$$

and solve it one factor at a time ($x$ and $y$ are constants):

$$\frac{dG}{dh} = \frac{y}{h} - \frac{1-y}{1-h} = \frac{y-h}{h(1-h)}.$$

For the sigmoid, $\frac{dh}{dz} = h(1-h)$ holds, which is just the denominator of the previous statement, and $\frac{dz}{d\theta} = x$. Combining all the results together gives the sought-for expression:

$$\frac{dG}{d\theta} = (y-h)\,x.$$

Hope that helps.
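The sketch below is not from the original thread; it is a minimal numerical check of the result, assuming the standard vectorized form $\nabla_\theta J = \frac{1}{m} X^\top (h - y)$, which follows from $\frac{dG}{d\theta} = (y-h)x$ together with the leading $-\frac{1}{m}$ in $J(\theta)$. The names `sigmoid`, `cost`, and `gradient` are illustrative, not taken from the course materials. It compares the analytic gradient with a central finite-difference approximation of $J(\theta)$ on random data.

```python
# Numerical check of the derivation above (illustrative sketch, not from the
# original post): compare the analytic gradient (1/m) * X^T (h - y) with a
# finite-difference approximation of J(theta).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ]
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

def gradient(theta, X, y):
    # dJ/dtheta = (1/m) * X^T (h - y), the expression derived above
    m = len(y)
    h = sigmoid(X @ theta)
    return X.T @ (h - y) / m

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.integers(0, 2, size=50).astype(float)
theta = rng.normal(size=3)

analytic = gradient(theta, X, y)

# Central finite differences, one coordinate of theta at a time
eps = 1e-6
numeric = np.zeros_like(theta)
for j in range(len(theta)):
    e = np.zeros_like(theta)
    e[j] = eps
    numeric[j] = (cost(theta + e, X, y) - cost(theta - e, X, y)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # ~1e-10: the two gradients agree
```

If the analytic expression were wrong, the printed discrepancy would be on the order of the gradient itself rather than of the finite-difference truncation error.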
