Monday, July 29, 2019

Coursera ML - Implementing regularized logistic regression cost function in python - Stack Overflow

I am working through Andrew Ng's Machine Learning on Coursera, implementing all the code in Python rather than MATLAB. In Programming Exercise 3, I implemented my regularized logistic regression cost function in vectorized form: On the following test inputs: the above cost function outputs 3.734819396109744. However, according to the skeleton MATLAB code provided to us, the correct output should be 2.534819. I'm puzzled: I believe my cost function has a bug, but I cannot find anything wrong with it.
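
For reference, here is a minimal sketch of a vectorized regularized cost function in Python (this is not the code from the question, and the function and variable names, as well as the use of scipy's expit, are my own assumptions); the key detail is that the intercept theta[0] is excluded from the penalty:

    import numpy as np
    from scipy.special import expit  # numerically stable sigmoid

    def lr_cost_function(theta, X, y, lam):
        """Regularized logistic regression cost, vectorized (illustrative sketch).

        theta : (n,) parameter vector, theta[0] is the intercept
        X     : (m, n) design matrix whose first column is all ones
        y     : (m,) labels in {0, 1}
        lam   : regularization strength
        """
        m = y.size
        h = expit(X @ theta)                                    # hypothesis h_theta(x)
        # unregularized cross-entropy term
        cost = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
        # penalty term: note that theta[0] (the intercept) is NOT regularized
        reg = lam / (2 * m) * np.sum(theta[1:] ** 2)
        return cost + reg

One hedged observation: if the test inputs match the course's usual check for this exercise (lambda = 3, m = 5, intercept theta[0] = -2), then (lambda / (2m)) * theta[0]^2 = 1.2, which is exactly the gap between 3.734819396109744 and 2.534819; that would be consistent with the penalty accidentally including theta[0].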
In fact, I also implemented it in Programming Exercise 2 for the binary classification case, and it works fine there, giving a result close to the expected value. I thought one reason could be that I had constructed my *_test input arrays wrongly by misinterpreting the provided skeleton MATLAB code; the arrays are: However, I ran them through an Octave interpreter to see what they actually are and made sure I could match them exactly in Python. Furthermore, the gradient computed from these inputs with my own vectorized and regularized gradient function is also correct. Lastly, I decided to just proceed with the computation and examine the prediction results. The accuracy of my predictions was far lower than expected, which gives all the more reason to suspect that something is wrong with my cost function and that it is throwing everything else off.
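
For comparison, a matching sketch of the regularized gradient (again with assumed names, not the asker's code), where the intercept is likewise left out of the penalty. If the cost penalizes theta[0] but the gradient does not, the two become inconsistent, and an optimizer that relies on both can converge poorly, which would also fit the drop in prediction accuracy:

    import numpy as np
    from scipy.special import expit

    def lr_gradient(theta, X, y, lam):
        """Gradient of the regularized logistic regression cost (illustrative sketch)."""
        m = y.size
        h = expit(X @ theta)               # predicted probabilities
        grad = X.T @ (h - y) / m           # unregularized gradient
        grad[1:] += (lam / m) * theta[1:]  # penalize every term except the intercept
        return grad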
