Regularization in Logistic Regression: Same Idea, Different Function


Well, much like the regularization we applied to linear regression, the regularized cost function for logistic regression is defined by:
$$J(\mathbf{w}, b) = -\frac{1}{m} \sum_{i=1}^m \left[ y^{(i)} \log(f_{\mathbf{w}, b}(\mathbf{x}^{(i)})) + (1 - y^{(i)}) \log(1 - f_{\mathbf{w}, b}(\mathbf{x}^{(i)})) \right] + \frac{\lambda}{2m} \sum_{j=1}^n w_j^2$$
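To make this concrete, here is a minimal NumPy sketch of that cost function. The names `sigmoid`, `regularized_cost`, and `lam` (standing in for $\lambda$) are my own choices for illustration, not from any particular library:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(X, y, w, b, lam):
    # Cross-entropy cost plus the L2 penalty (lambda / 2m) * sum(w_j^2).
    # Note that the bias b is not included in the penalty term.
    m = X.shape[0]
    f = sigmoid(X @ w + b)                 # f_wb(x^(i)) for every example
    cross_entropy = -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))
    penalty = (lam / (2 * m)) * np.sum(w ** 2)
    return cross_entropy + penalty
```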
As with linear regression, only the weights $w_j$ appear in the penalty; the bias $b$ is not regularized. Gradient descent is also similar to the regularized linear regression case, with the same form of update rules:
$$w_j := w_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^m \left( f_{\mathbf{w}, b}(\mathbf{x}^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} w_j \right]$$
$$b := b - \alpha \frac{1}{m} \sum_{i=1}^m \left( f_{\mathbf{w}, b}(\mathbf{x}^{(i)}) - y^{(i)} \right)$$
We just have to remember that the definition of $f_{\mathbf{w}, b}(\mathbf{x})$ is different for logistic regression: it is the sigmoid $\frac{1}{1 + e^{-(\mathbf{w} \cdot \mathbf{x} + b)}}$ rather than the linear function $\mathbf{w} \cdot \mathbf{x} + b$.
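And here is a matching sketch of the gradient step and a simple descent loop, reusing `sigmoid` from the sketch above. Again, the function names (`gradients`, `gradient_descent`) are hypothetical:

```python
def gradients(X, y, w, b, lam):
    # Gradients of the regularized cost with respect to w and b.
    # Identical in form to the linear-regression version, except that
    # the prediction f goes through the sigmoid.
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y             # (f_wb(x^(i)) - y^(i)) for each i
    dj_dw = (X.T @ err) / m + (lam / m) * w  # penalty gradient applies to w only
    dj_db = np.mean(err)                     # b is not regularized
    return dj_dw, dj_db

def gradient_descent(X, y, w, b, alpha, lam, num_iters):
    # Simultaneously update w and b for a fixed number of iterations
    for _ in range(num_iters):
        dj_dw, dj_db = gradients(X, y, w, b, lam)
        w = w - alpha * dj_dw
        b = b - alpha * dj_db
    return w, b
```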
Now, onwards to learning advanced algorithms. See you next time!