Shrink the Weights, Save the Model: A Quick Look at Regularization


To put it simply, regularization means shrinking your parameters so that each one has less influence on the model, which helps avoid overfitting. And the beautiful thing? You get to keep all your features, because you never know which one might make the difference at the end of the day. With regularization we penalize the model for large values of the weights w, pushing them toward small values. Conventionally the bias b is left out of the penalty; even if we choose to penalize b, it makes little practical difference.
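As a minimal NumPy sketch of the idea (the function name, shapes, and example values here are my own, not from the post), here is a mean squared error cost with an L2 penalty added on w but not on b:

```python
import numpy as np

def regularized_cost(X, y, w, b, lam):
    """MSE cost with an L2 penalty on the weights w.

    By convention the bias b is not penalized; including it
    would make little practical difference.
    """
    m = X.shape[0]
    preds = X @ w + b
    mse = np.sum((preds - y) ** 2) / (2 * m)
    # The penalty grows with the squared weights, so minimizing the
    # total cost pushes w toward smaller values.
    penalty = (lam / (2 * m)) * np.sum(w ** 2)
    return mse + penalty
```

With `lam = 0` this is the ordinary unregularized cost; raising `lam` makes large weights more expensive, so the optimizer trades a little training fit for smaller, less influential parameters.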