#77 Machine Learning & Data Science Challenge

What is ResNet?

The Residual Neural Network (ResNet), introduced by Kaiming He et al., was the winner of ILSVRC 2015.

  • This architecture introduced a concept called “skip connections”.

  • Typically, the input passes through two linear transformations, each followed by a ReLU activation. In a residual network, the input is also copied directly, added to the output of the second transformation, and the sum is passed through the final ReLU.

Skip Connection
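The skip connection described above can be sketched in plain NumPy. This is a minimal illustration, not the paper's full convolutional block: `plain_block` and `residual_block` are hypothetical helper names, and the weight matrices stand in for the block's two transformations.

```python
import numpy as np

def relu(x):
    # ReLU activation: max(0, x) element-wise
    return np.maximum(x, 0.0)

def plain_block(x, W1, W2):
    # Two linear transformations with ReLU activations (no skip)
    return relu(W2 @ relu(W1 @ x))

def residual_block(x, W1, W2):
    # Same two transformations, but the input x is copied and
    # added to the second transformation's output before the
    # final ReLU: out = ReLU(F(x) + x)
    return relu(W2 @ relu(W1 @ x) + x)
```

One way to see why this helps: if the weights are driven to zero, the residual block still passes the input through (an identity-like mapping), whereas the plain block collapses to zero. This makes it easy for deeper layers to "do nothing" when that is optimal, which is exactly what the degradation experiments below probe.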

  • The experiments in the paper demonstrate the power of the residual network. The plain 34-layer network had a higher validation error than the plain 18-layer network.

  • This is where the degradation problem appears. Yet the same 34-layer network, when converted to a residual network, has a much lower training error than the 18-layer residual network.


Written by

Bhagirath Deshani

Hello everyone! I am a Machine Learning Engineer from India. I have been interested in machine learning since my engineering days. I completed Andrew Ng's original Machine Learning course from Stanford University on Coursera, as well as IBM's courses on Machine Learning and Deep Learning. Currently, I am working on Machine Learning and Data Science projects. My goal is to use the skills I have acquired to solve real-world problems and make a positive impact on the world.