Deep Learning: A Beginner's Guide


History of Deep Learning
Before diving into Deep Learning, we should know the history behind it: how much effort it took for this field to go from nothing to everything.
1940s-1950s: The Birth of Neural Networks
1943 - McCulloch & Pitts proposed the first simplified computational model of a neuron.
1958 - Frank Rosenblatt introduced the perceptron; its first application was binary classification.
1960s-1980s: The First AI Winter
1969 - Minsky & Papert showed that a single-layer perceptron cannot solve XOR (a simple problem).
AI and neural network research fell out of favor for about a decade because of this limitation, killing all the hype. This period is called the first AI Winter.
1986-1990s: Backpropagation
1986 - Rumelhart, Hinton, & Williams popularized backpropagation, allowing multi-layer networks (MLPs) to be trained.
1989-1998 - Yann LeCun created LeNet, a CNN used for handwritten digit recognition on bank checks.
These models worked well and gave really good results, but large datasets and sufficient compute were not available at the time.
2000s: Data, GPUs, and Patience
More data and better hardware, especially GPUs, became available.
Researchers kept pushing deep nets while the mainstream focused on SVMs and decision trees.
2012: The Breakthrough
AlexNet (Krizhevsky, Sutskever, Hinton) won the ImageNet competition, beating traditional methods by a huge margin.
This was the main breakthrough: CNNs were finally recognized by many researchers.
2013-2018: The Deep Learning Boom
2014 - GANs (Generative Adversarial Networks) introduced by Ian Goodfellow.
2015 - ResNet won ImageNet with a 152-layer network, enabling much deeper models.
2017 - The Transformer architecture was introduced, powering models like GPT and BERT.
I think this much history is enough to understand why and how deep learning went from nothing to everything. I also don't want to make the articles too long, so I'll only cover what is important in this series.
Deep Learning is a really important and fascinating technology, so I hope you liked its history.
Now let's move ahead:
What is a Neuron?
Before knowing this, we should know that the perceptron (artificial neuron) was developed by studying the human brain. In a human brain, there are countless electrical impulses continuously firing. By treating the neuron as a node, the perceptron was created, with calculations going on inside it.
The different parts of a biological neuron are:
Nucleus
Axon
Dendrites
Synapses
Here, synapses are the connections through which impulses pass between different neurons. Researchers figured out that when multiple neurons work together, they create a response.
So we know that the neuron is the fundamental building block of the human brain.
What is a Perceptron?
As mentioned above, the perceptron is derived from the biological neuron.
A perceptron can stand alone, or multiple perceptrons can be connected to each other to form an MLP (Multi-Layer Perceptron). The connections between them are weights.
It contains weights (W) and one bias value (b). A weight tells us the importance of its input, i.e., how much value it carries.
$$Z = w \cdot x + b$$
Here:
Z = output (weighted sum)
w = weights
x = input features
b = bias
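To make the equation concrete, here is a minimal sketch in plain Python of how the weighted sum Z = w·x + b could be computed for one perceptron. The weight, input, and bias values are made up purely for illustration.

```python
def weighted_sum(w, x, b):
    """Dot product of weights and inputs, plus the bias: Z = w . x + b."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

w = [0.5, -0.2, 0.1]   # example weights (importance of each input)
x = [1.0, 2.0, 3.0]    # example input features
b = 0.5                # example bias

z = weighted_sum(w, x, b)
print(z)  # 0.5*1.0 + (-0.2)*2.0 + 0.1*3.0 + 0.5 ≈ 0.9
```

Each input is multiplied by its weight, the products are summed, and the bias shifts the result; this Z is what gets passed on to the activation function.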
The output Z of this equation then goes through an activation function. Now you may ask, what is an activation function? It is a mathematical function that sets the output on the basis of the input value. There are different types of activation functions, e.g., sigmoid, ReLU, softmax, and many more, which we will learn about in the next articles.
So that's it for today; we will continue in the next article.
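As a small sketch of one such activation function, here is the sigmoid (chosen just for illustration), which squashes any real number Z into the range (0, 1):

```python
import math

def sigmoid(z):
    """Sigmoid activation: squashes any real-valued Z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# A perceptron's final output is the activation applied to Z = w . x + b
print(sigmoid(0.0))   # exactly 0.5
print(sigmoid(4.0))   # close to 1
print(sigmoid(-4.0))  # close to 0
```

Large positive values of Z map close to 1 and large negative values close to 0, which is why sigmoid is often used for binary classification outputs.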
Conclusion:
The article traces the evolution of deep learning from its inception in the 1940s with the creation of the first neural network models, through the challenges and breakthroughs over the decades, to the present day. It highlights key developments such as the introduction of the perceptron, the popularization of backpropagation, and significant advancements like AlexNet and GANs. The text emphasizes the importance of understanding the fundamental concepts of neural networks, including neurons and perceptrons, and their biological inspirations.