"Deep Dive into Neural Networks: Understanding the Basics and Beyond"

Sujit Nirmal

Introduction

Neural networks are the backbone of modern machine learning. In this blog, we'll explore the fundamentals of neural networks, their architecture, and how they work. We'll also dive into some advanced concepts and provide hands-on examples to solidify your understanding.

What is a Neural Network?

  • Definition: A neural network is a series of algorithms that attempt to recognize underlying relationships in a set of data, through a process loosely inspired by the way neurons in the human brain signal one another.

  • Components: Neurons, layers (input, hidden, output), weights, biases, activation functions.

Basic Architecture

  • Input Layer: Receives the input data.

  • Hidden Layers: Perform computations and feature extraction.

  • Output Layer: Produces the final output. (A minimal forward pass through these layers is sketched in code below.)
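
To make the flow of data concrete, here is a minimal forward-pass sketch in NumPy. The layer sizes, weights, and variable names here are made up for illustration; in a real network the weights would be learned during training rather than drawn at random.

import numpy as np

np.random.seed(0)

# Hypothetical sizes: 3 input features, 4 hidden neurons, 1 output
x = np.array([0.5, -1.2, 3.0])         # input layer: one sample
W1 = 2 * np.random.random((3, 4)) - 1  # weights: input -> hidden
b1 = np.zeros(4)                       # hidden-layer biases
W2 = 2 * np.random.random((4, 1)) - 1  # weights: hidden -> output
b2 = np.zeros(1)                       # output-layer bias

hidden = np.maximum(0, x @ W1 + b1)    # hidden layer with ReLU activation
z = hidden @ W2 + b2
output = 1 / (1 + np.exp(-z))          # output layer with sigmoid activation
print(output)                          # a single value between 0 and 1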

Activation Functions

  • Sigmoid: \( \sigma(x) = \frac{1}{1 + e^{-x}} \)

  • ReLU: \( f(x) = \max(0, x) \)

  • Tanh: \( \tanh(x) = \frac{2}{1 + e^{-2x}} - 1 \)
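
All three of these functions are one-liners in NumPy. Here is a small sketch (the function names are mine; NumPy also provides np.tanh directly):

import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def relu(x):
    # Keeps positive values, zeroes out negatives
    return np.maximum(0, x)

def tanh(x):
    # Squashes into (-1, 1); equivalent to np.tanh(x)
    return 2 / (1 + np.exp(-2 * x)) - 1

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # roughly [0.119, 0.5, 0.881]
print(relu(x))     # [0. 0. 2.]
print(tanh(x))     # roughly [-0.964, 0.0, 0.964]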

Hands-On Example: Building a Simple Neural Network with Python

The network below learns the XOR function. XOR is not linearly separable, so a single layer of weights cannot represent it; the example therefore includes a small hidden layer.

import numpy as np

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Derivative of the sigmoid, written in terms of the sigmoid's output s
def sigmoid_derivative(s):
    return s * (1 - s)

# Input dataset: all four binary input pairs
inputs = np.array([[0, 0],
                   [0, 1],
                   [1, 0],
                   [1, 1]])

# Output dataset: the XOR truth table
outputs = np.array([[0], [1], [1], [0]])

# Seed for reproducible random number generation
np.random.seed(1)

# XOR is not linearly separable, so a single layer of weights cannot
# learn it. We therefore use a hidden layer of 4 neurons and initialize
# both weight matrices randomly with mean 0.
hidden_weights = 2 * np.random.random((2, 4)) - 1
output_weights = 2 * np.random.random((4, 1)) - 1

# Training the neural network
for iteration in range(10000):
    # Forward propagation through both layers
    hidden = sigmoid(np.dot(inputs, hidden_weights))
    outputs_pred = sigmoid(np.dot(hidden, output_weights))

    # Calculate the error at the output
    error = outputs - outputs_pred

    # Backpropagation: push the error back through both layers
    output_delta = error * sigmoid_derivative(outputs_pred)
    hidden_error = np.dot(output_delta, output_weights.T)
    hidden_delta = hidden_error * sigmoid_derivative(hidden)

    # Update the weights (implicit learning rate of 1)
    output_weights += np.dot(hidden.T, output_delta)
    hidden_weights += np.dot(inputs.T, hidden_delta)

print("Trained weights after 10,000 iterations:")
print(hidden_weights)
print(output_weights)
print("Output after training:")
print(outputs_pred)
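
If training succeeds, the four predictions should end up close to the XOR targets 0, 1, 1, 0; the exact values depend on the random seed and the number of iterations.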

Advanced Concepts

  • Backpropagation: The process of propagating the error backwards through the network to work out how much each weight contributed to it, so the weights can be adjusted to minimize the error. This is what the delta and update steps in the example above do.

  • Gradient Descent: An optimization algorithm that finds a minimum of a function by repeatedly stepping in the direction opposite the gradient. A toy example follows below.
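
As a toy illustration (my own example, not from the original post), here is gradient descent minimizing f(x) = x², whose derivative is 2x:

# Gradient descent on f(x) = x**2, which has its minimum at x = 0
x = 5.0                # starting point
learning_rate = 0.1

for step in range(50):
    gradient = 2 * x   # derivative f'(x)
    x -= learning_rate * gradient  # step downhill, opposite the gradient

print(x)  # very close to 0 after 50 steps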

Study Material

Happy Coding!!

Happy Coding Inferno!!

Happy Learning!!

