Chapter 7: Neural Networks

🧠 Neural Networks are inspired by the human brain and form the core of Deep Learning.
They can learn complex patterns and power everything from image recognition to language models like ChatGPT.


🧠 What is a Neural Network?

A Neural Network is a stack of connected layers of nodes (neurons) that transforms input data into an output through a sequence of mathematical operations.


🔹 1. Structure of a Neural Network

```
[Input Layer] → [Hidden Layers] → [Output Layer]
```
  • Input Layer: Takes features (e.g., pixels, words)

  • Hidden Layers: Do computations (deep = many layers)

  • Output Layer: Gives prediction (class, value)


🔗 Each neuron performs:

y = \text{Activation}(W \cdot x + b)
  • x = input

  • W = weight

  • b = bias

  • Activation = nonlinear function
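
To make this concrete, here is a minimal NumPy sketch of one neuron's forward computation (the input values, weights, and the ReLU choice are illustrative assumptions, not from the chapter):

```python
import numpy as np

def neuron(x, W, b, activation):
    """Single neuron: weighted sum of inputs plus bias, passed through an activation."""
    return activation(W @ x + b)

# Hypothetical values for illustration
x = np.array([0.5, -1.2, 3.0])   # input features
W = np.array([0.4, 0.1, 0.6])    # one weight per input
b = 0.2                          # bias

relu = lambda z: np.maximum(0, z)
print(neuron(x, W, b, relu))     # ReLU(1.88 + 0.2) ≈ 2.08
```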


🔹 2. Types of Neural Networks

| Type | Purpose | Example Use |
|---|---|---|
| Perceptron | Basic unit (1 neuron) | Logical operations |
| Multilayer Perceptron (MLP) | Fully connected layers | Tabular data, basic tasks |
| Convolutional NN (CNN) | Images | Face detection, X-rays |
| Recurrent NN (RNN) | Sequences | Text, speech, time-series |
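
To make the CNN row concrete, here is a hedged Keras sketch of a tiny image classifier (the 28×28 grayscale input shape and the layer sizes are assumptions for illustration):

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# A tiny CNN for 28x28 grayscale images (illustrative architecture)
cnn = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),  # learn local image filters
    MaxPooling2D((2, 2)),                                            # downsample feature maps
    Flatten(),                                                       # image features -> flat vector
    Dense(10, activation='softmax'),                                 # probabilities over 10 classes
])
cnn.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
```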

🔹 3. Activation Functions

Activation functions decide whether (and how strongly) a neuron fires; the nonlinearity they introduce is what lets stacked layers learn complex patterns.

| Function | Formula | Usage |
|---|---|---|
| Sigmoid | \frac{1}{1+e^{-x}} | Outputs between 0 and 1 |
| ReLU | \max(0, x) | Fast, used in hidden layers |
| Tanh | \frac{e^x - e^{-x}}{e^x + e^{-x}} | Outputs between -1 and 1 |
| Softmax | Converts scores to probabilities | Used in final classification layer |
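
A minimal NumPy sketch of these activations (written for illustration; the softmax subtracts the maximum score for numerical stability):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))        # squashes to (0, 1)

def relu(x):
    return np.maximum(0, x)            # zero for negatives, identity for positives

def tanh(x):
    return np.tanh(x)                  # squashes to (-1, 1)

def softmax(x):
    e = np.exp(x - np.max(x))          # subtract the max for numerical stability
    return e / e.sum()                 # scores -> probabilities that sum to 1

print(softmax(np.array([2.0, 1.0, 0.1])))  # ≈ [0.659, 0.242, 0.099]
```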

📉 4. Loss Functions

Measures how far the predicted output is from the actual value.

| Loss Function | Used For | Formula (Concept) |
|---|---|---|
| MSE | Regression | Mean Squared Error |
| Cross-Entropy | Classification | Log loss of predicted probability |
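
A minimal NumPy sketch of both losses (an illustration, not the chapter's code; the small epsilon guards against log(0)):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error, used for regression."""
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Log loss for binary classification; y_pred holds predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1 - eps)   # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

print(mse(np.array([3.0, 5.0]), np.array([2.5, 5.5])))               # 0.25
print(binary_cross_entropy(np.array([1, 0]), np.array([0.9, 0.2])))  # ≈ 0.164
```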

🔄 5. Backpropagation & Gradient Descent

The learning process of a neural network.

  • Forward Pass: Compute the prediction

  • Loss Computation: Measure how far the prediction is from the target

  • Backward Pass (Backpropagation): Calculate gradients of the loss with respect to each weight

  • Gradient Descent: Update the weights to reduce the error

Formula:

W = W - \alpha \cdot \frac{\partial L}{\partial W}

  • \alpha: learning rate

  • \frac{\partial L}{\partial W}: gradient of the loss with respect to the weights
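
As a minimal sketch of this update rule (a single linear neuron on made-up data; the learning rate and step count are assumptions):

```python
import numpy as np

# Toy data: y = 2x, so the model should learn W ≈ 2 and b ≈ 0
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

W, b = 0.0, 0.0     # initial parameters
alpha = 0.1         # learning rate

for step in range(200):
    y_pred = W * x + b                   # forward pass
    dW = np.mean(-2 * (y - y_pred) * x)  # dL/dW for MSE loss
    db = np.mean(-2 * (y - y_pred))      # dL/db for MSE loss
    W -= alpha * dW                      # W = W - alpha * dL/dW
    b -= alpha * db                      # b = b - alpha * dL/db

print(W, b)  # W ≈ 2.0, b ≈ 0.0
```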


🧪 6. Training a Neural Network

Steps:

  1. Initialize weights

  2. Feed input (Forward Pass)

  3. Calculate loss

  4. Backpropagate to get gradients

  5. Update weights

  6. Repeat for many epochs
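
Putting the six steps together, here is a hedged NumPy sketch of a tiny 2-layer network trained on the XOR problem (the dataset, layer sizes, learning rate, and epoch count are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR dataset (illustrative)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# 1. Initialize weights (2 inputs -> 8 hidden -> 1 output)
W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))
alpha = 1.0  # learning rate

for epoch in range(10000):
    # 2. Forward pass
    h = sigmoid(X @ W1 + b1)         # hidden activations
    y_pred = sigmoid(h @ W2 + b2)    # output probabilities

    # 3. Loss: binary cross-entropy
    loss = -np.mean(y * np.log(y_pred) + (1 - y) * np.log(1 - y_pred))

    # 4. Backpropagation (chain rule, layer by layer)
    d_out = (y_pred - y) / len(X)           # gradient at the output pre-activation
    d_hid = (d_out @ W2.T) * h * (1 - h)    # gradient at the hidden pre-activation

    # 5. Gradient descent update
    W2 -= alpha * (h.T @ d_out)
    b2 -= alpha * d_out.sum(axis=0, keepdims=True)
    W1 -= alpha * (X.T @ d_hid)
    b1 -= alpha * d_hid.sum(axis=0, keepdims=True)

    if epoch % 2000 == 0:
        print(f"epoch {epoch}: loss = {loss:.3f}")

# 6. After many epochs the predictions should approach the XOR targets [0, 1, 1, 0]
print(np.round(y_pred, 2))
```

These are the same six steps that Keras runs for you under the hood when you call model.fit.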


📌 Neural Network Terminology

| Term | Meaning |
|---|---|
| Epoch | One full pass over the training data |
| Batch Size | Number of samples per weight update |
| Learning Rate | How fast weights are updated |
| Overfitting | Model fits the training data too well and generalizes poorly |
| Regularization | Prevents overfitting (e.g., dropout, L2) |
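
As a hedged illustration of the Regularization row (the layer sizes and the 0.01 L2 factor are assumptions), dropout and an L2 penalty can be added in Keras like this:

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.regularizers import l2

model = Sequential([
    Dense(64, activation='relu', input_shape=(10,),
          kernel_regularizer=l2(0.01)),    # L2 penalty on this layer's weights
    Dropout(0.5),                          # randomly drop 50% of units during training
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```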

💻 Example (Simple MLP with Keras):

```python
from keras.models import Sequential
from keras.layers import Dense

# Simple MLP: 10 input features -> 16 hidden units -> 1 sigmoid output
model = Sequential()
model.add(Dense(16, input_shape=(10,), activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Binary classification setup
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# X_train and y_train are assumed to be prepared beforehand
model.fit(X_train, y_train, epochs=20, batch_size=32)
```

💡 Where Neural Networks Are Used:

  • Facial recognition

  • Voice assistants (Siri, Alexa)

  • ChatGPT / BERT / LLMs

  • Medical diagnosis

  • Fraud detection


🧠 Summary of Chapter 7

| Concept | Description |
|---|---|
| Neural Networks | Simulate brain-like learning |
| Layers | Input ➝ Hidden ➝ Output |
| Activation | Adds nonlinearity |
| Loss & Gradient | Used for learning |
| Backpropagation | Optimizes weights |
| Tools | TensorFlow, Keras, PyTorch |

✅ Mini Assignment

  1. Build a neural network using Keras to classify MNIST digits.

  2. Try different activation functions and observe training accuracy.

  3. Implement a 2-layer MLP from scratch using NumPy.

