Syllabus Point
- Apply neural network models using an OOP to make predictions
Neural Networks
Neural networks are a set of algorithms designed to recognise patterns by mimicking the way the human brain works. They are suited to complex problems involving large amounts of unstructured data.
Overview
- Have series of interconnected nodes (artificial neurons)
- Each neuron processes input, applies a function and passes the result forward
- Series of algorithms that recognise relationships in data; commonly used in deep learning to solve complex problems
Architecture
<!-- IMAGE: Neural Network Architecture Diagram showing Input Layer, Hidden Layers, and Output Layer -->
A neural network consists of neurons, which receive inputs, process them and produce an output:
Input Layer
- Receives raw data
- Each neuron in the layer represents a feature of the input data
- Example: grid of pixels
Hidden Layers
- Perform computations and transformations on the input data
- The more layers, the deeper the network (hence 'deep learning')
- Inside the network, connections between artificial neurons act like tiny switches
- Connections have weights (how important each input is) and thresholds (how strong a signal must be to activate the next neuron)
Output Layer
- Produces the final result/prediction
- Each neuron corresponds to a possible output class
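The layered architecture above maps naturally onto classes, in keeping with the syllabus point of applying neural network models using an OOP approach. The following is a minimal sketch (not a production implementation): the class names `Neuron`, `Layer` and `NeuralNetwork`, the layer sizes, and the choice of a sigmoid activation are all illustrative assumptions.

```python
import math
import random

class Neuron:
    """A single artificial neuron: weighted sum of inputs plus a bias."""
    def __init__(self, n_inputs):
        # Weights start as small random values and are adjusted during training
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.bias = 0.0

    def forward(self, inputs):
        # Weighted sum of inputs, shifted by the bias
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        # Sigmoid activation squashes the signal into the range (0, 1)
        return 1 / (1 + math.exp(-total))

class Layer:
    """A layer is a list of neurons that all receive the same inputs."""
    def __init__(self, n_neurons, n_inputs):
        self.neurons = [Neuron(n_inputs) for _ in range(n_neurons)]

    def forward(self, inputs):
        return [neuron.forward(inputs) for neuron in self.neurons]

class NeuralNetwork:
    """Input layer -> one hidden layer -> output layer."""
    def __init__(self, n_inputs, n_hidden, n_outputs):
        self.hidden = Layer(n_hidden, n_inputs)
        self.output = Layer(n_outputs, n_hidden)

    def predict(self, inputs):
        return self.output.forward(self.hidden.forward(inputs))

# Example: 3 input features, 4 hidden neurons, 2 output classes
net = NeuralNetwork(n_inputs=3, n_hidden=4, n_outputs=2)
print(net.predict([0.5, 0.1, 0.9]))  # two values, each between 0 and 1
```

Each neuron in the output layer produces one value, so the network returns one score per possible output class.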
Adjustable Parameters
Adjustable parameters within neurons are called weights and biases. As the network learns, these are adjusted to control the strength and influence of input signals.
Weights
Determine the strength of connections between neurons.
Biases
Provide an additional parameter that shifts the activation function's output.
Activation Function
Decides whether a neuron should be activated based on the weighted sum of its inputs and a bias.
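Weights, bias and activation can be shown in a few lines. This sketch uses three common activation functions (step, sigmoid, ReLU); the example input values and weights are arbitrary, chosen only to illustrate the arithmetic.

```python
import math

def weighted_sum(inputs, weights, bias):
    # Each input is scaled by its weight, then the bias shifts the total
    return sum(w * x for w, x in zip(weights, inputs)) + bias

def step(z, threshold=0.0):
    # Fires (1) only when the signal clears the threshold
    return 1 if z >= threshold else 0

def sigmoid(z):
    # Smoothly squashes any value into the range (0, 1)
    return 1 / (1 + math.exp(-z))

def relu(z):
    # Passes positive signals through; blocks negative ones
    return max(0.0, z)

# (1.0 * 0.6) + (0.5 * -0.4) + 0.1 = 0.5
z = weighted_sum([1.0, 0.5], [0.6, -0.4], 0.1)
print(step(z))     # 1  (0.5 clears the 0.0 threshold)
print(relu(z))     # 0.5
print(sigmoid(z))  # about 0.62
```

The step function matches the "threshold" idea above, while sigmoid and ReLU are differentiable, which training via backpropagation requires.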
Training Cycle
The training cycle is when the neural network learns from data.
Steps
- Input data is fed in
- Network makes prediction
- Prediction compared to correct answer (called a label)
- Error is calculated (how far off)
- Network uses backpropagation to adjust its internal settings (weights)
- Process repeats (iteration) over many epochs (passes over the data) to reduce errors
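The steps above can be sketched for a single neuron learning the logical AND function. This is a simplified illustration, not a full multi-layer backpropagation implementation: the learning rate, epoch count and training data are assumptions chosen so the example converges quickly.

```python
import math
import random

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Training data: inputs paired with correct answers (labels) for AND
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

random.seed(1)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = 0.0
rate = 0.5  # learning rate: how large each adjustment is

for epoch in range(2000):  # many passes (epochs) over the data
    for inputs, label in data:
        # 1. Forward pass: feed input in and make a prediction
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        prediction = sigmoid(z)
        # 2. Error: how far off the prediction is from the label
        error = prediction - label
        # 3. Backward pass: push the error back to adjust each weight
        weights = [w - rate * error * x for w, x in zip(weights, inputs)]
        bias -= rate * error

# After training, the rounded predictions match the labels
for inputs, label in data:
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    print(inputs, round(sigmoid(z)))
```

Each pass repeats the predict-compare-adjust loop, so the error shrinks gradually rather than in one step.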
Backpropagation
Backward chaining (backpropagation): method where the network works backward from the output error to adjust its internal weights.
Goal
The goal is to help the network learn to make accurate predictions by adjusting its parameters based on examples and feedback.
Forward and Backward Pass
<!-- IMAGE: Forward Pass and Backward Pass Diagram -->
Execution Cycle
The execution cycle is when the trained neural network is used to make real-world decisions (also known as inference).
Steps
- New input is fed into the network
- Network runs its learned pattern-matching process
- Outputs a prediction or result
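During inference the weights are fixed, so the execution cycle is just a forward pass. A minimal sketch: the class name `TrainedNeuron` is hypothetical, and the weights below are hand-picked to behave like a neuron that has already learned logical AND, not the output of a real training run.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

class TrainedNeuron:
    """Inference only: weights are fixed after training, never adjusted."""
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def predict(self, inputs):
        # Run the learned pattern-matching process on new input
        z = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return sigmoid(z)

# Hypothetical weights a network might have learned for logical AND
neuron = TrainedNeuron(weights=[6.0, 6.0], bias=-9.0)
print(round(neuron.predict([1, 1])))  # both inputs on -> 1
print(round(neuron.predict([1, 0])))  # one input on  -> 0
```

Unlike the training cycle, no error is calculated and no weights change; the same input always produces the same result.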
Importance of Neural Networks
- Automating tasks to save time and money
- Improved decision making with better insights
- Increased efficiency
- New products and services
Applications of Neural Networks
- Image recognition
- Natural language processing (speech recognition)
- Time series prediction (e.g. financial forecasting)