This repository contains Python code that generates visualizations for various activation functions commonly used in neural networks. The following activation functions are included (each is sketched in NumPy after the list):
- Linear Activation Function
- Sigmoid Activation Function
- Hyperbolic Tangent (Tanh) Activation Function
- Rectified Linear Unit (ReLU) Activation Function
- Leaky ReLU Activation Function
- Softmax Activation Function
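For reference, all of these functions can be written concisely with NumPy. This is a minimal sketch under assumed names and signatures; the repository's own implementations may differ:

```python
import numpy as np

def linear(x):
    # Identity: passes the input through unchanged.
    return x

def sigmoid(x):
    # Squashes input into (0, 1); saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); zero-centered.
    return np.tanh(x)

def relu(x):
    # Zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small slope (alpha) through for negative inputs.
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Normalizes a vector into a probability distribution; the max is
    # subtracted first for numerical stability.
    e = np.exp(x - np.max(x))
    return e / e.sum()
```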
These visualizations are useful for understanding how each activation function responds to its input. They highlight properties such as non-linearity, saturation regions, and output range, and clarify where each function fits in a neural network architecture.
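As an illustration, a plot like the ones this repository generates can be produced with a few lines of matplotlib. This is a sketch, not the repository's exact plotting code:

```python
import numpy as np
import matplotlib.pyplot as plt

# Plot the sigmoid over a symmetric input range to show its saturation regions.
x = np.linspace(-10, 10, 200)
plt.plot(x, 1.0 / (1.0 + np.exp(-x)), label="sigmoid")
plt.axhline(0, color="gray", linewidth=0.5)
plt.axvline(0, color="gray", linewidth=0.5)
plt.title("Sigmoid Activation Function")
plt.xlabel("x")
plt.ylabel("f(x)")
plt.legend()
plt.show()
```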
To run the code, you need the following Python packages installed:
- numpy
- matplotlib

You can install them using pip:

```
pip install numpy matplotlib
```