Activation Functions Visualization

This repository contains Python code that generates visualizations for various activation functions commonly used in neural networks. The activation functions included are:

  1. Linear Activation Function
  2. Sigmoid Activation Function
  3. Hyperbolic Tangent (Tanh) Activation Function
  4. Rectified Linear Unit (ReLU) Activation Function
  5. Leaky ReLU Activation Function
  6. Softmax Activation Function

These visualizations are useful for understanding how different activation functions behave across a range of input values. They illustrate properties such as non-linearity and saturation regions, and help clarify why each function is used in particular neural network architectures.
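The exact code in this repository may differ, but as a minimal NumPy sketch, the listed functions could be defined as follows (the function names and the Leaky ReLU slope `alpha` are illustrative assumptions, not taken from the repository):

```python
import numpy as np

def linear(x):
    return x

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # alpha sets the slope for negative inputs; 0.01 is a common default
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Shift by the maximum value for numerical stability before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()
```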

Prerequisites:

To run the code, you need to have the following Python packages installed:

  • numpy
  • matplotlib

You can install them using pip:

pip install numpy matplotlib
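To check that the environment is set up correctly, a minimal plotting example (a sketch, not the repository's own script) might look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

# Evaluate a few activation functions over a range of inputs
x = np.linspace(-5, 5, 200)

plt.plot(x, 1 / (1 + np.exp(-x)), label="Sigmoid")
plt.plot(x, np.tanh(x), label="Tanh")
plt.plot(x, np.maximum(0, x), label="ReLU")
plt.title("Activation Functions")
plt.xlabel("Input")
plt.ylabel("Output")
plt.legend()
plt.grid(True)
plt.show()
```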
