
# Neural network optimization project AI_unibo_2021

Simple neural network implementation, based on KMNIST (Japanese ideograms) classification. net.py contains all the necessary functions; loader.py loads the KMNIST dataset and applies basic data transformations.

## Functionalities

### Vanilla network

1. backpropagation algorithm with SGD
2. basic sigmoid activation function
3. quadratic cost function
4. random Gaussian weight initialization
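The vanilla ingredients above can be sketched as follows. This is a minimal NumPy sketch, not the actual net.py API: all function names (`init_params`, `sgd_step`, etc.) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # basic sigmoid activation
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def quadratic_cost(a, y):
    # C = ||a - y||^2 / 2
    return 0.5 * np.sum((a - y) ** 2)

def init_params(sizes, rng):
    # random Gaussian initialization (mean 0, std 1)
    weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
    biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]
    return weights, biases

def feedforward(weights, biases, x):
    a = x
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a

def sgd_step(weights, biases, x, y, eta):
    # forward pass, caching weighted inputs z and activations a
    a, activations, zs = x, [x], []
    for w, b in zip(weights, biases):
        z = w @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    # output error for the quadratic cost: delta = (a - y) * sigma'(z)
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    grads_w, grads_b = [delta @ activations[-2].T], [delta]
    # propagate the error backwards through the hidden layers
    for l in range(2, len(weights) + 1):
        delta = (weights[-l + 1].T @ delta) * sigmoid_prime(zs[-l])
        grads_w.insert(0, delta @ activations[-l - 1].T)
        grads_b.insert(0, delta)
    # plain SGD update on a single example
    weights = [w - eta * gw for w, gw in zip(weights, grads_w)]
    biases = [b - eta * gb for b, gb in zip(biases, grads_b)]
    return weights, biases
```

For KMNIST the layer sizes would be something like `[784, 30, 10]` (28×28 inputs, 10 classes).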

### Improvements

1. weight initialization
2. L2 regularization
3. cross-entropy cost function
4. learning rate scheduler?
5. momentum-based SGD?
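Three of these improvements fit in a few lines each. This is a hedged sketch under common conventions (Nielsen-style NN notation), not the repository's actual code; `lmbda`, `mu`, and the helper names are assumptions.

```python
import numpy as np

def scaled_init(sizes, rng):
    # improved weight initialization: std 1/sqrt(fan_in) keeps early
    # pre-activations small so the sigmoid does not start out saturated
    weights = [rng.standard_normal((m, n)) / np.sqrt(n)
               for n, m in zip(sizes[:-1], sizes[1:])]
    biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]
    return weights, biases

def cross_entropy_cost(a, y):
    # C = -sum[ y ln a + (1 - y) ln(1 - a) ]; with a sigmoid output layer
    # the output error simplifies to (a - y), dropping the sigma'(z) factor
    # that slows learning under the quadratic cost
    return float(-np.sum(y * np.log(a) + (1 - y) * np.log(1 - a)))

def momentum_update(w, v, grad_w, eta, mu, lmbda, n):
    # momentum-based SGD with L2 regularization (weight decay):
    #   v <- mu * v - eta * (grad + (lmbda / n) * w)
    #   w <- w + v
    # mu = 0, lmbda = 0 recovers plain SGD
    v = mu * v - eta * (grad_w + (lmbda / n) * w)
    return w + v, v
```

A learning rate scheduler (item 4) is then just a rule for shrinking `eta` over epochs, e.g. halving it whenever validation accuracy plateaus.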

### Other features

1. animation showing the change in biases and weights
2. basic training evaluation tools: time, accuracy
3. basic hyperparameter tuning with graphs (equalizing time, random initializations)
4. demonstration of the vanishing gradient problem

## Backprop algorithm visualization

*(backpropagation animation)*

## Examples of weight and bias animation

The program saves mid-training weights and biases for visualization purposes. The animation shows how the network changes its parameters with gradient descent. It is possible to observe the vanishing gradient problem by inspecting the first layer.
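The first-layer effect can be reproduced numerically: in a deep sigmoid net, each backward step multiplies the error by a weight matrix and by sigma'(z) <= 0.25, so gradient norms shrink toward the early layers. A minimal sketch (illustrative names, not the repository's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer_gradient_norms(sizes, x, y, rng):
    # Forward pass through a deep sigmoid net with 1/sqrt(n)-scaled
    # Gaussian weights, then backpropagate the quadratic-cost error and
    # record the bias-gradient norm at every layer.
    weights = [rng.standard_normal((m, n)) / np.sqrt(n)
               for n, m in zip(sizes[:-1], sizes[1:])]
    biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]
    a, zs, activations = x, [], [x]
    for w, b in zip(weights, biases):
        z = w @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    sp = lambda z: sigmoid(z) * (1.0 - sigmoid(z))  # sigma'(z)
    delta = (activations[-1] - y) * sp(zs[-1])
    norms = [float(np.linalg.norm(delta))]
    for l in range(2, len(weights) + 1):
        delta = (weights[-l + 1].T @ delta) * sp(zs[-l])
        norms.insert(0, float(np.linalg.norm(delta)))
    return norms  # norms[0] = first layer, norms[-1] = output layer
```

On a net with many hidden layers, `norms[0]` comes out orders of magnitude smaller than `norms[-1]`, which is exactly why the first layer's parameters barely move in the animations.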

*(weight animation)*

*(bias animation)*