Update README.md #10


Open · wants to merge 1 commit into master

254 changes: 25 additions & 229 deletions README.md

`@@ -1,240 +1,36 @@`
## Tools to Design or Visualize Architecture of Neural Network

This commit replaces most of the tool list below with the following Keras example model:

```python
from keras.models import Sequential
from keras.layers import Reshape, Conv2D, MaxPooling2D, Flatten, Dense, Dropout

# Assuming you've loaded and preprocessed your data (X, y)

# Create a Sequential model
model = Sequential()

# Reshape the (40, 1) input to (40, 1, 1) so 2D conv layers can be applied
model.add(Reshape((40, 1, 1), input_shape=(40, 1)))

# Convolutional layers (1-wide kernels and pools, since the second spatial dimension is 1)
model.add(Conv2D(32, kernel_size=(3, 1), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 1)))

model.add(Conv2D(64, kernel_size=(3, 1), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 1)))

# Flatten layer
model.add(Flatten())

# Dense layers
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.2))

model.add(Dense(64, activation='relu'))
model.add(Dropout(0.2))

# Output layer
model.add(Dense(7, activation='softmax'))

# Compile the model
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# Print model summary
model.summary()
```

1. [**Net2Vis**](https://viscom.net2vis.uni-ulm.de/OG1Br2BAkYSwwrV6CADl4X5EfErFjUzvuUwXWDdLbdsIXNhb9L): Net2Vis automatically generates abstract visualizations for convolutional neural networks from Keras code.

![](https://storage.googleapis.com/groundai-web-prod/media/users/user_299833/project_401171/images/application.png)

2. [**visualkeras**](https://github.com/paulgavrikov/visualkeras/) : Visualkeras is a Python package that helps visualize Keras (either standalone or included in TensorFlow) neural network architectures. It allows easy styling to fit most needs. As of now it supports layered-style architecture generation, which is great for CNNs (Convolutional Neural Networks), and a graph-style architecture.

```python
import visualkeras

model = ...

visualkeras.layered_view(model).show() # display using your system viewer
visualkeras.layered_view(model, to_file='output.png') # write to disk
visualkeras.layered_view(model, to_file='output.png').show() # write and show
```

![](https://github.com/paulgavrikov/visualkeras/raw/master/figures/vgg16.png)

3. [**draw_convnet**](https://github.com/gwding/draw_convnet) : Python script for illustrating a Convolutional Neural Network (ConvNet).

![img](https://raw.githubusercontent.com/gwding/draw_convnet/master/convnet_fig.png)

4. [**NNSVG**](http://alexlenail.me/NN-SVG/LeNet.html)

![AlexNet style](https://i.stack.imgur.com/Q0xOe.png)

![enter image description here](https://i.stack.imgur.com/K9lmg.png)

![enter image description here](https://i.stack.imgur.com/DlJ8J.png)

5. **[PlotNeuralNet](https://github.com/HarisIqbal88/PlotNeuralNet)** : LaTeX code for drawing neural networks for reports and presentations. Have a look at the examples to see how they are made. Additionally, let's consolidate any improvements you make and fix any bugs to help more people with this code.

![img](https://user-images.githubusercontent.com/17570785/50308846-c2231880-049c-11e9-8763-3daa1024de78.png)

![img](https://user-images.githubusercontent.com/17570785/50308873-e2eb6e00-049c-11e9-9587-9da6bdec011b.png)
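
For the typical workflow (a small Python script that emits the LaTeX), a hedged sketch along the lines of the repository's `pyexamples` is shown below; the helper names (`to_head`, `to_Conv`, `to_Pool`, `to_SoftMax`, `to_generate`, ...) are taken from `pycore/tikzeng.py` and the layer sizes here are made up:

```python
# Sketch of a PlotNeuralNet-style script, run from inside the cloned repo
# (e.g. next to the pyexamples folder); helper names assumed from pycore/tikzeng.py.
import sys
sys.path.append('../')
from pycore.tikzeng import *

arch = [
    to_head('..'),
    to_cor(),
    to_begin(),
    to_Conv("conv1", 224, 64, offset="(0,0,0)", to="(0,0,0)", height=64, depth=64, width=2),
    to_Pool("pool1", offset="(0,0,0)", to="(conv1-east)"),
    to_Conv("conv2", 112, 128, offset="(1,0,0)", to="(pool1-east)", height=32, depth=32, width=2),
    to_Pool("pool2", offset="(0,0,0)", to="(conv2-east)"),
    to_SoftMax("soft1", 10, "(3,0,0)", "(pool2-east)", caption="SOFT"),
    to_end(),
]

to_generate(arch, 'my_arch.tex')  # compile the result with pdflatex
```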

6. **[Tensorboard](https://www.tensorflow.org/tensorboard/graphs)** - TensorBoard’s **Graphs dashboard** is a powerful tool for examining your TensorFlow model.

![enter image description here](https://i.stack.imgur.com/zJHpV.png)
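
For a Keras model such as the example above, the graph can be logged with the standard TensorBoard callback (a minimal sketch; `X`, `y`, and the log directory are placeholders):

```python
import tensorflow as tf

# Assumes `model`, `X`, and `y` are already defined (e.g. the Keras example above).
tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs", write_graph=True)
model.fit(X, y, epochs=1, callbacks=[tb_callback])

# Then open the Graphs dashboard with:
#   tensorboard --logdir logs
```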

7. **[Caffe](https://github.com/BVLC/caffe/blob/master/python/caffe/draw.py)** - In Caffe you can use [caffe/draw.py](https://github.com/BVLC/caffe/blob/master/python/caffe/draw.py) to draw the NetParameter protobuffer:

![enter image description here](https://i.stack.imgur.com/5Z1Cb.png)
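
Caffe also ships a command-line wrapper around `draw.py`; a sketch of both routes is below (the prototxt path is an example, and the exact `draw_net_to_file` signature may vary between Caffe versions):

```python
# Option 1: the bundled script, run from the Caffe root directory:
#   python python/draw_net.py examples/mnist/lenet.prototxt lenet.png

# Option 2: call caffe.draw directly (names assumed from python/draw_net.py).
import caffe.draw
from caffe.proto import caffe_pb2
from google.protobuf import text_format

net = caffe_pb2.NetParameter()
with open('examples/mnist/lenet.prototxt') as f:
    text_format.Merge(f.read(), net)

caffe.draw.draw_net_to_file(net, 'lenet.png', rankdir='LR')
```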

8. **[Matlab](http://www.mathworks.com/help/nnet/ref/view.html)**

![enter image description here](https://i.stack.imgur.com/rPpfa.png)

9. [**Keras.js**](https://transcranial.github.io/keras-js/#/inception-v3)

![enter image description here](https://i.stack.imgur.com/vEfTv.png)

10. **[keras-sequential-ascii](https://github.com/stared/keras-sequential-ascii/)** - A library for [Keras](https://keras.io/) for investigating architectures and parameters of sequential models.

**VGG 16 Architecture**

```
OPERATION DATA DIMENSIONS WEIGHTS(N) WEIGHTS(%)

Input ##### 3 224 224
InputLayer | ------------------- 0 0.0%
##### 3 224 224
Convolution2D \|/ ------------------- 1792 0.0%
relu ##### 64 224 224
Convolution2D \|/ ------------------- 36928 0.0%
relu ##### 64 224 224
MaxPooling2D Y max ------------------- 0 0.0%
##### 64 112 112
Convolution2D \|/ ------------------- 73856 0.1%
relu ##### 128 112 112
Convolution2D \|/ ------------------- 147584 0.1%
relu ##### 128 112 112
MaxPooling2D Y max ------------------- 0 0.0%
##### 128 56 56
Convolution2D \|/ ------------------- 295168 0.2%
relu ##### 256 56 56
Convolution2D \|/ ------------------- 590080 0.4%
relu ##### 256 56 56
Convolution2D \|/ ------------------- 590080 0.4%
relu ##### 256 56 56
MaxPooling2D Y max ------------------- 0 0.0%
##### 256 28 28
Convolution2D \|/ ------------------- 1180160 0.9%
relu ##### 512 28 28
Convolution2D \|/ ------------------- 2359808 1.7%
relu ##### 512 28 28
Convolution2D \|/ ------------------- 2359808 1.7%
relu ##### 512 28 28
MaxPooling2D Y max ------------------- 0 0.0%
##### 512 14 14
Convolution2D \|/ ------------------- 2359808 1.7%
relu ##### 512 14 14
Convolution2D \|/ ------------------- 2359808 1.7%
relu ##### 512 14 14
Convolution2D \|/ ------------------- 2359808 1.7%
relu ##### 512 14 14
MaxPooling2D Y max ------------------- 0 0.0%
##### 512 7 7
Flatten ||||| ------------------- 0 0.0%
##### 25088
Dense XXXXX ------------------- 102764544 74.3%
relu ##### 4096
Dense XXXXX ------------------- 16781312 12.1%
relu ##### 4096
Dense XXXXX ------------------- 4097000 3.0%
softmax ##### 1000
```
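
A minimal usage sketch (the `keras2ascii` entry point follows the project's README; check it against the version you install):

```python
from keras_sequential_ascii import keras2ascii

# `model` is a Keras Sequential model, e.g. the example defined at the top of this page.
keras2ascii(model)  # prints an ASCII diagram like the VGG16 listing above
```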

11. **[Netron](https://github.com/lutzroeder/Netron)**

![screenshot.png](https://github.com/lutzroeder/netron/raw/main/.github/screenshot.png)
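
Netron is primarily a standalone/browser app, but it can also be launched from Python (a minimal sketch; the model file name is a placeholder):

```python
import netron

# Serve a saved model (ONNX, Keras .h5, TFLite, ...) and open it in the browser.
netron.start('model.h5')
```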

12. **[DotNet](https://github.com/martisak/dotnets)**

![Simple net](https://github.com/martisak/dotnets/raw/master/test.png)

13. [**Graphviz**](http://www.graphviz.org/) : **[Tutorial](https://tgmstat.wordpress.com/2013/06/12/draw-neural-network-diagrams-graphviz/)**

![img](https://tgmstat.files.wordpress.com/2013/05/multiclass_neural_network_example.png)
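
The linked tutorial writes the DOT file by hand; the same style of diagram can be generated from Python with the `graphviz` package (a sketch with made-up layer sizes):

```python
from graphviz import Digraph

# A tiny 3-4-2 fully connected network, laid out left to right.
# Requires the Graphviz binaries in addition to the Python package.
g = Digraph('nn',
            graph_attr={'rankdir': 'LR', 'splines': 'line'},
            node_attr={'shape': 'circle', 'fixedsize': 'true', 'width': '0.4'})

sizes = {'i': 3, 'h': 4, 'o': 2}   # input, hidden, output layer sizes
for prefix, size in sizes.items():
    for idx in range(size):
        g.node(f'{prefix}{idx}')

for src in range(sizes['i']):       # input -> hidden edges
    for dst in range(sizes['h']):
        g.edge(f'i{src}', f'h{dst}')
for src in range(sizes['h']):       # hidden -> output edges
    for dst in range(sizes['o']):
        g.edge(f'h{src}', f'o{dst}')

g.render('mlp', format='png', cleanup=True)   # writes mlp.png
```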

14. **[Keras Visualization](https://keras.io/visualization/)** - The [keras.utils.vis_utils module](https://keras.io/visualization/) provides utility functions to plot a Keras model (using Graphviz).

![enter image description here](https://i.stack.imgur.com/o17GY.png)
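
Typical usage (the same utility is exposed as `tf.keras.utils.plot_model`; pydot and Graphviz need to be installed):

```python
from tensorflow.keras.utils import plot_model

# `model` is the Keras model defined above; writes a layer diagram to model.png.
plot_model(model, to_file='model.png', show_shapes=True, show_layer_names=True)
```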

15. **[Conx](https://conx.readthedocs.io/en/latest/index.html)** - The Python package `conx` can visualize networks, with their activations, via the function `net.picture()`, producing SVG, PNG, or PIL images like this:

![enter image description here](https://i.stack.imgur.com/nhHjO.png)
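
A hedged sketch based on the conx documentation (layer construction and parameter names may differ between versions):

```python
import conx as cx

# Build a tiny dense network and render it.
net = cx.Network("XOR")
net.add(cx.Layer("input", 2),
        cx.Layer("hidden", 4, activation="sigmoid"),
        cx.Layer("output", 1, activation="sigmoid"))
net.connect()
net.compile(error="mse", optimizer="sgd")  # parameter names assumed from the conx docs

net.picture()  # in a notebook, returns a picture of the network
```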

16. **[ENNUI](https://math.mit.edu/ennui/)** - A drag-and-drop neural network visualizer (and more), still a work in progress. Here's an example of a visualization for a LeNet-like architecture.

![A visualization of a LeNet-like architecture](https://i.stack.imgur.com/pRLeG.png)

17. **NNet - R Package** - **[Tutorial](https://beckmw.wordpress.com/2013/03/04/visualizing-neural-networks-from-the-nnet-package/)**

```
library(neuralnet)

data(infert, package="datasets")
plot(neuralnet(case~parity+induced+spontaneous, infert))
```

![neuralnet](https://i.stack.imgur.com/yyftd.png)

18. **[GraphCore](https://www.graphcore.ai/posts/what-does-machine-learning-look-like)** - These visualizations are oriented more towards showing how a neural network runs; however, the NN architecture is also somewhat visible in the resulting diagrams.

**AlexNet**

![alexnet_label logo.jpg](https://www.graphcore.ai/hubfs/images/alexnet_label%20logo.jpg)

**ResNet50**![resnet50_label_logo.jpg](https://www.graphcore.ai/hubfs/images/resnet50_label_logo.jpg)

19. [**Neataptic**](https://wagenaartje.github.io/neataptic/)

Neataptic offers flexible neural networks; neurons and synapses can be removed with a single line of code. No fixed architecture is required for neural networks to function at all. This flexibility allows networks to be shaped for your dataset through neuro-evolution, which is done using multiple threads.

![img](https://camo.githubusercontent.com/4326c3f603b828b61fd63d927acca2cfc152773f/68747470733a2f2f692e6779617a6f2e636f6d2f66353636643233363461663433646433613738633839323665643230346135312e706e67)

20. **[TensorSpace](https://tensorspace.org/)** : TensorSpace is a neural network 3D visualization framework built with TensorFlow.js, Three.js, and Tween.js. TensorSpace provides Layer APIs to build deep learning layers, load pre-trained models, and generate 3D visualizations in the browser. With the TensorSpace API, it is more intuitive to visualize and understand any pre-trained model built with TensorFlow, Keras, TensorFlow.js, etc.

**[Tutorial](https://www.freecodecamp.org/news/tensorspace-js-a-way-to-3d-visualize-neural-networks-in-browsers-2c0afd7648a8/)**

![enter image description here](https://i.stack.imgur.com/ekF5v.png)


21. **[Netscope CNN Analyzer](http://dgschwend.github.io/netscope/quickstart.html)**

![enter image description here](https://i.stack.imgur.com/VVDsg.png)

22. **[Moniel](https://github.com/mlajtos/moniel)**

Interactive Notation for Computational Graphs https://mlajtos.github.io/moniel/

![img](https://miro.medium.com/max/819/1*u6uIQF4xTVe-ylJnAPoIDg.png)

23. [**Texample**](http://www.texample.net/tikz/examples/neural-network/)

![Neural Network](https://texample.net/media/tikz/examples/PNG/neural-network.png)


```
\documentclass{article}

\usepackage{tikz}
\begin{document}
\pagestyle{empty}

\def\layersep{2.5cm}

\begin{tikzpicture}[shorten >=1pt,->,draw=black!50, node distance=\layersep]
\tikzstyle{every pin edge}=[<-,shorten <=1pt]
\tikzstyle{neuron}=[circle,fill=black!25,minimum size=17pt,inner sep=0pt]
\tikzstyle{input neuron}=[neuron, fill=green!50];
\tikzstyle{output neuron}=[neuron, fill=red!50];
\tikzstyle{hidden neuron}=[neuron, fill=blue!50];
\tikzstyle{annot} = [text width=4em, text centered]

% Draw the input layer nodes
\foreach \name / \y in {1,...,4}
% This is the same as writing \foreach \name / \y in {1/1,2/2,3/3,4/4}
\node[input neuron, pin=left:Input \#\y] (I-\name) at (0,-\y) {};

% Draw the hidden layer nodes
\foreach \name / \y in {1,...,5}
\path[yshift=0.5cm]
node[hidden neuron] (H-\name) at (\layersep,-\y cm) {};

% Draw the output layer node
\node[output neuron,pin={[pin edge={->}]right:Output}, right of=H-3] (O) {};

% Connect every node in the input layer with every node in the
% hidden layer.
\foreach \source in {1,...,4}
\foreach \dest in {1,...,5}
\path (I-\source) edge (H-\dest);

% Connect every node in the hidden layer with the output layer
\foreach \source in {1,...,5}
\path (H-\source) edge (O);

% Annotate the layers
\node[annot,above of=H-1, node distance=1cm] (hl) {Hidden layer};
\node[annot,left of=hl] {Input layer};
\node[annot,right of=hl] {Output layer};
\end{tikzpicture}
% End of code
\end{document}
```

24. [**Quiver**](https://github.com/keplr-io/quiver)

![gzqll3](https://cloud.githubusercontent.com/assets/5866348/20253975/f3d56f14-a9e4-11e6-9693-9873a18df5d3.gif)
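
Quiver serves an interactive browser view of a Keras model's convolutional activations; a minimal launch sketch following the quiver_engine README (the image folder is a placeholder):

```python
from quiver_engine import server

# `model` is a trained Keras model; './imgs' is a folder of input images to explore with.
server.launch(model, input_folder='./imgs')
```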

**References :**

1) https://datascience.stackexchange.com/questions/12851/how-do-you-visualize-neural-network-architectures

2) https://datascience.stackexchange.com/questions/2670/visualizing-deep-neural-network-training