
Neural Network From Scratch

I challenged myself to write an artificial neural network from scratch that supports fully connected layers with ReLU, Sigmoid, and Linear activation functions. It's unoptimized and runs slowly, but it works fine.

Here is a note I made about backpropagation. Feel free to check it out if you'd like.

The most interesting part of this project was figuring out how to implement backpropagation. It felt like solving a competitive programming (CP) problem, which was quite fun. I never thought I'd be implementing topological sort in a project. I guess CP wasn't a waste after all! :D
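
To illustrate why topological sort shows up in backpropagation, below is a minimal sketch of reverse-mode autodiff on a scalar computation graph. This is an illustration of the general technique, not this repository's actual code; the Node class and its operators are made up for the example.

class Node:
    # A scalar value in a computation graph that remembers its parents
    # and how to push its gradient back to them.
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents
        self._backward = lambda: None  # leaves have nothing to propagate

    def __add__(self, other):
        out = Node(self.value + other.value, (self, other))
        def backward():
            self.grad += out.grad   # d(a + b)/da = 1
            other.grad += out.grad  # d(a + b)/db = 1
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Node(self.value * other.value, (self, other))
        def backward():
            self.grad += other.value * out.grad  # d(a * b)/da = b
            other.grad += self.value * out.grad  # d(a * b)/db = a
        out._backward = backward
        return out

    def backprop(self):
        # Topologically sort the graph so each node is processed only
        # after every node that depends on it has been processed.
        order, visited = [], set()
        def visit(node):
            if node not in visited:
                visited.add(node)
                for parent in node.parents:
                    visit(parent)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            node._backward()

x, y = Node(2.0), Node(3.0)
z = x * y + x           # z = xy + x
z.backprop()
print(x.grad, y.grad)   # 4.0 2.0, since dz/dx = y + 1 and dz/dy = x

Processing nodes in reversed topological order guarantees each node's gradient is fully accumulated before it is propagated to its parents, a property that plain recursion does not give you when a node has multiple consumers.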

Instructions

I'm not sure who will use it, but if you're interested, here are the instructions.

  1. Import the neural network, the layers, and the loss functions.
from layers.fc_layer import FC_Layer
from loss.squared_error import squared_error
from loss.binary_crossentropy import binary_crossentropy
from neural_network import NN
from layers.input_layer import Input_Layer
  2. Define the neural network.
nn = NN([
    Input_Layer(1),        # one input feature
    FC_Layer(5, 'relu'),   # hidden layer with 5 units
    FC_Layer(5, 'relu'),   # hidden layer with 5 units
    FC_Layer(1, 'relu')    # output layer (see the note after step 4)
], loss_fn=squared_error, learning_rate=0.0001)
  3. Train it.
X = [[i * 1.0] for i in range(1, 20)]  # inputs: 1.0 through 19.0
y = [i * 5.0 for i in range(1, 20)]    # targets: y = 5x

for _ in range(100):  # 100 passes over the training data
    nn.fit(X, y)
  4. Predict.
nn.predict(X)
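
One note on the network defined in step 2: a 'relu' output layer can only produce non-negative values. That is harmless here because the targets y = 5x are all positive, but 'linear' is the usual output activation for general regression. To sanity-check training, you can compare predictions against the targets; this snippet assumes predict returns one prediction per input row, which the steps above don't confirm:

predictions = nn.predict(X)
for x_row, target, prediction in zip(X, y, predictions):
    print(x_row[0], target, prediction)  # input, expected 5x, network output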

Binary Handwritten Digit Classification Example

I also tried using the neural network to recognize the handwritten digits 1 and 0. The Jupyter notebook can be found here.
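
For reference, such a classifier could plausibly be assembled from the same pieces, using the Sigmoid activation and the binary_crossentropy loss imported in step 1. The sketch below is an assumption about how they fit together, not code from the notebook; image_vectors and labels are hypothetical placeholders for flattened pixel rows and 0/1 targets, and the 784-unit input assumes 28x28 images:

nn = NN([
    Input_Layer(784),        # hypothetical: one flattened 28x28 pixel image
    FC_Layer(16, 'relu'),    # small hidden layer
    FC_Layer(1, 'sigmoid')   # output in (0, 1), read as P(digit is 1)
], loss_fn=binary_crossentropy, learning_rate=0.001)

for _ in range(50):
    nn.fit(image_vectors, labels)  # hypothetical data: pixel rows, labels in {0, 1}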
