This project is dead. Maybe I'll touch it again in the future. Probably not.
This is a toy project for understanding a fully connected feed-forward neural network (FFNN) and for implementing the back-propagation algorithm. I constructed this FFNN using only matrix operations from NumPy (matrix products, vector dot products, vector outer products). It was important for me to gain a better mathematical understanding of the FFNN and the back-propagation algorithm before using better-optimized tools, e.g. TensorFlow, PyTorch, etc.
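The approach described above can be sketched roughly as follows. This is not the project's actual code, just a minimal illustration under assumed layer sizes (2-3-1), sigmoid activations, and a squared-error loss, showing how back-propagation and SGD reduce to NumPy matrix products and outer products:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical layer sizes for illustration: 2 inputs, 3 hidden units, 1 output.
W1 = rng.standard_normal((3, 2))
b1 = np.zeros(3)
W2 = rng.standard_normal((1, 3))
b2 = np.zeros(1)

def forward(x):
    a1 = sigmoid(W1 @ x + b1)   # hidden-layer activations (matrix product)
    a2 = sigmoid(W2 @ a1 + b2)  # output activation
    return a1, a2

def sgd_step(x, y, lr=0.5):
    """One stochastic-gradient-descent step on squared error."""
    global W1, b1, W2, b2
    a1, a2 = forward(x)
    # Back-propagation: error (delta) terms for each layer.
    d2 = (a2 - y) * a2 * (1 - a2)      # output-layer delta
    d1 = (W2.T @ d2) * a1 * (1 - a1)   # hidden-layer delta
    # Weight gradients are vector outer products; update in place.
    W2 -= lr * np.outer(d2, a1)
    b2 -= lr * d2
    W1 -= lr * np.outer(d1, x)
    b1 -= lr * d1

# Usage: a few thousand SGD passes over the XOR samples.
data = [(np.array([0., 0.]), 0.), (np.array([0., 1.]), 1.),
        (np.array([1., 0.]), 1.), (np.array([1., 1.]), 0.)]

def total_loss():
    return sum((forward(x)[1][0] - y) ** 2 for x, y in data)

loss_before = total_loss()
for _ in range(5000):
    for x, y in data:
        sgd_step(x, y)
loss_after = total_loss()
```

The deltas show the core idea: the output error is propagated backward through `W2.T`, and each weight gradient is the outer product of a layer's delta with the activations feeding into it.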
To update:
- Update indices in the back-propagation algorithm for Stochastic Gradient Descent (SGD) for improved readability
- Fully classify the FFNN following E. Fiesler's paper "Neural Network Classification and Formalization"