Neural Networks
Our workflow should be like that of Keras, not like the present one. Here is the Sequential model:
from keras.models import Sequential
model = Sequential()
Stacking layers is as easy as .add():
from keras.layers import Dense
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
Once your model looks good, configure its learning process with .compile():
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
(taken from the Keras README)
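For completeness, the same quickstart then trains and evaluates the model. This is a minimal sketch, not code from this repo; `x_train`, `y_train`, `x_test`, and `y_test` are placeholder NumPy arrays you supply:

```python
# Train the compiled model above.
# Assumed placeholder data: inputs of shape (n, 100), one-hot labels of shape (n, 10).
model.fit(x_train, y_train, epochs=5, batch_size=32)

# Evaluate on held-out data and get predictions for new inputs.
loss_and_metrics = model.evaluate(x_test, y_test, batch_size=128)
classes = model.predict(x_test, batch_size=128)
```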
Resources:
- cs231n Winter Lecture 4: Neural Networks 1
- Yes you should understand backprop
- A Hands-On Tutorial with Caffe
- https://github.com/dennybritz/nn-from-scratch/blob/master/nn-from-scratch.ipynb
- http://pages.cs.wisc.edu/~dpage/cs760/ANNs.pdf
- https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
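In the spirit of the nn-from-scratch notebook and Mazur's step-by-step example above, here is a minimal backprop sketch in plain NumPy. It assumes one hidden layer, sigmoid activations, squared-error loss, and toy XOR data; it is an illustration, not the code from any of those references:

```python
import numpy as np

np.random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy XOR data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters for a 2 -> 4 -> 1 network.
W1 = np.random.randn(2, 4) * 0.5
b1 = np.zeros((1, 4))
W2 = np.random.randn(4, 1) * 0.5
b2 = np.zeros((1, 1))

lr = 1.0
for step in range(10000):
    # Forward pass.
    z1 = X @ W1 + b1          # hidden pre-activation
    a1 = sigmoid(z1)          # hidden activations
    z2 = a1 @ W2 + b2         # output pre-activation
    a2 = sigmoid(z2)          # predictions

    loss = np.mean((a2 - y) ** 2)

    # Backward pass: apply the chain rule layer by layer.
    d_a2 = 2 * (a2 - y) / len(X)          # dL/da2
    d_z2 = d_a2 * a2 * (1 - a2)           # through the output sigmoid
    d_W2 = a1.T @ d_z2
    d_b2 = d_z2.sum(axis=0, keepdims=True)

    d_a1 = d_z2 @ W2.T
    d_z1 = d_a1 * a1 * (1 - a1)           # through the hidden sigmoid
    d_W1 = X.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Plain gradient-descent update.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print(round(loss, 4))   # the loss should have dropped substantially from its initial value
```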
- Larger batch sizes are good, but an absurdly large batch size means fewer weight updates per epoch (e.g. with 50,000 training examples, a batch size of 500 gives 100 updates per epoch, while 25,000 gives only 2). So don't be absurd.
TODO:
- Gradient check (see the sketch below) -
  https://imaddabbura.github.io/blog/machine%20learning/deep%20learning/2018/04/08/coding-neural-network-gradient-checking.html
  http://cs231n.github.io/optimization-1/#analytic
  https://youtu.be/i94OvYb6noo?list=PLkt2uSq6rBVctENoVBg1TpCC7OQi31AlC&t=170
  https://github.com/Kulbear/deep-learning-coursera/blob/master/Improving%20Deep%20Neural%20Networks%20Hyperparameter%20tuning%2C%20Regularization%20and%20Optimization/Gradient%20Checking.ipynb
- Weight initialization (see the sketch below) - cs231n Lec 5 NN
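A minimal numerical gradient-check sketch, in the style of the cs231n notes linked above. It assumes `f` is any scalar loss as a function of a parameter array; the hand-derived gradient here is for a toy quadratic loss, only to show the comparison:

```python
import numpy as np

def numerical_gradient(f, theta, eps=1e-5):
    """Centered-difference estimate of df/dtheta, one coordinate at a time."""
    grad = np.zeros_like(theta)
    it = np.nditer(theta, flags=['multi_index'], op_flags=['readwrite'])
    while not it.finished:
        idx = it.multi_index
        old = theta[idx]
        theta[idx] = old + eps
        f_plus = f(theta)
        theta[idx] = old - eps
        f_minus = f(theta)
        theta[idx] = old                      # restore the parameter
        grad[idx] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    return grad

# Example: check the gradient of a simple quadratic loss.
W = np.random.randn(3, 4)
loss = lambda W: np.sum(W ** 2)
analytic = 2 * W                              # hand-derived gradient of sum(W^2)
numeric = numerical_gradient(loss, W.copy())

# Relative error should be around 1e-7 or smaller for a correct gradient.
rel_err = np.abs(analytic - numeric) / np.maximum(1e-8, np.abs(analytic) + np.abs(numeric))
print(rel_err.max())
```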
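And a short sketch of the initialization schemes covered in cs231n Lecture 5 (Xavier/Glorot and He scaling); the function names and the `fan_in`/`fan_out` parameters are chosen here for illustration:

```python
import numpy as np

def xavier_init(fan_in, fan_out):
    # Xavier/Glorot scaling: keeps activation variance roughly constant for tanh-like units.
    return np.random.randn(fan_in, fan_out) / np.sqrt(fan_in)

def he_init(fan_in, fan_out):
    # He et al. scaling: the variant recommended for ReLU units.
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)

# e.g. the first Dense layer of the Keras model above (100 inputs -> 64 units)
W1 = he_init(100, 64)
```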