The assignment was to create a neural network from scratch. I chose to implement the net in Python, using the numpy package for matrix manipulations. My net achieved ~98% accuracy on the MNIST dataset of handwritten digit images.

For the assignment, I derived the backpropagation calculations for a net with layers of size 784 ➡️ 200 ➡️ 10, using tanh as the hidden-layer activation and sigmoid at the output.
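
A minimal numpy sketch of that forward pass (the parameter names and initialization scale are my own choices, not from the assignment):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Illustrative parameters for the 784 -> 200 -> 10 architecture
W1 = rng.normal(0.0, 0.01, (200, 784))
b1 = np.zeros((200, 1))
W2 = rng.normal(0.0, 0.01, (10, 200))
b2 = np.zeros((10, 1))

def forward(x):
    # x: column vector of 784 pixel intensities
    a1 = np.tanh(W1 @ x + b1)     # hidden layer: tanh
    a2 = sigmoid(W2 @ a1 + b2)    # output layer: sigmoid
    return a1, a2

x = rng.random((784, 1))
a1, a2 = forward(x)
```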

To extend the assignment, I derived and implemented a general backpropagation algorithm that works with any specified number of hidden layers and units. I also implemented gradient checking via finite differences to verify the backpropagation gradients during debugging.
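
The generalized version can be sketched roughly as follows — here with a squared-error loss, tanh on hidden layers, and sigmoid at the output, matching the setup above; the function names and initialization are illustrative, not the actual assignment code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(sizes, rng):
    # sizes, e.g. [784, 200, 10]; one (W, b) pair per layer
    return [(rng.normal(0.0, 0.01, (m, n)), np.zeros((m, 1)))
            for n, m in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    activations = [x]
    for k, (W, b) in enumerate(params):
        z = W @ activations[-1] + b
        # tanh on hidden layers, sigmoid on the output layer
        a = sigmoid(z) if k == len(params) - 1 else np.tanh(z)
        activations.append(a)
    return activations

def backprop(params, x, y):
    # Squared-error loss; returns per-layer (dW, db) gradients
    acts = forward(params, x)
    grads = []
    a_out = acts[-1]
    delta = (a_out - y) * a_out * (1 - a_out)   # sigmoid derivative at output
    for k in range(len(params) - 1, -1, -1):
        grads.append((delta @ acts[k].T, delta))
        if k > 0:
            W, _ = params[k]
            delta = (W.T @ delta) * (1 - acts[k] ** 2)  # tanh derivative
    return grads[::-1]

rng = np.random.default_rng(0)
params = init_params([784, 200, 10], rng)
x = rng.random((784, 1))
y = np.zeros((10, 1)); y[3] = 1.0
grads = backprop(params, x, y)
```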

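Gradient checking compares each analytic partial derivative against a central finite difference of the loss. A toy single-layer version of the idea (the loss and variable names here are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy setup: loss = 0.5 * ||sigmoid(W x) - y||^2
rng = np.random.default_rng(1)
W = rng.normal(0.0, 0.1, (3, 4))
x = rng.random((4, 1))
y = rng.random((3, 1))

def loss(W):
    return 0.5 * np.sum((sigmoid(W @ x) - y) ** 2)

def analytic_grad(W):
    a = sigmoid(W @ x)
    delta = (a - y) * a * (1 - a)   # chain rule through the sigmoid
    return delta @ x.T

def numeric_grad(W, eps=1e-6):
    # Central difference, one entry of W at a time
    g = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            Wp, Wm = W.copy(), W.copy()
            Wp[i, j] += eps
            Wm[i, j] -= eps
            g[i, j] = (loss(Wp) - loss(Wm)) / (2 * eps)
    return g

diff = np.max(np.abs(analytic_grad(W) - numeric_grad(W)))
```

If the backprop derivation is correct, `diff` should be tiny (on the order of the finite-difference error, well below 1e-6); a large value points to a bug in the analytic gradient.
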
Here is the hw6 specification.