
Showing posts from November, 2025

Week 13 & 14

Coursework Neural Networks and Deep Learning: This week was basically a deep-learning bootcamp. We started with the fundamentals in TensorFlow/Keras: how the Sequential API works, how layers stack, and why activation functions matter. I built an ANN on MNIST with exactly three dense layers, a Flatten layer, and dropout. Then I visualized everything: raw images, the training/testing split, the first and last epochs, loss/accuracy curves, and a final confusion matrix to see where the model stumbled. Check out the code I wrote here!

After that, we moved into CNNs. This part really forced me to understand image shapes, filters, cross-correlation, max pooling, and flattening by illustrating every step myself, which made the CNN architecture much easier to understand. Check out the code I wrote here!

Next came RNNs and LSTMs. I trained both on a sequential/text dataset and compared how they handled dependencies. The LSTM's ability to remember longer patterns felt very real once I saw the ...
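As a rough sketch, the MNIST ANN described above could look like the snippet below. The exact unit counts (128/64) and the 0.2 dropout rate are my assumptions, not the original coursework code; the structure (Flatten, three dense layers, dropout) follows the description.

```python
import numpy as np
import tensorflow as tf

# Sequential API: layers stack in order, each feeding the next.
# Layer sizes and dropout rate here are illustrative assumptions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),                   # one MNIST image
    tf.keras.layers.Flatten(),                        # 28x28 -> 784 vector
    tf.keras.layers.Dense(128, activation="relu"),    # dense layer 1
    tf.keras.layers.Dropout(0.2),                     # randomly zero units while training
    tf.keras.layers.Dense(64, activation="relu"),     # dense layer 2
    tf.keras.layers.Dense(10, activation="softmax"),  # dense layer 3: one prob per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Sanity check on random data shaped like MNIST images
probs = model.predict(np.random.rand(2, 28, 28), verbose=0)
print(probs.shape)  # (2, 10): a probability distribution per input image
```

Training would then be a call to `model.fit` on the MNIST training split, with the history object supplying the loss/accuracy curves.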
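The CNN building blocks mentioned above (cross-correlation, max pooling, flattening) can be illustrated step by step in plain NumPy. This is a teaching sketch, not the coursework code; the tiny image and the `[1, -1]` difference filter are made-up examples.

```python
import numpy as np

def cross_correlate2d(image, kernel):
    """'Valid' cross-correlation: slide the kernel over the image and
    multiply-and-sum at each position (what a Conv2D layer computes)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling: keep the largest value in each
    size x size block, shrinking the feature map."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]          # trim to a multiple of size
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
edge = np.array([[1.0, -1.0]])                    # horizontal difference filter
fmap = cross_correlate2d(image, edge)             # feature map, shape (4, 3)
pooled = max_pool2d(fmap)                         # shape (2, 1)
flat = pooled.ravel()                             # flattened for the dense head
```

Following one array's shape through these three functions is exactly the shape bookkeeping that makes a CNN architecture click.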
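The RNN-vs-LSTM difference comes down to a single step of each cell. Below is a minimal NumPy sketch (not the coursework code, and with randomly initialized weights for illustration): the vanilla RNN squashes everything into one hidden state each step, while the LSTM's forget/input/output gates control a separate cell state, which is what lets it hold onto longer-range patterns.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rnn_step(x, h, Wx, Wh, b):
    """Vanilla RNN: the new state is one squashed mix of input and old state."""
    return np.tanh(x @ Wx + h @ Wh + b)

def lstm_step(x, h, c, W, U, b):
    """LSTM: gates decide what the cell state c forgets, admits, and exposes."""
    z = x @ W + h @ U + b                 # all four gate pre-activations at once
    f, i, o, g = np.split(z, 4, axis=-1)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    c_new = f * c + i * np.tanh(g)        # gated update of the long-term cell state
    h_new = o * np.tanh(c_new)            # hidden state exposed to the next layer
    return h_new, c_new

rng = np.random.default_rng(0)
d_in, d_h = 3, 4                          # toy sizes
x = rng.normal(size=(1, d_in))
h = np.zeros((1, d_h))
c = np.zeros((1, d_h))

h_rnn = rnn_step(x, h,
                 rng.normal(size=(d_in, d_h)),
                 rng.normal(size=(d_h, d_h)),
                 np.zeros(d_h))
h_lstm, c_lstm = lstm_step(x, h, c,
                           rng.normal(size=(d_in, 4 * d_h)),
                           rng.normal(size=(d_h, 4 * d_h)),
                           np.zeros(4 * d_h))
```

Because the forget gate `f` can stay near 1, `c` can carry information across many steps mostly unchanged, whereas the RNN's state gets rewritten by `tanh` every step.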

Week 11 & 12

The last two weeks were mostly midterms and a lot of learning in class.