I got into Machine Learning in January 2024, and it’s been a constant process of learning, experimenting, and building ever since. I started with Andrew Ng’s CS229 lecture series, worked on a few Kaggle projects, and took courses like ML in IoT, Privacy-Preserving ML, and Data Mining as part of my master’s coursework.
One of the most exciting parts of my journey has been my master’s project, which started as a custom RNN in Scala and eventually grew into building an autograd system from scratch. I enjoy implementing things myself to deepen my understanding, writing my own autograd in Scala rather than just relying on PyTorch’s Autograd.
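To give a flavor of what “autograd from scratch” means, here is a minimal reverse-mode sketch in Scala. The `Value` class and its API are purely illustrative assumptions for this post, not the actual project code:

```scala
// Toy reverse-mode autograd node: stores a scalar plus, for each parent,
// the local derivative d(this)/d(parent). Illustration only.
class Value(val data: Double, val parents: Seq[(Value, Double)] = Seq.empty) {
  var grad: Double = 0.0 // accumulates d(output)/d(this) during backward

  def +(that: Value): Value =
    new Value(data + that.data, Seq((this, 1.0), (that, 1.0)))

  def *(that: Value): Value =
    new Value(data * that.data, Seq((this, that.data), (that, this.data)))

  // Chain rule: push the upstream gradient to each parent, scaled by the
  // local derivative. (A real system would topologically sort the graph
  // and visit each node once instead of recursing path-by-path.)
  def backward(upstream: Double = 1.0): Unit = {
    grad += upstream
    for ((p, local) <- parents) p.backward(upstream * local)
  }
}
```

For example, with `x = 2` and `y = 3`, building `z = x * y + x` and calling `z.backward()` accumulates `x.grad = 4.0` (that is, y + 1) and `y.grad = 2.0`.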
Right now, I’m diving into Deep Learning & NLP, working through Stanford CS224N and the Deep Learning Specialization to strengthen my understanding of modern architectures. I’ve also been documenting my learning process by taking CS224N notes and solving assignments on my own.
Lately, I’ve been exploring Federated Sustainable Learning through research with a team at UGA. We’re currently surveying the carbon footprint of different models on Jetson devices, which has been especially interesting.
💬 Always happy to chat about ML, NLP, and Federated Learning! Open to collaborations, discussions, and new ideas—feel free to reach out!