I'm a senior applied scientist at Oracle, where I work on applying large language models to challenging problems in healthcare. Previously, I worked at Amazon, developing a foundation model for 3D computer vision. I completed my PhD at Victoria University of Wellington (VUW) in New Zealand, where my research focused on meta-learning loss functions for deep neural networks. My current research interests include meta-learning, meta-optimization, hyperparameter optimization, few-shot learning, and continual learning. For students with prior publication experience in meta-learning or related areas, I am open to co-supervising master's and PhD projects through VUW with Dr Qi Chen and Prof Bing Xue. If you are interested, please do not hesitate to contact me.
- Oracle - Health and AI
- Melbourne, Australia
- decadz.github.io/
Pinned
- Evolved-Model-Agnostic-Loss (Python): PyTorch code for the EvoMAL algorithm presented in "Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning" (TPAMI-2023). Paper link: https://arxiv.org/abs/2209.08907
- Sparse-Label-Smoothing-Regularization (Python): PyTorch code for Sparse Label Smoothing Regularization presented in "Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning" (TPAMI-2023). Paper link: https://arxiv.org/abs/2209.08907
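As general background for the label-smoothing repository above: standard label smoothing regularizes a classifier by blending the one-hot target with the uniform distribution over the K classes, so the model is never pushed toward fully confident predictions. A minimal plain-Python sketch of the standard technique follows; the repository's sparse variant differs in how the smoothing mass is distributed, and `smooth_labels` is a hypothetical helper, not the repository's API.

```python
def smooth_labels(one_hot, epsilon=0.1):
    """Standard label smoothing: blend a one-hot target vector with
    the uniform distribution over its K classes.

    Each entry becomes (1 - epsilon) * y + epsilon / K, so the true
    class keeps most of the probability mass while every other class
    receives a small uniform share.
    """
    k = len(one_hot)
    return [(1.0 - epsilon) * y + epsilon / k for y in one_hot]

# A 4-class one-hot target with smoothing factor 0.1:
# true class gets 0.9 + 0.1/4 = 0.925, others get 0.1/4 = 0.025.
smoothed = smooth_labels([0, 0, 1, 0], epsilon=0.1)
print(smoothed)  # approximately [0.025, 0.025, 0.925, 0.025]
```

The smoothed vector still sums to 1, so it remains a valid target distribution for a cross-entropy loss.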
- Meta-Learning-Literature-Overview: List of AI/ML papers related to my thesis, "Meta-Learning Loss Functions for Deep Neural Networks". Thesis link: https://arxiv.org/abs/2406.09713
- Genetic-Programming-with-Rademacher-Complexity (Python): Python code for the GP-RC algorithm presented in "Genetic Programming with Rademacher Complexity for Symbolic Regression" (CEC-2019). Paper link: https://ieeexplore.ieee.org/document/8790341