I am a PhD student in the Machine Learning group at the University of Waikato. I'm mostly interested in improving the general state of deep learning, with a particular focus on applications relevant to robot vision.
MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes
(Accepted at ECML-PKDD 2018)
An extension of the Lipschitz continuity-based regularisation scheme that uses a data-driven approximation to Lipschitz continuity. We also present the SINS-10 dataset in this paper, which is intended to make significance testing easier when comparing convolutional neural network classifiers.
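The core idea of a data-driven approximation to Lipschitz continuity can be illustrated with a minimal sketch: estimate a linear layer's "gain" as the largest ratio of output norm to input norm observed on a batch, then rescale the weights so that gain stays within a budget. All names and details here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def constrain_gain(W, X, max_gain=1.0):
    """Illustrative data-driven constraint (assumed formulation, not the
    paper's exact method): estimate the gain of the linear map x -> W x
    as the largest ratio of output to input norms over the batch X, then
    rescale W so the estimated gain does not exceed max_gain."""
    Y = X @ W.T                                   # batch of layer outputs
    in_norms = np.maximum(np.linalg.norm(X, axis=1), 1e-12)
    out_norms = np.linalg.norm(Y, axis=1)
    gain = (out_norms / in_norms).max()           # empirical gain estimate
    if gain > max_gain:
        W = W * (max_gain / gain)                 # project back into budget
    return W
```

Because the estimate is taken over real activations rather than the worst case over all inputs, it is typically much less conservative than an analytic Lipschitz bound.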
Regularisation of Neural Networks by Enforcing Lipschitz Continuity
This paper presents a technique for computing the Lipschitz constant of a network, and of each individual layer, and shows that constraining these constants acts as an effective regularisation technique. Results on standard image classification datasets show that it is competitive with other methods, such as dropout and batch normalisation. Moreover, it can be combined with these methods to achieve greater performance.
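For a fully connected layer under the 2-norm, the layer's Lipschitz constant is the spectral norm of its weight matrix, which can be estimated with power iteration and then constrained by a projection step after each gradient update. The sketch below shows this idea under those assumptions; function names and the choice of projection are mine, not necessarily the paper's exact procedure.

```python
import numpy as np

def spectral_norm(W, n_iters=50, seed=0):
    """Estimate the largest singular value of W (the Lipschitz constant
    of x -> W x under the 2-norm) via power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(W.shape[1])
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

def project_lipschitz(W, lam):
    """Assumed projection step: rescale W so its estimated spectral
    norm does not exceed the budget lam, leaving it unchanged when it
    is already within the constraint."""
    sigma = spectral_norm(W)
    if sigma > lam:
        W = W * (lam / sigma)
    return W
```

Applying `project_lipschitz` to every layer's weights after each optimiser step keeps the product of per-layer constants, and hence a bound on the whole network's Lipschitz constant, under control.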