Function Shaping in Deep Learning
Ēvalds Urtāns

01.12.2021. 14:30

Agris Ņikitenko

Jānis Grabis, William Sayers, Houxiang Zhang

The design of loss functions for deep learning methods is attracting growing attention because empirically found loss functions have achieved better results than commonly used loss functions derived analytically from mathematical theory. This work describes the importance of loss functions and related methods for deep reinforcement learning and deep metric learning. A novel MDQN loss function outperformed the DDQN loss function in PLE computer game environments, and a novel Exponential Triplet loss function outperformed the standard Triplet loss function on the face re-identification task with the VGGFace2 dataset, reaching 85.7% accuracy in a zero-shot setting. This work also presents a novel UNet-RNN-Skip model that improves the performance of the value function for path planning tasks. It produces the same policy outcome as the Value Iteration algorithm in 99.8% of cases and, although trained on 32x32 maps, can be applied to larger maps such as 256x256. The proposed approaches have been applied in multiple commercial applications for voice and face re-identification, audio signal denoising, and chromatography.
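For context, the standard Triplet loss against which the proposed Exponential Triplet loss is compared is commonly defined as follows (the exponential variant itself is specified in the thesis body; the symbols below follow the usual convention, not the thesis notation):

```latex
% Standard triplet loss for anchor a, positive p, negative n,
% embedding network f, and margin \alpha:
\mathcal{L}_{\text{triplet}} =
  \max\!\bigl(\lVert f(a) - f(p) \rVert_2^{2}
            - \lVert f(a) - f(n) \rVert_2^{2}
            + \alpha,\; 0\bigr)
```

It pulls the anchor-positive distance below the anchor-negative distance by at least the margin \(\alpha\), which is the baseline behavior the exponential reshaping modifies.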

Deep Learning, Triplet Loss, Deep Metric Learning, Deep Reinforcement Learning, Loss Function, Zero-Shot Learning

Urtāns, Ēvalds. Function Shaping in Deep Learning. PhD Thesis. Rīga: [RTU], 2021. 167 p.

Publication language: English (en)
The Scientific Library of the Riga Technical University.
Phone: +371 28399196