Exponential Triplet Loss
            
ICCDA 2020: Proceedings of the 2020 the 4th International Conference on Compute and Data Analysis
2020

Ēvalds Urtāns, Agris Ņikitenko, Valters Vēciņš
            
            
This paper introduces a novel variant of the Triplet Loss function that converges faster and gives better results. The function separates class instances homogeneously throughout the whole embedding space. Alongside the Exponential Triplet Loss function, we also introduce a novel type of embedding-space regularization, Unit-Range and Unit-Bounce, which utilizes Euclidean space more efficiently and resembles features of the cosine distance. We also examine factors for choosing the best embedding vector size for specific embedding spaces. Finally, we demonstrate how the new function can train models for one-shot learning and re-identification tasks.
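For context, the baseline hinge-based triplet loss that the paper's exponential variant builds on can be sketched as below. This is only the standard formulation; the exact exponential loss and the Unit-Range/Unit-Bounce regularizers are defined in the paper itself and are not reproduced here.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: pull the anchor toward the positive sample
    and push it away from the negative sample by at least `margin`
    (squared Euclidean distance). This is the classic baseline, not the
    paper's exponential variant."""
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)  # distance to same-class sample
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)  # distance to other-class sample
    return np.maximum(d_pos - d_neg + margin, 0.0)     # hinge: zero once separated

# Toy embeddings (batch of 2, embedding size 3) for illustration only
a = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])  # anchors
p = np.array([[0.0, 0.1, 0.9], [0.9, 0.1, 0.0]])  # positives (near anchors)
n = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # negatives (far from anchors)
loss = triplet_loss(a, p, n)  # already-separated triplets give zero loss
```

The hinge means well-separated triplets contribute nothing to the gradient; the paper's exponential formulation changes this saturation behaviour to speed up convergence.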
            
            
            
Keywords: Feature embedding, Identification, One-shot learning, Re-identification, Sample mining, Triplet loss

DOI: 10.1145/3388142.3388163

Hyperlink: https://dl.acm.org/doi/10.1145/3388142.3388163
            
            Urtāns, Ē., Ņikitenko, A., Vēciņš, V. Exponential Triplet Loss. In: ICCDA 2020: Proceedings of the 2020 the 4th International Conference on Compute and Data Analysis, United States of America, San Jose, 9-12 March, 2020. New York: ACM, 2020, pp.152-158. ISBN 978-145037644-0. Available from: doi:10.1145/3388142.3388163
            
Publication language: English (en)