Tanveer, Muhammad and Tan, Hung-Khoon and Ng, Hui-Fuang and Leung, Maylor Karhang and Chuah, Joon Huang (2021) Regularization of deep neural network with batch contrastive loss. IEEE Access, 9. pp. 124409-124418. ISSN 2169-3536, DOI https://doi.org/10.1109/ACCESS.2021.3110286.
Full text not available from this repository.

Abstract
Neural networks have become deeper in recent years, improving their capacity to handle more complex tasks. However, deeper networks have more parameters and overfit more easily, especially when training samples are insufficient. In this paper, we present a new regularization technique called batch contrastive regularization to improve generalization performance. The loss function is based on contrastive loss, which enforces intra-class compactness and inter-class separability of batch samples. We explore three different contrastive losses: (1) the center contrastive loss, which regularizes based on distances between data points and their corresponding class centroids; (2) the sample contrastive loss, which is based on batch sample-pair distances; and (3) the multicenter loss, which is similar to the center contrastive loss except that the cluster centers are discovered during training. The proposed network has two heads, one for classification and the other for regularization; the regularization head is discarded during inference. We also introduce bag sampling to ensure that all classes in a batch are well represented. The performance of the proposed architecture is evaluated on the CIFAR-10 and CIFAR-100 datasets. Our experiments show that networks regularized by batch contrastive loss display impressive generalization performance over a wide variety of classes, yielding more than 11% improvement for ResNet50 on CIFAR-100 when trained from scratch.
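The center-level idea described in the abstract can be sketched as follows. This is not the authors' implementation (the full text is unavailable here) but a minimal pure-Python illustration under stated assumptions: per-batch class centroids, a mean-squared pull of each sample toward its own centroid, and a hinge push that keeps centroids of different classes at least `margin` apart. The function name and `margin` parameter are hypothetical.

```python
from collections import defaultdict

def center_contrastive_loss(embeddings, labels, margin=1.0):
    """Sketch of a center-level contrastive loss on one batch.

    Pulls each embedding toward its batch class centroid and pushes
    centroids of different classes at least `margin` apart.
    """
    # Group embeddings by class and compute per-batch centroids.
    groups = defaultdict(list)
    for emb, lab in zip(embeddings, labels):
        groups[lab].append(emb)
    centroids = {
        lab: [sum(dim) / len(vecs) for dim in zip(*vecs)]
        for lab, vecs in groups.items()
    }

    # Intra-class compactness: mean squared distance to own centroid.
    pull = sum(
        sum((e - c) ** 2 for e, c in zip(emb, centroids[lab]))
        for emb, lab in zip(embeddings, labels)
    ) / len(embeddings)

    # Inter-class separability: hinge on pairwise centroid distances.
    classes = sorted(centroids)
    push, pairs = 0.0, 0
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            dist = sum((a - b) ** 2
                       for a, b in zip(centroids[ci], centroids[cj])) ** 0.5
            push += max(0.0, margin - dist) ** 2
            pairs += 1
    return pull + (push / pairs if pairs else 0.0)
```

For a batch with two tight, well-separated classes the loss is zero; overlapping classes incur both pull and push penalties. In the paper's architecture this term would be computed on the regularization head's embeddings and added to the classification loss during training only.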
| Item Type: | Article |
|---|---|
| Funders: | Fundamental Research Grant Scheme (FRGS) through the Ministry of Higher Education (MOHE) of Malaysia (FRGS/1/2018/ICT02/UTAR/02/03) |
| Uncontrolled Keywords: | Training; Task analysis; Neural networks; Neurons; Licenses; Network architecture; Head; Batch contrastive loss; batch regularization; center-level contrastive loss; sample-level contrastive loss; multicenter loss; neural network |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; T Technology > TA Engineering (General). Civil engineering (General) |
| Divisions: | Faculty of Engineering > Department of Electrical Engineering |
| Depositing User: | Ms Zaharah Ramly |
| Date Deposited: | 25 Jul 2022 04:06 |
| Last Modified: | 25 Jul 2022 04:06 |
| URI: | http://eprints.um.edu.my/id/eprint/28105 |