Contrastive sentence representation learning with adaptive false negative cancellation
Article
Xu, Lingling, Xie, Haoran, Wang, Fu Lee, Tao, Xiaohui, Wang, Weiming and Li, Qing. 2024. "Contrastive sentence representation learning with adaptive false negative cancellation." Information Fusion. 102. https://doi.org/10.1016/j.inffus.2023.102065
Article Title | Contrastive sentence representation learning with adaptive false negative cancellation
---|---
ERA Journal ID | 20983
Article Category | Article
Authors | Xu, Lingling, Xie, Haoran, Wang, Fu Lee, Tao, Xiaohui, Wang, Weiming and Li, Qing
Journal Title | Information Fusion
Journal Citation | 102
Article Number | 102065
Number of Pages | 12
Year | 2024
Publisher | Elsevier
ISSN | 1566-2535; 1872-6305
Digital Object Identifier (DOI) | https://doi.org/10.1016/j.inffus.2023.102065
Web Address (URL) | https://www.sciencedirect.com/science/article/pii/S1566253523003810
Abstract | Contrastive sentence representation learning has made great progress thanks to a range of text augmentation strategies and hard negative sampling techniques. However, most studies directly employ in-batch samples as negative samples, ignoring the semantic relationship between negative samples and anchors, which may lead to negative sampling bias. To address this issue, we propose similarity and relative-similarity strategies for identifying potential false negatives. Moreover, we introduce adaptive false negative elimination and attraction methods to mitigate their adverse effects. Our proposed approaches can also be considered semi-supervised contrastive learning, as the identified false negatives can be viewed as either negative or positive samples for contrastive learning in adaptive false negative elimination and attraction methods. By fusing information from positive and negative pairs, contrastive learning learns rich and discriminative representations that capture the intrinsic characteristics of the sentence. Experimental results indicate that our proposed strategies and methods can bring further significant performance improvements. Specifically, the combination of similarity strategy and adaptive false negative elimination method achieves the best results, yielding an average performance gain of 2.1% compared to SimCSE in semantic textual similarity (STS) tasks. Furthermore, our approach is generalizable and can be applied to different text data augmentation strategies and certain existing contrastive sentence representation learning models. Our experimental code and data are publicly available at the link: https://github.com/Linda230/AFNC.
Keywords | Adaptive weight; Contrastive learning; Sentence representation learning; Negative sampling bias; Semi-supervised learning
Contains Sensitive Content | Does not contain sensitive content
ANZSRC Field of Research 2020 | 460508. Information retrieval and web search
Byline Affiliations | Hong Kong Metropolitan University, China; Lingnan University of Hong Kong, China; School of Mathematics, Physics and Computing; Hong Kong Polytechnic University, China
Permalink: https://research.usq.edu.au/item/z5vz6/contrastive-sentence-representation-learning-with-adaptive-false-negative-cancellation
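The abstract's core idea — treating in-batch negatives that are too similar to the anchor as potential false negatives and excluding them from the contrastive loss — can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the paper's method: the function name, the fixed similarity threshold, and the SimCSE-style InfoNCE framing are illustrative, and the paper's adaptive weighting and attraction variant are not reproduced.

```python
import numpy as np

def info_nce_with_fn_elimination(emb_a, emb_b, tau=0.05, fn_threshold=0.8):
    """Illustrative InfoNCE loss where in-batch negatives whose cosine
    similarity to the anchor exceeds `fn_threshold` are treated as
    suspected false negatives and masked out of the denominator.
    (Hypothetical sketch; the paper uses adaptive strategies, not a
    single fixed threshold.)"""
    # L2-normalise so dot products are cosine similarities.
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    cos = a @ b.T            # (N, N) cosine similarities, for the FN test
    sim = cos / tau          # temperature-scaled logits
    n = sim.shape[0]
    pos_mask = np.eye(n, dtype=bool)
    # Off-diagonal entries above the threshold are suspected false negatives.
    fn_mask = (cos > fn_threshold) & ~pos_mask
    # "Elimination": drop suspected false negatives from the denominator.
    logits = np.where(fn_mask, -np.inf, sim)
    log_den = np.log(np.sum(np.exp(logits), axis=1))
    # Cross-entropy with the diagonal (the augmented view) as the positive.
    loss = -(np.diag(sim) - log_den)
    return loss.mean()
```

Because elimination only removes non-negative terms from the denominator, the loss with a strict threshold is never larger than the loss with no elimination, which is one way to sanity-check an implementation.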