陈俊帆
Paper
DualCL: Principled Supervised Contrastive Learning as Mutual Information Maximization for Text Classification
Posted: 2025-10-22
Venue: Proceedings of the ACM on Web Conference 2024 (WWW), CCF-A
Abstract: Text classification is a fundamental task in web content mining. Although the existing supervised contrastive learning (SCL) approach combined with pre-trained language models (PLMs) has achieved leading performance in text classification, it lacks fundamental principles. Theoretically motivated by a derived lower bound of mutual information maximization, we propose a dual contrastive learning framework, DualCL, that satisfies three properties: it is parameter-free, augmentation-easy, and label-aware. DualCL generates classifier parameters from the PLM and simultaneously uses them for classification and as augmented views of the input text for supervised contrastive learning. Extensive experiments demonstrate that DualCL learns superior text representations and consistently outperforms baseline models.
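The abstract frames SCL as mutual-information maximization. The underlying supervised contrastive objective can be sketched as below, as a minimal NumPy illustration over toy 2-D embeddings; this is a generic SupCon-style loss, not the paper's actual DualCL implementation, in which the augmented views are classifier parameters generated by the PLM.

```python
import numpy as np

def supcon_loss(z, labels, tau=0.1):
    """SupCon-style loss: each anchor attracts same-label samples and
    repels the rest, averaged over anchors that have a positive."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize rows
    sim = (z @ z.T) / tau                              # scaled cosine similarities
    n = len(labels)
    total, anchors = 0.0, 0
    for i in range(n):
        others = [j for j in range(n) if j != i]
        pos = [j for j in others if labels[j] == labels[i]]
        if not pos:
            continue
        log_denom = np.log(np.sum(np.exp(sim[i, others])))  # softmax normalizer
        total += -np.mean([sim[i, j] - log_denom for j in pos])
        anchors += 1
    return total / anchors

# Toy embeddings: two tight clusters, one per class.
z = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
good = supcon_loss(z, [0, 0, 1, 1])   # labels match the clusters
bad = supcon_loss(z, [0, 1, 0, 1])    # labels split the clusters
```

When the labels agree with the geometry of the embedding space the loss is small; mislabeling the same points raises it, which is the signal the contrastive objective provides during training.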
Co-authors: 陈俊帆, 张日崇, Yaowei Zheng, Qianben Chen, 胡春明, Yongyi Mao
Paper type: International academic conference
Pages: 4362-4371
Publication date: 2024-01-01