Haigen Hu, Xiaoyuan Wang, Yan Zhang, Qi Chen, Qiu Guan
{"title":"对比学习综合调查","authors":"Haigen Hu , Xiaoyuan Wang , Yan Zhang , Qi Chen , Qiu Guan","doi":"10.1016/j.neucom.2024.128645","DOIUrl":null,"url":null,"abstract":"<div><div>Contrastive Learning is self-supervised representation learning by training a model to differentiate between similar and dissimilar samples. It has been shown to be effective and has gained significant attention in various computer vision and natural language processing tasks. In this paper, we comprehensively and systematically sort out the main ideas, recent developments and application areas of contrastive learning. Specifically, we firstly provide an overview of the research activity of contrastive learning in recent years. Secondly, we describe the basic principles and summarize a universal framework of contrastive learning. Thirdly, we further introduce and discuss the latest advances of each functional component in detail, including data augmentation, positive/negative samples,network structure, and loss function. Finally, we summarize contrastive learning and discuss the challenges, future research trends and development directions in the area of contrastive learning.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"610 ","pages":"Article 128645"},"PeriodicalIF":5.5000,"publicationDate":"2024-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A comprehensive survey on contrastive learning\",\"authors\":\"Haigen Hu , Xiaoyuan Wang , Yan Zhang , Qi Chen , Qiu Guan\",\"doi\":\"10.1016/j.neucom.2024.128645\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Contrastive Learning is self-supervised representation learning by training a model to differentiate between similar and dissimilar samples. It has been shown to be effective and has gained significant attention in various computer vision and natural language processing tasks. In this paper, we comprehensively and systematically sort out the main ideas, recent developments and application areas of contrastive learning. Specifically, we firstly provide an overview of the research activity of contrastive learning in recent years. Secondly, we describe the basic principles and summarize a universal framework of contrastive learning. Thirdly, we further introduce and discuss the latest advances of each functional component in detail, including data augmentation, positive/negative samples,network structure, and loss function. 
Finally, we summarize contrastive learning and discuss the challenges, future research trends and development directions in the area of contrastive learning.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"610 \",\"pages\":\"Article 128645\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-09-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231224014164\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224014164","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Contrastive learning is a self-supervised representation learning approach that trains a model to distinguish between similar and dissimilar samples. It has proven effective and has attracted significant attention across a variety of computer vision and natural language processing tasks. In this paper, we comprehensively and systematically review the main ideas, recent developments, and application areas of contrastive learning. Specifically, we first provide an overview of research activity in contrastive learning in recent years. Second, we describe its basic principles and summarize a universal framework for contrastive learning. Third, we introduce and discuss the latest advances in each functional component in detail, including data augmentation, positive/negative samples, network structure, and the loss function. Finally, we summarize contrastive learning and discuss the challenges, future research trends, and development directions in the area.
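To make the framework described in the abstract concrete (two augmented views of the same sample form a positive pair, all other samples in the batch act as negatives, an encoder maps views to embeddings, and a contrastive loss pulls positives together while pushing negatives apart), the following is a minimal sketch of an NT-Xent-style contrastive loss in PyTorch. The function name, temperature value, and toy inputs are illustrative assumptions, not a reference implementation from the survey.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """Minimal NT-Xent (normalized temperature-scaled cross-entropy) sketch.

    z1, z2: embeddings of two augmented views of the same batch, shape (N, D).
    Each (z1[i], z2[i]) is a positive pair; every other embedding in the
    2N-sample batch serves as a negative.
    """
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit-norm rows
    sim = z @ z.t() / temperature                         # cosine-similarity logits
    n = z1.size(0)
    # Mask self-similarity so a sample is never treated as its own negative.
    sim.fill_diagonal_(float("-inf"))
    # For row i, the positive is the other augmented view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage: random tensors standing in for encoder outputs.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```

In practice the data augmentation, encoder/projection network, and choice of negatives would each be tuned per task; this sketch only shows how the loss ties those components together.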
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The theory, practice, and applications of neurocomputing are the essential topics covered.