A Comprehensive Survey of Sentence Representations: From the BERT Epoch to the ChatGPT Era and Beyond

Abhinav Ramesh Kashyap, Thanh-Tung Nguyen, Viktor Schlegel, Stefan Winkler, See-Kiong Ng, Soujanya Poria

Main: Information Retrieval and Text Mining Oral Paper

Session 8: Information Retrieval and Text Mining (Oral)
Conference Room: Carlson
Conference Time: March 19, 16:00-17:30 (CET) (Europe/Malta)
Abstract: Sentence representations are a critical component in NLP applications such as retrieval, question answering, and text classification. They capture the meaning of a sentence, enabling machines to understand and reason over human language. In recent years, significant progress has been made in developing methods for learning sentence representations, including unsupervised, supervised, and transfer learning approaches. However, to date there has been no comprehensive literature review of sentence representations. In this paper, we provide an overview of the different methods for sentence representation learning, focusing mostly on deep learning models. We provide a systematic organization of the literature, highlighting the key contributions and challenges in this area. Overall, our review highlights the importance of this area in natural language processing, the progress made in sentence representation learning, and the challenges that remain. We conclude with directions for future research, suggesting potential avenues for improving the quality and efficiency of sentence representations.