SENTIMENT ANALYSIS BASED ON USER REVIEWS IN E-COMMERCE SITE

International Journal of Computer Science (IJCS Journal) Published by SK Research Group of Companies (SKRGC) Scholarly Peer Reviewed Research Journals

Format: Volume 5, Issue 1, No 28, 2017

Copyright: All Rights Reserved ©2017

Year of Publication: 2017

Authors: G. Rajadurai, Mrs. D. Vinotha.

Reference: IJCS-293


Abstract

This paper proposes learning sentiment-specific word embeddings, dubbed sentiment embeddings. Existing word-embedding learning algorithms typically use only the contexts of words and ignore the sentiment of texts. This is problematic for sentiment analysis because words with similar contexts but opposite sentiment polarity, such as good and bad, are mapped to neighboring word vectors. We address this issue by encoding the sentiment information of texts (e.g., sentences and words) together with the contexts of words in sentiment embeddings. By combining context- and sentiment-level evidence, the nearest neighbors in the sentiment embedding space are not only semantically similar but also tend to share the same sentiment polarity. To learn sentiment embeddings effectively, we develop a number of neural networks with tailored loss functions and automatically collect massive texts carrying sentiment signals, such as emoticons, as training data. Sentiment embeddings can be used directly as word features for a variety of sentiment analysis tasks without feature engineering. We apply sentiment embeddings to word-level sentiment analysis, sentence-level sentiment classification, and building sentiment lexicons. Experimental results show that sentiment embeddings consistently outperform context-based embeddings on several benchmark datasets for these tasks.
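The combined objective sketched in the abstract — context-level evidence plus sentiment-level evidence — can be illustrated as a weighted sum of two losses. The function names, the margin of 1, and the trade-off weight `alpha` below are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def hinge_context_loss(score_true, score_corrupt):
    # Context-level ranking loss: the true (word, context) pair should
    # score at least a margin of 1 above a corrupted pair.
    return max(0.0, 1.0 - score_true + score_corrupt)

def sentiment_loss(logit, label):
    # Sentiment-level loss: logistic loss of predicting the text's
    # polarity (label in {0, 1}) from an embedding-derived logit.
    p = 1.0 / (1.0 + np.exp(-logit))
    return -np.log(p) if label == 1 else -np.log(1.0 - p)

def combined_loss(score_true, score_corrupt, logit, label, alpha=0.5):
    # Weighted sum of context and sentiment evidence; alpha is a
    # hypothetical trade-off weight, not a value from the paper.
    return (alpha * hinge_context_loss(score_true, score_corrupt)
            + (1.0 - alpha) * sentiment_loss(logit, label))
```

For example, a well-separated context pair (`score_true=2.0`, `score_corrupt=0.0`) contributes zero hinge loss, so the combined value reduces to the weighted sentiment term alone; this coupling is what pulls words of opposite polarity apart even when their contexts are similar.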

References

[1] D. Tang, F. Wei, B. Qin, T. Liu, and M. Zhou, "Coooolll: A deep learning system for Twitter sentiment classification," in Proc. 8th Int. Workshop Semantic Eval., 2014, pp. 208–212.
[2] C. D. Manning and H. Schütze, Foundations of Statistical Natural Language Processing. Cambridge, MA, USA: MIT Press, 1999.
[3] D. Jurafsky and J. H. Martin, Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Englewood Cliffs, NJ, USA: Prentice-Hall, 2000.
[4] Y. Bengio, R. Ducharme, P. Vincent, and C. Janvin, "A neural probabilistic language model," J. Mach. Learn. Res., vol. 3, pp. 1137–1155, 2003.
[5] T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean, "Distributed representations of words and phrases and their compositionality," in Proc. Conf. Neural Inf. Process. Syst., 2013, pp. 3111–3119.
[6] J. Pennington, R. Socher, and C. Manning, "GloVe: Global vectors for word representation," in Proc. Conf. Empirical Methods Natural Lang. Process., 2014, pp. 1532–1543.

This work is licensed under a Creative Commons Attribution 3.0 Unported License.   
