A Fine Tuned Universal Language Model Fine-Tuning (ULMFiT) Approach for Airline Twitter Sentiment Analysis

Authors

  • Kevin Njagi, Central South University, Lushan Road, Changsha, Hunan 410083, China
  • Zhang Zuping, Central South University, Lushan Road, Changsha, Hunan 410083, China
  • Gilbert Marisa, Central South University, Lushan Road, Changsha, Hunan 410083, China

Keywords:

Machine learning, Sentiment analysis, ULMFiT

Abstract

Research has shown that real-time Twitter data can be used to predict the market movement of securities and other financial instruments. Universal Language Model Fine-Tuning (ULMFiT) is a recent approach based on training a language model and transferring its knowledge to a final classifier. We propose to fine-tune the ULMFiT model by optimizing its parameters and training it deterministically to increase reproducibility. In this paper, we perform multi-class classification with the fine-tuned ULMFiT model and with Naive Bayes, SVM, Logistic Regression, Random Forest, Decision Tree, and K-Nearest Neighbors classifiers on the Twitter US Airline Sentiment dataset from Kaggle. First, a model is built for six major U.S. airlines that performs sentiment analysis on customer reviews, giving the airlines fast and concise feedback. Second, recommendations are made on the most important aspects of service the airlines could improve, based on customer complaints. Significant accuracy is achieved, which shows that our models are reliable for future prediction. The accuracy of the different models is also compared, and the results show that Random Forest performs best.
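For readers who want to reproduce the classical baselines in this comparison, the sketch below shows one possible scikit-learn pipeline run under a fixed random seed. It is a minimal sketch, not the authors' exact configuration; it assumes the Kaggle file Tweets.csv with columns text and airline_sentiment (names taken from the public dataset), and the ULMFiT language-model fine-tuning itself (Howard and Ruder, 2018) is omitted.

# Minimal baseline sketch (assumed file name and column names, not the authors' exact setup):
# TF-IDF features + the classical classifiers compared in the paper, with fixed seeds.
import random

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier

SEED = 42  # fixed seed for a deterministic, reproducible run
random.seed(SEED)
np.random.seed(SEED)

# Assumed: the Kaggle "Twitter US Airline Sentiment" CSV with a tweet text column
# and a negative/neutral/positive label column.
df = pd.read_csv("Tweets.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["airline_sentiment"],
    test_size=0.2, stratify=df["airline_sentiment"], random_state=SEED)

baselines = {
    "Naive Bayes": MultinomialNB(),
    "SVM": LinearSVC(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=SEED),
    "Decision Tree": DecisionTreeClassifier(random_state=SEED),
    "K-Nearest Neighbors": KNeighborsClassifier(),
}

for name, clf in baselines.items():
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)  # unigram + bigram features
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.3f}")

Fixing random_state on the train/test split and on the tree-based models is the simplest step toward the deterministic, reproducible training the abstract refers to; fully deterministic neural-network training additionally requires seeding the deep learning framework itself.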

References

E. Adeborna and K. Siau, “An Approach to Sentiment Analysis - The Case of Airline Quality Rating,” in PACIS 2014 Proceedings, paper 363, 2014.

H. T. Vo, H. C. Lam, D. D. Nguyen, and N. H. Tuong, “Topic Classification and Sentiment Analysis for Vietnamese Education Survey System,” Asian Journal of Computer Science and Information Technology, vol. 6, no. 3, May 2016.

S. Sarkar and T. Seal, “Sentiment Analysis - An Objective View,” Journal of Research, vol. 2, no. 2, April 2016.

A. Devitt and K. Ahmad, “Sentiment Analysis and the Use of Extrinsic Datasets in Evaluation,” in Proceedings of the International Conference on Language Resources and Evaluation (LREC), 2008.

N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, “Dropout: A Simple Way to Prevent Neural Networks from Overfitting,” Journal of Machine Learning Research, vol. 15, pp. 1929–1958, 2014.

I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016.

Zagoruyko, S., & Komodakis, N. (2016). Wide Residual Networks. Proceedings of the British Machine Vision Conference 2016. doi:10.5244/c.30.87

K. He, X. Zhang, S. Ren, and J. Sun, “Deep Residual Learning for Image Recognition,” in 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. doi:10.1109/cvpr.2016.90

Scikit-learn: http://scikit-learn.org/stable

NIST/SEMATECH e-Handbook of Statistical Methods: http://www.itl.nist.gov/div898/handbook/apr/section4/apr412.html

J. Howard and S. Ruder, “Universal language model fine-tuning for text classification,” in Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), vol. 1, 2018, pp. 328–339

M. C. Mozer, “A focused backpropagation algorithm for temporal pattern recognition,” in Backpropagation: Theory, Architectures, and Applications, 1995, p. 137.

M. E. Peters, W. Ammar, C. Bhagavatula, and R. Power, “Semi-supervised sequence tagging with bidirectional language models,” arXiv preprint arXiv:1705.00108, 2017.

S. Kamal, N. Dey, A. S. Ashour, S. Ripon, V. E. Balas, and M. Kaysar, “FbMapping: An automated system for monitoring Facebook data,” Neural Network World, 2017.

J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei, “ImageNet: A large-scale hierarchical image database,” in 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009, pp. 248–255.

J. Donahue, Y. Jia, O. Vinyals, J. Hoffman, N. Zhang, E. Tzeng, and T. Darrell, “DeCAF: A deep convolutional activation feature for generic visual recognition,” in International Conference on Machine Learning, 2014, pp. 647–655.

J. Long, E. Shelhamer, and T. Darrell, “Fully convolutional networks for semantic segmentation,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 3431–3440.

M. D. d. S. Wanderley, L. d. A. e Bueno, C. Zanchettin, and A. L. Oliveira, “The impact of dataset complexity on transfer learning over convolutional neural networks,” in International Conference on Artificial Neural Networks. Springer, 2017, pp. 582–589.

J. Pennington, R. Socher, and C. Manning, “GloVe: Global vectors for word representation,” in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014, pp. 1532–1543.

T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, and J. Dean, “Distributed representations of words and phrases and their compositionality,” in Advances in neural information processing systems, 2013, pp. 3111–3119.

P. Bojanowski, E. Grave, A. Joulin, and T. Mikolov, “Enriching word vectors with subword information,” Transactions of the Association for Computational Linguistics, vol. 5, pp. 135–146, 2017.

M. Rei, “Semi-supervised multitask learning for sequence labeling,” arXiv preprint arXiv:1704.07156, 2017.

L. Liu, J. Shang, X. Ren, F. F. Xu, H. Gui, J. Peng, and J. Han, “Empower sequence labeling with task-aware neural language model,” in Thirty-Second AAAI Conference on Artificial Intelligence, 2018.

K. Hashimoto, C. Xiong, Y. Tsuruoka, and R. Socher, “A joint many-task model: Growing a neural network for multiple NLP tasks,” arXiv preprint arXiv:1611.01587, 2016.

B. McCann, J. Bradbury, C. Xiong, and R. Socher, “Learned in translation: Contextualized word vectors,” in Advances in Neural Information Processing Systems, 2017, pp. 6294–6305.

S. Min, M. Seo, and H. Hajishirzi, “Question answering through transfer learning from large fine-grained supervision data,” arXiv preprint arXiv:1702.02171, 2017.

A. Severyn and A. Moschitti, “UNITN: Training deep convolutional neural network for Twitter sentiment classification,” in Proceedings of the 9th International Workshop on Semantic Evaluation (SemEval 2015), 2015, pp. 464–469.

Published

2020-05-22

How to Cite

Njagi, K., Zuping, Z., & Marisa, G. (2020). A Fine Tuned Universal Language Model Fine-Tuning (ULMFiT) Approach for Airline Twitter Sentiment Analysis. International Journal of Sciences: Basic and Applied Research (IJSBAR), 52(1), 51–66. Retrieved from https://www.gssrr.org/index.php/JournalOfBasicAndApplied/article/view/11034

Section

Articles