Text Classification Using Neural Network Language Model (NNLM) and BERT: An Empirical Comparison
Lecture Notes in Networks and Systems
New York, NY
Text classification is one of the most widely cited applications of Natural Language Processing. Automated classification reduces the cost of manual effort while improving the accuracy and consistency of the task. With the advances in language modeling over the last two decades, a number of word embedding models have been proposed. In this study, we discuss two of the most recent models, NNLM and BERT, and present a technical comparison on the task of text classification.
BERT; Language model; Natural Language Processing; NLP; Text classification; Transformers; Word embedding
Other Computer Sciences | Programming Languages and Compilers
Text Classification Using Neural Network Language Model (NNLM) and BERT: An Empirical Comparison. Lecture Notes in Networks and Systems, 296. New York, NY: Springer.