Sensors & Transducers

Vol. 249, Issue 2, February 2021, pp. 110-118

* Roghayeh Soleymani, Julien Beaulieu and Jérémie Farret

Inmind Technologies Inc., Montreal, Canada

Tel.: +1 (514) 871-0470

* E-mail:

Received: 30 November 2020 / Accepted: 15 January 2021 / Published: 28 February 2021

Abstract: In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence; these scores can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of the architectures applicable for such improved performance. The experiments in this paper are carried out in Trax on Mind in a Box with four different datasets and under different hyperparameter tunings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow the training of bigger models, which would otherwise cause memory failures with Transformers.

Keywords: Natural Language Processing, Text Classification, Transformers, Reformers, Trax, Mind in a Box.
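The abstract notes that attention scores capture relationships between words and can be computed in parallel. As a minimal illustration of standard scaled dot-product attention (this is a NumPy sketch of the generic mechanism, not code from the paper or from Trax; the function name and array shapes are our own):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention.

    Q, K, V have shape (seq_len, d_k). Scores between every pair of
    positions are produced by a single matrix product, which is why
    attention parallelizes well on GPU hardware.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V, weights

# Usage: self-attention over a toy 4-token sequence with d_k = 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
output, weights = scaled_dot_product_attention(X, X, X)
```

Each row of `weights` is a probability distribution over the sequence positions, so the output for each token is a weighted mixture of all token representations.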