Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, or energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim to provide both guidance for conducting NLP under limited resources, and point towards promising research directions for developing more efficient methods.