Gang Liu, Michael Sun, et al.
ICLR 2025
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, and energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.
Lars Graf, Thomas Bohnstingl, et al.
NeurIPS 2025
Wooseok Choi, Tommaso Stecconi, et al.
Advanced Science
Rama Akkiraju, Pinar Keskinocak, et al.
Applied Intelligence