Dzung Phan, Vinicius Lima
INFORMS 2023
Transformer-based language models have become the de facto standard in natural language processing. However, they underperform on tabular data compared to traditional tree-based methods. We posit that current approaches fail to realize the full potential of language models on tabular data due to (i) the heterogeneity of tabular data and (ii) the difficulty models have in interpreting numerical values. Based on this hypothesis, we propose the Tabular Domain Transformer (TDTransformer) framework. TDTransformer uses distinct embedding processes for different column types, and alignment layers then transform these column-type-specific embeddings into a common space. In addition, TDTransformer adapts piecewise linear encoding for numerical values to achieve better performance. We test the proposed method on 76 real-world tabular classification datasets from the OpenML benchmark. Extensive experiments indicate that TDTransformer improves upon state-of-the-art methods.
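The abstract mentions piecewise linear encoding of numerical values. Below is a minimal sketch of one common formulation of that idea, in which a scalar is spread over quantile bins: bins entirely below the value map to 1, bins entirely above map to 0, and the bin containing the value gets its fractional position. The function name, bin count, and use of quantile-based edges are illustrative assumptions, not necessarily the exact variant used by TDTransformer.

```python
import numpy as np

def piecewise_linear_encode(x, bin_edges):
    """Piecewise linear encoding of scalar values (illustrative sketch).

    bin_edges: sorted array of T+1 boundaries (e.g. empirical quantiles
    of the training column). Returns an array of shape (len(x), T):
    1 for bins fully below x, 0 for bins fully above x, and the
    fractional position of x inside the bin that contains it.
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    lo, hi = bin_edges[:-1], bin_edges[1:]           # left/right edge of each bin
    frac = (x - lo) / np.maximum(hi - lo, 1e-12)     # position of x within each bin
    return np.clip(frac, 0.0, 1.0)                   # saturate below/above the bin

# Usage: estimate bin edges from a training column, then encode new values.
train_col = np.random.default_rng(0).normal(size=1000)   # stand-in for a numerical column
edges = np.quantile(train_col, np.linspace(0.0, 1.0, 9))  # 8 quantile bins
print(piecewise_linear_encode([0.1, -2.0, 3.0], edges).round(2))
```

The resulting per-bin vectors could then be projected by a learned linear layer into the same embedding space as the categorical-column embeddings, which is consistent with the alignment-layer idea described in the abstract.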
Malte Rasch, Tayfun Gokmen, et al.
arXiv
Sijia Liu, Pin-Yu Chen, et al.
IEEE SPM
Lars Graf, Thomas Bohnstingl, et al.
NeurIPS 2025