no code implementations • 29 Sep 2023 • Maximilian Schambach, Dominique Paul, Johannes S. Otterbach
To analyze the scaling potential of deep tabular representation learning models, we introduce a novel Transformer-based architecture tailored to tabular data and cross-table representation learning. It combines table-specific tokenizers with a shared Transformer backbone.
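The division of labor described above can be sketched as follows. This is a minimal, hypothetical illustration (the paper's actual implementation is not shown here): each table gets its own tokenizer that maps its heterogeneous columns into a common token space, while a single backbone, standing in for the shared Transformer, processes tokens from all tables. All class and parameter names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class TableTokenizer:
    """Hypothetical per-table tokenizer: projects each (numeric) column
    value into a shared d_model-dimensional token space."""
    def __init__(self, n_columns, d_model):
        # One learned projection and bias per column of this table.
        self.weights = rng.normal(size=(n_columns, d_model))
        self.bias = rng.normal(size=(n_columns, d_model))

    def __call__(self, rows):
        # rows: (batch, n_columns) -> tokens: (batch, n_columns, d_model)
        return rows[..., None] * self.weights + self.bias

class SharedBackbone:
    """Stand-in for the shared Transformer backbone: one weight matrix
    applied to every token, regardless of which table produced it."""
    def __init__(self, d_model):
        self.proj = rng.normal(size=(d_model, d_model))

    def __call__(self, tokens):
        return tokens @ self.proj

d_model = 8
backbone = SharedBackbone(d_model)

# Two tables with different schemas are tokenized separately
# but share the same backbone parameters.
tok_a = TableTokenizer(n_columns=3, d_model=d_model)
tok_b = TableTokenizer(n_columns=5, d_model=d_model)

out_a = backbone(tok_a(rng.normal(size=(4, 3))))
out_b = backbone(tok_b(rng.normal(size=(4, 5))))

print(out_a.shape)  # (4, 3, 8)
print(out_b.shape)  # (4, 5, 8)
```

Because only the tokenizers are table-specific, the backbone's parameters can be trained across many tables at once, which is what enables the cross-table scaling analysis.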