17 May 2024 | Xinhao Zhang, Zaitian Wang, Lu Jiang, Wanfu Gao, Pengfei Wang, Kunpeng Liu
The paper introduces TFWT (Tabular Feature Weighting with Transformer), a novel feature weighting method for tabular data. Traditional methods often assume equal importance across all samples and features, which can overlook unique contributions and lead to suboptimal performance in complex datasets. TFWT leverages the Transformer model's attention mechanism to capture complex feature dependencies and contextually assign appropriate weights to discrete and continuous features. The method employs reinforcement learning to fine-tune the weighting process, reducing information redundancy and improving model stability. Extensive experiments on various real-world datasets and downstream tasks demonstrate the effectiveness of TFWT, showing significant performance improvements compared to raw classifiers and baseline models. The fine-tuning strategy further enhances the method's performance by reducing variance in downstream tasks.
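The core idea, attention-derived per-sample feature weights, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the projections `Wq` and `Wk` stand in for learned parameters (random here), each feature is treated as a one-dimensional token, and the attention a feature receives is averaged into its weight.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_feature_weights(X, d_k=8):
    """Toy sketch of attention-based feature weighting for a tabular
    matrix X of shape (n_samples, n_features)."""
    n, f = X.shape
    # Hypothetical learned projections (randomly initialized here for illustration).
    Wq = rng.normal(size=(1, d_k))
    Wk = rng.normal(size=(1, d_k))
    tokens = X[:, :, None]                # (n, f, 1): one token per feature
    Q = tokens @ Wq                       # (n, f, d_k)
    K = tokens @ Wk                       # (n, f, d_k)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)  # (n, f, f)
    attn = softmax(scores, axis=-1)       # rows sum to 1
    # Average attention each feature receives across all queries,
    # rescaled so the weights of each sample sum to n_features.
    weights = attn.mean(axis=1) * f       # (n, f)
    return X * weights, weights

X = rng.normal(size=(4, 5))
X_weighted, w = attention_feature_weights(X)
```

Because the weights are computed per sample, two rows with identical feature values in one column can still receive different weights for it, which is exactly the context sensitivity the paper argues uniform weighting schemes miss. The reinforcement-learning fine-tuning described above would then adjust these weights using downstream-task feedback as the reward signal.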