MambaTab is a plug-and-play model for learning on tabular data, built on a structured state-space model (SSM), specifically Mamba, an emerging SSM variant. Unlike CNN- and Transformer-based approaches, MambaTab delivers efficient, scalable, and generalizable performance with significantly fewer parameters. Its architecture is simple: it requires minimal data preprocessing, supports effective end-to-end training with little data wrangling, and naturally accommodates feature incremental learning, where new features become available over time. The paper evaluates MambaTab extensively against leading tabular-data models across three settings, vanilla supervised learning, self-supervised learning, and feature incremental learning, covering both classification and regression on diverse benchmark datasets. In these evaluations, MambaTab achieves superior performance while using only a small fraction of the parameters of state-of-the-art baselines. Its small model size, linear scalability, and flexibility across learning tasks make it a lightweight, adaptable solution for tabular data, suitable for systems with varying computational resources and promising for practical applications in many domains.
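To make the described architecture concrete, below is a minimal PyTorch sketch of a MambaTab-style model. It assumes the `mamba_ssm` package's `Mamba` block (whose official kernels require a CUDA device); the embedding width, state size, single-block depth, and binary-classification head are illustrative choices under these assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a MambaTab-style model for binary classification.
# Assumes the `mamba_ssm` package (pip install mamba-ssm); hyperparameters
# below are illustrative, not the paper's exact values.
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class MambaTabSketch(nn.Module):
    def __init__(self, num_features: int, embed_dim: int = 32):
        super().__init__()
        # Project raw tabular features into a shared embedding space.
        # Because only this projection depends on the feature count,
        # it can be extended when new features arrive incrementally.
        self.embed = nn.Linear(num_features, embed_dim)
        self.norm = nn.LayerNorm(embed_dim)
        self.act = nn.ReLU()
        # A single Mamba (selective SSM) block over the embedded representation.
        self.mamba = Mamba(d_model=embed_dim, d_state=16, d_conv=4, expand=2)
        self.head = nn.Linear(embed_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_features)
        h = self.act(self.norm(self.embed(x)))
        # Mamba expects (batch, length, dim); treat the embedding as a
        # length-1 sequence, then drop the sequence dimension again.
        h = self.mamba(h.unsqueeze(1)).squeeze(1)
        return self.head(h)  # logits for binary classification


# Usage (on a CUDA machine, since mamba_ssm's kernels are GPU-only):
#   model = MambaTabSketch(num_features=20).cuda()
#   logits = model(torch.randn(8, 20, device="cuda"))
```

Because the feature count enters the model only through the input projection, new columns can be accommodated by widening `self.embed` rather than retraining from scratch, which mirrors the alignment with feature incremental learning that the paper highlights.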