OmniPred: Language Models as Universal Regressors


30 Jan 2025 | Xingyou Song*1, Oscar Li*†2, Chansoo Lee1, Bangding (Jeffrey) Yang3, Daiyi Peng1, Sagi Perel1 and Yutian Chen1
OmniPred is a framework that trains language models as universal end-to-end regressors for $(x, y)$ data from various formats. The authors propose OmniPred to address the limitations of traditional regression methods, which are often task-specific and require fixed-length tensor representations. By leveraging multi-task learning and textual representations, OmniPred can perform precise numerical regression using only textual representations of mathematical parameters and values. The framework is evaluated using data from Google Vizier, a large proprietary blackbox optimization database, demonstrating its ability to outperform traditional regression models in many cases. The paper also discusses the benefits of multi-task training, the effectiveness of online fine-tuning, and the model's ability to handle unseen tasks. The authors highlight the potential of language models in experimental design and provide insights into their limitations and future directions.
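The core idea of text-based regression can be illustrated with a minimal sketch: the input $x$ (a parameter configuration) is flattened into a plain-text prompt, and the target $y$ is encoded as a sequence of sign, digit, and exponent tokens so the model can emit it autoregressively. The serialization format and tokenizer names below are hypothetical, not the paper's exact scheme:

```python
# Illustrative sketch of text-to-text regression in the spirit of OmniPred.
# The exact serialization and numeric tokenization in the paper may differ.

def serialize_x(params: dict) -> str:
    """Flatten a parameter configuration into a plain-text prompt (hypothetical format)."""
    return ",".join(f"{k}:{v}" for k, v in sorted(params.items()))

def tokenize_y(y: float, mantissa_digits: int = 4) -> list:
    """Encode a float as sign / digit / exponent tokens, a common scheme
    for letting a language model decode numbers one token at a time."""
    sign = "<+>" if y >= 0 else "<->"
    m, exp = abs(y), 0
    if m != 0:
        while m >= 10:
            m /= 10
            exp += 1
        while m < 1:
            m *= 10
            exp -= 1
    # Keep a fixed-length mantissa: one leading digit plus the decimals.
    digits = f"{m:.{mantissa_digits - 1}f}".replace(".", "")[:mantissa_digits]
    return [sign] + [f"<{d}>" for d in digits] + [f"<E{exp}>"]

prompt = serialize_x({"learning_rate": 1e-3, "batch_size": 128})
target = tokenize_y(4.21)  # -> ['<+>', '<4>', '<2>', '<1>', '<0>', '<E0>']
```

Because both $x$ and $y$ are plain token sequences, the same model can be trained across heterogeneous tasks with varying parameter names and counts, which is what frees the approach from fixed-length tensor representations.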