eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data

2024 | Bo Peng*, Xinyi Ling*, Ziru Chen, Huan Sun, Xia Ning
This paper addresses the challenges of generalizing large language models (LLMs) for e-commerce, particularly in handling new users and products. It introduces ECInstruct, a large-scale, high-quality benchmark instruction dataset for e-commerce, and eCeLLM, a series of e-commerce LLMs developed by instruction-tuning general-purpose LLMs. ECInstruct covers 10 diverse e-commerce tasks across 4 categories, ensuring broad coverage, realistic tasks, and high data quality. eCeLLM models, trained on ECInstruct, outperform baseline models, including advanced LLMs like GPT-4, and state-of-the-art task-specific models in in-domain evaluation. They also demonstrate excellent generalizability to out-of-domain settings, showing significant improvements on unseen products and instructions. The study highlights the potential of LLMs in e-commerce and provides a comprehensive, systematic approach to instruction-tuning for e-commerce applications. Both ECInstruct and eCeLLM models are publicly available for further research and development.
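
To illustrate the instruction-tuning setup described above, here is a minimal sketch of how an ECInstruct-style example might be formatted as an instruction prompt and used to query a causal LLM with Hugging Face transformers. The base model name, the dataset field names, and the prompt template are assumptions for illustration, not the authors' released code.

```python
# Minimal sketch (assumptions noted above): format one e-commerce
# instruction example and run greedy decoding with a causal LM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # hypothetical base model; eCeLLM tunes open general-purpose LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

# Hypothetical ECInstruct-style record: a task instruction, its input, and the gold output.
example = {
    "instruction": "Classify the product into one of the given categories.",
    "input": "Title: Stainless steel insulated water bottle, 32 oz\n"
             "Categories: Electronics, Sports & Outdoors, Grocery",
    "output": "Sports & Outdoors",
}

# Assemble the instruction prompt (template is an assumption).
prompt = (
    f"### Instruction:\n{example['instruction']}\n\n"
    f"### Input:\n{example['input']}\n\n"
    f"### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
generated = model.generate(**inputs, max_new_tokens=32, do_sample=False)

# Print only the newly generated tokens (the model's predicted response).
print(tokenizer.decode(generated[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

During instruction tuning, pairs like the prompt and `output` above would serve as training examples; at evaluation time, the generated response is compared against the gold output.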