UniGraph is a foundation model for Text-Attributed Graphs (TAGs) designed for cross-domain generalization. It uses textual features as a unifying medium for node representations, extending even to graphs without inherent text, such as molecular graphs, and adopts a cascaded backbone that combines Language Models (LMs) with Graph Neural Networks (GNNs). For large-scale self-supervised learning on TAGs, the authors propose a novel pre-training algorithm based on Masked Graph Modeling (MGM); graph instruction tuning with Large Language Models (LLMs) then enables zero-shot prediction.

Comprehensive experiments across graph learning tasks and domains demonstrate the model's effectiveness in self-supervised representation learning, few-shot in-context transfer, and zero-shot transfer. UniGraph is evaluated on 11 graph datasets from 5 domains, the largest containing 111 million nodes, where it outperforms other cross-domain methods and often matches or surpasses GNNs trained directly on the target datasets. This cross-domain generalization is attributed to the use of text as a unifying medium, which lets a single model handle diverse graph structures and tasks.
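To make the cascaded LM + GNN idea and the MGM objective concrete, below is a minimal, self-contained PyTorch sketch. It is illustrative only, not the paper's implementation: the tiny Transformer stands in for a full pre-trained LM, the single linear message-passing step stands in for a multi-layer GNN, and the names `CascadedLMGNN` and `mgm_loss` are hypothetical. The sketch assumes MGM masks tokens in each node's text and reconstructs them from representations refined by graph propagation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CascadedLMGNN(nn.Module):
    """Illustrative cascaded LM -> GNN backbone for TAGs.
    All module sizes and names are assumptions for this sketch."""

    def __init__(self, vocab_size=1000, dim=64, seq_len=16, mask_id=0):
        super().__init__()
        self.mask_id = mask_id
        self.tok_emb = nn.Embedding(vocab_size, dim)
        self.pos_emb = nn.Parameter(torch.zeros(seq_len, dim))
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.lm = nn.TransformerEncoder(layer, num_layers=2)  # stand-in LM
        self.gnn = nn.Linear(dim, dim)            # one message-passing step
        self.head = nn.Linear(2 * dim, vocab_size)  # masked-token decoder

    def forward(self, tokens, adj):
        # tokens: [N, L] token ids per node; adj: [N, N] row-normalized adjacency
        h_tok = self.lm(self.tok_emb(tokens) + self.pos_emb)  # [N, L, d]
        h_node = h_tok.mean(dim=1)                 # pool tokens to node embeddings
        h_node = F.relu(self.gnn(adj @ h_node))    # propagate over the graph
        # condition token-level prediction on the GNN-refined node state
        h = torch.cat([h_tok, h_node.unsqueeze(1).expand_as(h_tok)], dim=-1)
        return self.head(h)                        # [N, L, vocab]

def mgm_loss(model, tokens, adj, mask_ratio=0.15):
    """Mask random tokens of each node's text and reconstruct them."""
    mask = torch.rand(tokens.shape) < mask_ratio
    corrupted = tokens.masked_fill(mask, model.mask_id)
    logits = model(corrupted, adj)
    return F.cross_entropy(logits[mask], tokens[mask])

# Toy usage: 5 nodes on a directed ring graph.
N, L = 5, 16
tokens = torch.randint(1, 1000, (N, L))
adj = torch.roll(torch.eye(N), 1, dims=1)  # each node receives from its successor
model = CascadedLMGNN()
loss = mgm_loss(model, tokens, adj)
loss.backward()
print(f"MGM loss: {loss.item():.3f}")
```

The sketch only shows how the pieces compose: token-level masking, LM encoding, pooling, graph propagation, and reconstruction. In the actual framework the LM and GNN would be far larger, and the zero-shot path would additionally map the pre-trained embeddings into an LLM via graph instruction tuning.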