GraphTranslator: Aligning Graph Model to Large Language Model for Open-ended Tasks
May 13-17, 2024 | Mengmei Zhang, Mingwei Sun, Peng Wang, Shen Fan, Yanhu Mo, Xiaoxiao Xu, Hong Liu, Cheng Yang, Chuan Shi
GraphTranslator is a framework that aligns graph models (GMs) with large language models (LLMs) to handle both pre-defined and open-ended tasks from a unified perspective: the GM handles pre-defined tasks, while the LLM serves as an interface for open-ended tasks. To bridge the modality gap between the two, GraphTranslator introduces a Translator module that projects GM node embeddings into the LLM's token-embedding space, enabling the LLM to process and interpret graph information. A Producer module generates the alignment data needed to train the Translator by textualizing the information encoded in node embeddings. Evaluated on real-world datasets, GraphTranslator outperforms existing methods on zero-shot node classification and demonstrates its potential on a wide range of open-ended tasks such as graph question answering, leveraging the complementary strengths of GMs and LLMs. The framework is implemented with PyTorch and PyTorch Geometric, and the code is available at https://github.com/alibaba/GraphTranslator.
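To make the Translator idea concrete, the following is a minimal PyTorch sketch of a module that maps a frozen GM node embedding into a small set of token embeddings in the LLM's space. The query-token and cross-attention design, the dimensions, and all names here are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class TranslatorSketch(nn.Module):
    """Hypothetical Translator-style module: learnable query tokens attend to
    a graph-model node embedding, and the result is projected into the LLM's
    token-embedding dimension. All hyperparameters are illustrative."""

    def __init__(self, gm_dim=128, llm_dim=4096, num_queries=8, num_heads=4):
        super().__init__()
        # Learnable queries that will "read out" graph information.
        self.queries = nn.Parameter(torch.randn(num_queries, gm_dim))
        self.cross_attn = nn.MultiheadAttention(gm_dim, num_heads, batch_first=True)
        # Final projection into the LLM's embedding space.
        self.proj = nn.Linear(gm_dim, llm_dim)

    def forward(self, node_emb):
        # node_emb: (batch, gm_dim), a frozen embedding from the graph model.
        kv = node_emb.unsqueeze(1)  # treat the node embedding as a length-1 sequence
        q = self.queries.unsqueeze(0).expand(node_emb.size(0), -1, -1)
        attended, _ = self.cross_attn(q, kv, kv)  # (batch, num_queries, gm_dim)
        return self.proj(attended)                # (batch, num_queries, llm_dim)

translator = TranslatorSketch()
tokens = translator(torch.randn(2, 128))
print(tokens.shape)  # torch.Size([2, 8, 4096])
```

The resulting token embeddings could then be prepended to the LLM's input sequence as a soft prompt, so the LLM conditions its generation on the graph information without any change to its own weights.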