Can we Soft Prompt LLMs for Graph Learning Tasks?


May 13–17, 2024 | Zheyuan Liu, Xiaoxin He, Yijun Tian, Nitesh V. Chawla
Graphs are crucial for representing complex relationships in real-world applications such as social networks, biological data, and citation networks. Large Language Models (LLMs) have shown great success across many domains, making their application to graphs appealing. However, applying LLMs directly to graph data is challenging because of the mismatch between the graph and text modalities. To address this, we introduce GraphPrompter, a novel framework that aligns graph information with LLMs via soft prompts.

GraphPrompter pairs a graph neural network (GNN), which encodes graph structure, with an LLM, which processes textual information. The GNN generates node embeddings that are concatenated with the embedded prompt instructions and fed to the LLM as soft prompts; the LLM then processes the fused graph and text information and generates responses for graph learning tasks. The framework is particularly suited to textual graphs, which require understanding both textual content and graph structure.

Comprehensive experiments on five benchmark datasets show that GraphPrompter performs well on node classification and link prediction, outperforming competing methods and demonstrating the potential of LLMs on complex data structures beyond traditional text. These results indicate that LLMs can be used effectively for graph-related tasks, enabling more efficient utilization of LLMs in real-world graph scenarios.
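To make the soft-prompting mechanism concrete, below is a minimal sketch of how such a pipeline might be wired up, assuming PyTorch Geometric for the GNN and a Hugging Face causal LLM. This is an illustration under those assumptions, not the authors' released implementation; class and parameter names (GNNEncoder, GraphSoftPrompter, proj, node_idx) are hypothetical.

```python
# Sketch of a GraphPrompter-style soft-prompting pipeline (illustrative only).
# Assumes: torch, torch_geometric, and a Hugging Face AutoModelForCausalLM.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv


class GNNEncoder(nn.Module):
    """Encodes the textual graph into node embeddings."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.conv1 = GATConv(in_dim, hidden_dim)
        self.conv2 = GATConv(hidden_dim, hidden_dim)

    def forward(self, x, edge_index):
        h = self.conv1(x, edge_index).relu()
        return self.conv2(h, edge_index)


class GraphSoftPrompter(nn.Module):
    """Projects GNN node embeddings into the LLM's token-embedding space
    and prepends them to the embedded instruction as soft prompts."""
    def __init__(self, gnn, llm, gnn_dim, llm_dim):
        super().__init__()
        self.gnn = gnn
        self.llm = llm                      # frozen decoder-only LLM
        self.proj = nn.Linear(gnn_dim, llm_dim)
        for p in self.llm.parameters():     # train only the GNN and projection
            p.requires_grad = False

    def forward(self, x, edge_index, node_idx,
                input_ids, attention_mask, labels=None):
        # Node embedding(s) for the target node(s) of this batch.
        node_emb = self.gnn(x, edge_index)[node_idx]           # (B, gnn_dim)
        soft_prompt = self.proj(node_emb).unsqueeze(1)         # (B, 1, llm_dim)
        # Embed the textual prompt instruction and prepend the soft prompt.
        tok_emb = self.llm.get_input_embeddings()(input_ids)   # (B, T, llm_dim)
        inputs_embeds = torch.cat([soft_prompt, tok_emb], dim=1)
        # Extend the attention mask (and labels) for the extra prompt position.
        prompt_mask = torch.ones(soft_prompt.shape[:2],
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        attention_mask = torch.cat([prompt_mask, attention_mask], dim=1)
        if labels is not None:
            ignore = torch.full(soft_prompt.shape[:2], -100,   # -100 = ignored in loss
                                dtype=labels.dtype, device=labels.device)
            labels = torch.cat([ignore, labels], dim=1)
        return self.llm(inputs_embeds=inputs_embeds,
                        attention_mask=attention_mask, labels=labels)
```

Freezing the LLM and training only the GNN encoder and the projection layer keeps the approach parameter-efficient, which is the usual motivation for soft prompting.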
Our contributions include the first investigation into whether LLMs can understand graph learning tasks via soft prompting, the proposal of GraphPrompter, and extensive experiments showing its effectiveness across various graph benchmarks.