V2Xum-LLM: Cross-Modal Video Summarization with Temporal Prompt Instruction Tuning

20 Aug 2024 | Hang Hua*, Yunlong Tang*, Chenliang Xu, Jiebo Luo†
The paper introduces Instruct-V2Xum, a large-scale cross-modal video summarization dataset, and V2Xum-LLaMA, a novel framework for video summarization. Instruct-V2Xum contains 30,000 diverse YouTube videos, each paired with a textual summary that references specific frame indexes, facilitating the generation of aligned video and textual summaries. V2Xum-LLaMA unifies different video summarization tasks into one large language model's text decoder, enabling task-controllable video summarization with temporal prompts and task instructions. The framework outperforms strong baseline models on multiple video summarization tasks, including video-to-video (V2V), video-to-text (V2T), and video-and-text (V2VT) summarization. The paper also proposes enhanced evaluation metrics, $F_{CLIP}$ and $Cross\text{-}F_{CLIP}$, for the V2V and V2VT summarization tasks, respectively, to address the limitations of existing metrics.
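To make the metric idea concrete, here is a minimal sketch of a CLIP-embedding-based F-score between a predicted and a reference video summary. This is an illustrative assumption of how such a metric can be computed (best-match precision/recall over frame embeddings), not the paper's exact $F_{CLIP}$ definition; the function name and matching scheme are hypothetical.

```python
import numpy as np

def f_clip(pred_emb: np.ndarray, ref_emb: np.ndarray) -> float:
    """Sketch of a CLIP-similarity F-score between two frame sets.

    pred_emb: (n, d) L2-normalized embeddings of predicted summary frames.
    ref_emb:  (m, d) L2-normalized embeddings of reference summary frames.
    Assumes embeddings come from a CLIP image encoder; the paper's exact
    formulation may differ.
    """
    sim = pred_emb @ ref_emb.T          # (n, m) cosine similarity matrix
    precision = sim.max(axis=1).mean()  # each predicted frame vs. its best reference match
    recall = sim.max(axis=0).mean()     # each reference frame vs. its best predicted match
    return float(2 * precision * recall / (precision + recall))
```

With identical predicted and reference frame sets, the score is 1.0; less overlap in visual content lowers both the precision and recall terms.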