LogFormer is a pre-training and tuning pipeline for log anomaly detection, designed to improve generalization across different log domains. The framework consists of two stages: pre-training and adapter-based tuning. During pre-training, the model learns shared semantic knowledge from the source domain; this knowledge is then transferred to the target domain through shared parameters. A Log-Attention module is introduced to capture information that log parsing would otherwise discard, strengthening the model's ability to handle multi-domain logs. The method is evaluated on three public datasets and one real-world dataset, demonstrating its effectiveness with fewer trainable parameters and lower training costs.
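As a rough illustration of the two-stage pipeline, the PyTorch sketch below pre-trains an encoder and classifier on the source domain, then freezes the shared encoder and updates only a small bottleneck adapter and the classifier on the target domain. The class and loader names (`LogAnomalyModel`, `source_loader`, `target_loader`), the adapter placement, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))


class LogAnomalyModel(nn.Module):
    """Illustrative encoder + classifier; LogFormer itself uses a Transformer
    encoder with a Log-Attention module rather than this plain stack."""
    def __init__(self, input_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.proj = nn.Linear(input_dim, hidden_dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.adapter = Adapter(hidden_dim)          # tuned in stage 2 only
        self.classifier = nn.Linear(hidden_dim, 2)  # normal vs. anomalous

    def forward(self, x):
        h = self.encoder(self.proj(x))
        h = self.adapter(h)
        return self.classifier(h.mean(dim=1))       # pool over the log sequence


def pretrain_on_source(model, source_loader, epochs=3):
    """Stage 1: learn shared semantic knowledge from the source domain."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in source_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()


def adapter_tune_on_target(model, target_loader, epochs=3):
    """Stage 2: freeze the shared encoder, update only adapter + classifier."""
    for p in model.parameters():
        p.requires_grad = False
    for module in (model.adapter, model.classifier):
        for p in module.parameters():
            p.requires_grad = True
    opt = torch.optim.Adam(
        [p for p in model.parameters() if p.requires_grad], lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in target_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
```

Freezing the shared backbone in the second stage is what keeps the number of trainable parameters, and therefore the per-domain training cost, small when moving to a new log domain.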
The LogFormer framework addresses the challenges of log anomaly detection in multi-domain scenarios, where traditional methods struggle to generalize. It introduces a Log-Attention module to retain semantic information lost during log parsing and an adapter-based tuning stage to transfer knowledge from the source domain to the target domain efficiently. The model couples a pre-trained language model with a Log-Attention encoder and is adapted to the target domain by updating only a small set of adapter parameters.
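The exact Log-Attention formulation is defined in the paper; the sketch below only illustrates the underlying idea, under the assumption that embeddings of the parameter tokens stripped out by the log parser are re-injected as an additive bias on the self-attention scores. All names and tensor shapes here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LogAttention(nn.Module):
    """Scaled dot-product attention with an additive bias computed from the
    parameter tokens that log parsing discards (illustrative approximation,
    not the paper's exact module)."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.q = nn.Linear(hidden_dim, hidden_dim)
        self.k = nn.Linear(hidden_dim, hidden_dim)
        self.v = nn.Linear(hidden_dim, hidden_dim)
        self.param_bias = nn.Linear(hidden_dim, 1)

    def forward(self, template_emb, param_emb):
        # template_emb: (batch, seq_len, hidden)  parsed log-template embeddings
        # param_emb:    (batch, seq_len, hidden)  embeddings of the stripped parameters
        q, k, v = self.q(template_emb), self.k(template_emb), self.v(template_emb)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)   # (batch, seq, seq)
        # Bias per key position, so log events whose runtime parameters carry
        # signal still influence the attention pattern after parsing.
        bias = self.param_bias(param_emb)                         # (batch, seq, 1)
        scores = scores + bias.transpose(-2, -1)                  # broadcast over queries
        weights = F.softmax(scores, dim=-1)
        return weights @ v
```

In this sketch the bias raises the attention weight of positions whose parameters carry extra information, so that signal is not lost once parsing replaces the parameters with placeholders.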
Experiments on multiple benchmark datasets show that LogFormer outperforms existing methods in terms of F1 score, precision, and recall. It achieves state-of-the-art performance on three public datasets and demonstrates robustness in low-resource settings. The model is also evaluated on a real-world dataset, showing its effectiveness in practical applications. LogFormer's design allows for efficient adaptation to new domains with minimal training cost, making it a promising solution for log anomaly detection in complex, multi-domain environments.