LogFormer is a pre-training and tuning pipeline for log anomaly detection, designed to improve generalization across different domains. The paper introduces a two-stage approach: pre-training on the source domain to capture shared semantic knowledge, followed by adapter-based tuning to transfer this knowledge to the target domain. A key component of LogFormer is the Log-Attention module, which supplements information discarded by log parsing to enhance performance. The model is evaluated on three public datasets and one real-world dataset, demonstrating its effectiveness with fewer trainable parameters and lower training costs. LogFormer outperforms existing methods in terms of F1 score, showing strong performance on multi-domain log anomaly detection. The adapter-based tuning mechanism enables efficient transfer learning and reduces the need to retrain models for each new dataset. Ablation studies confirm the importance of pre-training and adapter-based tuning, as well as the effectiveness of the Log-Attention module in capturing semantic information. LogFormer is also shown to be robust in low-resource settings and to perform well on real-world data, demonstrating its practical utility in cloud service environments. Comparisons with large language models further highlight the importance of Log-Attention in capturing the semantics of log data. Overall, LogFormer provides a novel and effective solution for log anomaly detection across different domains.
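To make the two-stage idea concrete, the following is a minimal sketch of adapter-based tuning in Python/PyTorch: a Transformer encoder pre-trained on the source domain is frozen, and only small bottleneck adapters plus a classification head are trained on the target domain. All names (Adapter, AdapterTunedEncoder, bottleneck_dim, layer counts) are illustrative assumptions, not taken from the LogFormer implementation, and the sketch omits the Log-Attention module itself.

```python
# Hypothetical sketch of adapter-based tuning for cross-domain log anomaly
# detection. The pre-trained encoder layers are frozen so the shared semantic
# knowledge from the source domain is preserved; only the lightweight adapters
# and the classifier are updated on the target domain.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))  # residual connection


class AdapterTunedEncoder(nn.Module):
    """Frozen pre-trained encoder with one trainable adapter per layer."""

    def __init__(self, encoder_layers: nn.ModuleList, hidden_dim: int):
        super().__init__()
        self.layers = encoder_layers
        for p in self.layers.parameters():
            p.requires_grad = False  # keep source-domain weights fixed
        self.adapters = nn.ModuleList(
            Adapter(hidden_dim) for _ in range(len(encoder_layers))
        )
        self.classifier = nn.Linear(hidden_dim, 2)  # normal vs. anomalous

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer, adapter in zip(self.layers, self.adapters):
            x = adapter(layer(x))
        return self.classifier(x.mean(dim=1))  # pool over the log sequence


# Usage: the encoder layers stand in for a source-domain pre-trained backbone;
# only adapters and the classifier receive gradients during target-domain tuning.
hidden_dim = 256
pretrained_layers = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=8, batch_first=True)
    for _ in range(4)
)
model = AdapterTunedEncoder(pretrained_layers, hidden_dim)
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)
logits = model(torch.randn(8, 32, hidden_dim))  # (batch, log-sequence, features)
```

Because only the adapters and classifier are optimized, the number of trainable parameters stays small, which is consistent with the paper's claim of lower training cost when transferring to a new log domain.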