The paper introduces the Attention-Based Convolutional Neural Network (ABCNN) for modeling sentence pairs, addressing the problem of how to effectively model pairs of sentences in NLP tasks such as answer selection (AS), paraphrase identification (PI), and textual entailment (TE). The authors propose three attention schemes that integrate mutual influence between sentences into CNNs, so that the representation of each sentence takes its counterpart into account. The ABCNN is a general architecture applicable to a wide range of sentence pair modeling tasks. The paper presents three variants (ABCNN-1, ABCNN-2, and ABCNN-3) that differ in where the attention mechanism is applied, with ABCNN-3 combining the other two and achieving the best performance. Experiments on AS, PI, and TE show that the attention-based CNNs outperform non-attention CNNs, achieving state-of-the-art results on AS and TE and competitive results on PI. The authors also release the code for their system at a specified GitHub repository.
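To make the attention idea concrete, the following is a minimal NumPy sketch of an ABCNN-1-style attention matrix between the two sentences' feature maps, where each entry scores how well a unit of one sentence matches a unit of the other, and learned projections turn that matrix into extra "attention feature maps". The dimensions, weight names, and toy inputs are illustrative assumptions for this sketch, not the authors' released implementation.

```python
import numpy as np

def match_score(x, y):
    # Similarity used to build the attention matrix; the paper uses
    # 1 / (1 + Euclidean distance). Other measures (e.g. cosine) are possible.
    return 1.0 / (1.0 + np.linalg.norm(x - y))

def attention_matrix(F0, F1):
    # F0: (d, s0) feature map of sentence 0; F1: (d, s1) feature map of sentence 1.
    # Returns A with A[i, j] = match_score(column i of F0, column j of F1).
    s0, s1 = F0.shape[1], F1.shape[1]
    A = np.zeros((s0, s1))
    for i in range(s0):
        for j in range(s1):
            A[i, j] = match_score(F0[:, i], F1[:, j])
    return A

# ABCNN-1-style use (sketch): project A into "attention feature maps" via
# learned matrices W0 (d x s1) and W1 (d x s0); these are stacked with
# F0 / F1 as an additional input channel to the convolution layer.
rng = np.random.default_rng(0)
d, s0, s1 = 4, 5, 6                      # toy dimensions, not the paper's settings
F0, F1 = rng.normal(size=(d, s0)), rng.normal(size=(d, s1))
W0, W1 = rng.normal(size=(d, s1)), rng.normal(size=(d, s0))

A = attention_matrix(F0, F1)
F0_attn = W0 @ A.T                        # (d, s0): attention feature map for sentence 0
F1_attn = W1 @ A                          # (d, s1): attention feature map for sentence 1
```

ABCNN-2 instead applies a similar attention matrix after convolution to re-weight units before pooling, and ABCNN-3 stacks both mechanisms; the sketch above only illustrates the shared attention-matrix computation.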