This paper presents a novel neural network architecture for named entity recognition (NER) that combines a bidirectional LSTM (BLSTM) with a convolutional neural network (CNN) to automatically detect word- and character-level features. The authors propose a new method for encoding partial lexicon matches in neural networks and compare it to existing approaches. Extensive evaluation shows that the system, trained only on tokenized text and publicly available word embeddings, performs competitively on the CoNLL-2003 dataset and surpasses the previous state of the art on the OntoNotes 5.0 dataset by 2.13 F1 points. With two lexicons constructed from publicly available sources, the system achieves new state-of-the-art performance, with F1 scores of 91.62 on CoNLL-2003 and 86.28 on OntoNotes, outperforming systems that rely on heavy feature engineering, proprietary lexicons, and rich entity linking information. The paper also examines the effectiveness of different word embeddings and the impact of dropout on model performance.
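As a rough illustration of the architecture summarized above, the sketch below shows a minimal BLSTM-CNN tagger in PyTorch: a character-level CNN produces per-word character features, which are concatenated with word embeddings and fed to a BLSTM. All layer names, hyperparameters, and the per-token softmax output are illustrative assumptions, not the authors' exact configuration; the paper's model additionally uses capitalization and lexicon-match features and decodes with a sentence-level log-likelihood objective.

```python
# Minimal sketch of a BLSTM-CNN tagger in the spirit of the paper.
# Sizes and names are illustrative assumptions, not the authors' setup.
import torch
import torch.nn as nn


class BLSTMCNNTagger(nn.Module):
    def __init__(self, vocab_size, char_vocab_size, num_tags,
                 word_dim=100, char_dim=25, char_filters=30,
                 kernel_size=3, hidden_dim=200):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab_size, char_dim, padding_idx=0)
        # Character-level CNN: convolve over each word's characters,
        # then max-pool over time to get a fixed-size character feature.
        self.char_cnn = nn.Conv1d(char_dim, char_filters, kernel_size,
                                  padding=kernel_size // 2)
        # BLSTM over the concatenated word + character representations.
        self.blstm = nn.LSTM(word_dim + char_filters, hidden_dim,
                             batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, words, chars):
        # words: (batch, seq_len); chars: (batch, seq_len, max_word_len)
        batch, seq_len, word_len = chars.shape
        c = self.char_emb(chars).view(batch * seq_len, word_len, -1)
        c = self.char_cnn(c.transpose(1, 2))   # (B*S, filters, word_len)
        c = c.max(dim=2).values.view(batch, seq_len, -1)
        x = torch.cat([self.word_emb(words), c], dim=-1)
        h, _ = self.blstm(x)
        # The paper decodes with a sentence-level log-likelihood
        # (CRF-style); per-token tag scores are used here for simplicity.
        return self.out(h)


# Toy usage: 2 sentences of 5 words, each word up to 8 characters.
model = BLSTMCNNTagger(vocab_size=1000, char_vocab_size=80, num_tags=9)
words = torch.randint(1, 1000, (2, 5))
chars = torch.randint(1, 80, (2, 5, 8))
scores = model(words, chars)  # (2, 5, 9) tag scores per token
print(scores.shape)
```

In this sketch the lexicon features mentioned in the abstract would enter as additional vectors concatenated alongside the word and character features before the BLSTM; the paper's contribution is a BIOES-style encoding of partial lexicon matches rather than simple binary match flags.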