This paper presents two target-dependent long short-term memory (LSTM) models for target-dependent sentiment classification, both of which incorporate target information to improve classification accuracy. The first model, TD-LSTM, uses two LSTM networks to separately model the preceding and following contexts of the target. The second model, TC-LSTM, extends TD-LSTM by explicitly modeling the connections between the target and each of its context words. Both models are trained end to end with standard backpropagation and achieve state-of-the-art performance on a benchmark Twitter dataset without relying on syntactic parsers or external sentiment lexicons.
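The two architectures can be summarized in a short sketch. A minimal NumPy illustration follows, assuming the standard LSTM gate equations; the weight shapes, initialization, and helper names (`lstm_last_hidden`, `td_lstm`) are illustrative assumptions, not the paper's exact implementation. TD-LSTM runs one LSTM over the left context plus the target and another, in reverse, over the right context plus the target, then classifies on the concatenated final hidden states; TC-LSTM additionally appends the averaged target embedding to every word embedding before the LSTMs run.

```python
import numpy as np

def lstm_last_hidden(inputs, W, U, b, hidden=4):
    """Run a plain LSTM over `inputs` (seq_len x input_dim) and return
    the final hidden state. Gates use the i, f, o, g ordering, stacked
    row-wise into W, U, b."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in inputs:
        z = W @ x + U @ h + b                       # (4*hidden,)
        i, f, o = 1 / (1 + np.exp(-z[:3 * hidden].reshape(3, hidden)))
        g = np.tanh(z[3 * hidden:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

def td_lstm(embed, l_idx, r_idx, params_l, params_r, W_cls, target_vec=None):
    """TD-LSTM: one LSTM reads left context + target left-to-right,
    another reads right context + target right-to-left; their final
    hidden states are concatenated and fed to a softmax classifier.
    Passing `target_vec` (the averaged target embedding) turns this
    into TC-LSTM: the vector is appended to every word embedding."""
    left = embed[:r_idx + 1]        # left context + target words
    right = embed[l_idx:][::-1]     # target + right context, reversed
    if target_vec is not None:      # TC-LSTM: concat target vector
        left = np.hstack([left, np.tile(target_vec, (len(left), 1))])
        right = np.hstack([right, np.tile(target_vec, (len(right), 1))])
    h_l = lstm_last_hidden(left, *params_l)
    h_r = lstm_last_hidden(right, *params_r)
    scores = W_cls @ np.concatenate([h_l, h_r])
    e = np.exp(scores - scores.max())
    return e / e.sum()              # class probabilities

# Toy usage: a 7-word sentence with the target at positions 2..3.
rng = np.random.default_rng(0)
emb_dim, hid, n_cls = 5, 4, 3
sent = rng.normal(size=(7, emb_dim))
target_vec = sent[2:4].mean(axis=0)            # averaged target embedding
make = lambda d: (rng.normal(size=(4 * hid, d)) * 0.1,   # W
                  rng.normal(size=(4 * hid, hid)) * 0.1, # U
                  np.zeros(4 * hid))                     # b
# TC-LSTM input dim is 2*emb_dim because the target vector is appended.
probs = td_lstm(sent, 2, 3, make(2 * emb_dim), make(2 * emb_dim),
                rng.normal(size=(n_cls, 2 * hid)), target_vec)
```

Dropping the `target_vec` argument (and building the weight matrices with input dimension `emb_dim`) recovers plain TD-LSTM.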
Empirical results show that incorporating target information into the LSTM significantly improves classification accuracy, with TC-LSTM performing best among the models and outperforming the baseline methods. The paper also evaluates the effect of different word embeddings and finds that TC-LSTM's performance depends on the choice of embedding. The models are shown to effectively capture the semantic relatedness between a target and its context words, leading to improved sentiment classification. The study highlights the importance of target information in sentiment classification and demonstrates the effectiveness of neural network approaches in capturing it.