Abstract: In aspect-level sentiment texts, many review sentences contain no explicit sentiment words; the study of their sentiment is called aspect-level implicit sentiment analysis. Existing models suffer from two problems: context information related to the aspect words may be lost during pre-training, and deep features in the context cannot be extracted accurately. To address the first problem, this paper constructs an aspect-aware BERT pre-training model, which introduces aspect words into the input embedding structure of the basic BERT so as to generate aspect-related word vectors. To address the second problem, this paper constructs a context-aware attention mechanism: for the deep hidden vectors produced by the encoding layer, semantic and syntactic information is introduced into the attention weight calculation, so that the attention mechanism assigns weights more accurately to the context related to the aspect words. Comparative experiments show that the proposed model outperforms the baseline models.
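The following is a minimal sketch, not the paper's released code, of one way the aspect-aware input embedding could work: the aspect term's embedding is pooled and added to each token's embedding alongside BERT's usual token, position, and segment embeddings, so every word vector is conditioned on the aspect. All names and hyperparameter values (AspectAwareEmbedding, vocab_size, hidden, max_len) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AspectAwareEmbedding(nn.Module):
    """Hypothetical aspect-aware variant of BERT's input embedding layer."""

    def __init__(self, vocab_size=30522, hidden=768, max_len=512, num_segments=2):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, hidden)    # token embeddings
        self.pos = nn.Embedding(max_len, hidden)       # position embeddings
        self.seg = nn.Embedding(num_segments, hidden)  # segment embeddings
        self.norm = nn.LayerNorm(hidden)
        self.drop = nn.Dropout(0.1)

    def forward(self, input_ids, segment_ids, aspect_ids):
        # input_ids:   (batch, seq_len)  sentence token ids
        # segment_ids: (batch, seq_len)  BERT segment ids
        # aspect_ids:  (batch, asp_len)  token ids of the aspect term
        positions = torch.arange(input_ids.size(1), device=input_ids.device)
        # Mean-pool the aspect token embeddings into a single aspect vector
        # (an assumed pooling choice), then broadcast it over the sequence.
        aspect_vec = self.tok(aspect_ids).mean(dim=1, keepdim=True)
        x = self.tok(input_ids) + self.pos(positions) + self.seg(segment_ids)
        x = x + aspect_vec  # every word vector now carries aspect information
        return self.drop(self.norm(x))
```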
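And a sketch, under stated assumptions, of what a context-aware attention step could look like: each hidden vector's attention score combines a semantic term (similarity to the aspect representation) with a syntactic term (here assumed to be a learned bias on the token's dependency-tree distance to the aspect, with `syntax_dist` as a hypothetical precomputed input). The abstract does not specify the exact formulation; this only illustrates folding both signals into the weight calculation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextAwareAttention(nn.Module):
    """Illustrative attention that mixes semantic and syntactic signals."""

    def __init__(self, hidden=768):
        super().__init__()
        self.w_sem = nn.Linear(hidden, hidden)  # projection for the semantic score
        self.w_syn = nn.Linear(1, 1)            # learned scale on the syntactic bias

    def forward(self, hidden_states, aspect_repr, syntax_dist):
        # hidden_states: (batch, seq_len, hidden) deep vectors from the encoder
        # aspect_repr:   (batch, hidden)          pooled aspect representation
        # syntax_dist:   (batch, seq_len)         dependency-tree distance of each
        #                                         token to the aspect (assumed input)
        sem = torch.einsum('bsh,bh->bs', self.w_sem(hidden_states), aspect_repr)
        syn = self.w_syn(syntax_dist.unsqueeze(-1)).squeeze(-1)
        # Subtracting the syntactic bias means syntactically closer tokens
        # receive larger weights, all else being equal.
        weights = F.softmax(sem - syn, dim=-1)
        context = torch.einsum('bs,bsh->bh', weights, hidden_states)
        return context, weights
```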