Aspect-based implicit sentiment analysis model based on BERT and attention mechanism
DOI:
Author:
Affiliation:

Automation Institute, Nanjing University of Information Science and Technology

CLC Number: TP391

Fund Project:

Abstract:

    Aspect-level sentiment texts contain many review sentences that express an opinion without any explicit sentiment words; the study of their sentiment is called aspect-level implicit sentiment analysis. Existing models suffer from two problems: context information related to the aspect word may be lost during pre-training, and deep features in the context cannot be extracted accurately. To address the first problem, this paper constructs an aspect-aware BERT pre-training model, introducing aspect words into the input embedding structure of basic BERT so that the generated word vectors are conditioned on the aspect words. To address the second problem, this paper constructs a context-aware attention mechanism: for the deep hidden vectors produced by the encoding layer, semantic and syntactic information is introduced into the attention weight calculation, so that the attention mechanism assigns weights more accurately to the context related to the aspect words. Comparative experiments show that the proposed model outperforms the baseline models.
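The abstract states that aspect words are introduced into BERT's input embedding structure so the encoder produces aspect-conditioned word vectors, but it does not give the exact construction. A minimal sketch, assuming the common sentence-pair style of BERT input (the aspect term appended as a second segment, distinguished by segment embeddings), could look like this; the function name and token layout are illustrative assumptions, not the paper's verbatim method:

```python
def build_aspect_aware_input(sentence_tokens, aspect_tokens):
    """Build a [CLS] sentence [SEP] aspect [SEP] token pair with segment ids.

    Segment 0 covers the sentence, segment 1 the aspect term, so the
    model's segment embeddings let the encoder distinguish context
    tokens from the aspect word they should be conditioned on.
    """
    tokens = ["[CLS]"] + sentence_tokens + ["[SEP]"] + aspect_tokens + ["[SEP]"]
    segment_ids = [0] * (len(sentence_tokens) + 2) + [1] * (len(aspect_tokens) + 1)
    return tokens, segment_ids

# Example: an implicit-sentiment review with aspect "battery"
tokens, segs = build_aspect_aware_input(
    ["the", "battery", "drains", "fast"], ["battery"])
```

Feeding such a pair to a BERT encoder makes every contextual hidden vector a function of both the sentence and the aspect word, which is one way to avoid losing aspect-related context during pre-training.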

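The context-aware attention mechanism described above introduces semantic and syntactic information into the attention weight calculation. The abstract does not give the formula, so the sketch below is a hedged illustration of the general idea: a scaled-dot-product semantic score is combined with a syntactic bias (here, a hypothetical dependency-tree distance from each token to the aspect word) before the softmax, so tokens that are both semantically similar and syntactically close to the aspect receive larger weights:

```python
import numpy as np

def context_aware_attention(hidden, aspect_vec, syn_dist, alpha=1.0):
    """Attend over encoder hidden vectors, biased toward the aspect's context.

    hidden:     (n, d) deep hidden vectors from the encoding layer
    aspect_vec: (d,)   aspect-word representation
    syn_dist:   (n,)   syntactic distance of each token to the aspect word
                       (hypothetical auxiliary input; the paper's exact
                       syntactic feature is not specified in the abstract)
    alpha:      weight of the syntactic bias term
    """
    # Semantic score: scaled dot product between each token and the aspect.
    semantic = hidden @ aspect_vec / np.sqrt(hidden.shape[1])
    # Syntactic score: tokens closer to the aspect in the parse get a bonus.
    syntactic = -alpha * np.asarray(syn_dist, dtype=float)
    scores = semantic + syntactic
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden, weights          # attended context vector, weights
```

With equal semantic scores, the syntactic term alone makes the mechanism concentrate on tokens near the aspect word, which matches the stated goal of assigning weight more accurately to aspect-related context.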
History
  • Received: September 14, 2022
  • Revised: October 01, 2022
  • Accepted: November 07, 2022
  • Online:
  • Published:

Address: No. 219, Ningliu Road, Nanjing, Jiangsu Province

Postcode: 210044

Phone: 025-58731025