Abstract: Input load features contribute unequally to the decomposition result, and the limited ability of LSTM to capture long-term temporal dependencies in power-consumption data leads to large decomposition errors. To address these issues, a non-intrusive load decomposition model integrating multiple attention mechanisms is proposed. First, a ProbSparse self-attention mechanism weights the load features extracted by one-dimensional dilated convolution. Then, a temporal pattern attention mechanism weights the hidden states of the LSTM, strengthening the network's ability to learn long-term temporal dependencies in the power-consumption data. Finally, the proposed model is validated on the public UK-DALE and REDD datasets. Experimental results show that, compared with other decomposition algorithms, the proposed multi-attention model not only selects the important load features but also correctly models the temporal dependencies among them, effectively reducing the decomposition error.
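The ProbSparse self-attention step named above can be illustrated with a minimal NumPy sketch. The idea (following the Informer-style formulation) is to score each query by a sparsity measure, attend with only the top-u "active" queries, and let the remaining queries fall back to the mean of the values. The function names, the toy feature matrix, and the choice of u below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def probsparse_select(Q, K, u):
    """Rank queries by the sparsity measure
    M(q, K) = max_j(q . k_j / sqrt(d)) - mean_j(q . k_j / sqrt(d))
    and keep the top-u queries for full attention."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (n_q, n_k) scaled dot products
    M = scores.max(axis=1) - scores.mean(axis=1)  # sparsity measure per query
    top = np.argsort(M)[-u:]                      # indices of the u most active queries
    return top, scores

def probsparse_attention(Q, K, V, u):
    """Attention output where only the top-u queries attend to the keys;
    the remaining ("lazy") queries default to the mean of V."""
    top, scores = probsparse_select(Q, K, u)
    out = np.tile(V.mean(axis=0), (Q.shape[0], 1))  # lazy-query default
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)               # row-wise softmax over keys
    out[top] = w @ V
    return out

rng = np.random.default_rng(0)
n, d = 16, 8
x = rng.normal(size=(n, d))                 # toy load-feature sequence (hypothetical)
y = probsparse_attention(x, x, x, u=4)      # self-attention with 4 active queries
print(y.shape)                              # (16, 8)
```

In the full model this weighting would be applied to the dilated-convolution feature maps before the LSTM, so that the downstream recurrence concentrates on the queries the sparsity measure marks as informative.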