Short-term transformer error prediction based on MHA-CNN-SLSTM and error compensation
DOI: 10.19783/j.cnki.pspc.240256
Key Words: ultra-short-term forecasting; VMD; multi-head attention mechanism; LSTM; error compensation
Authors: CHEN Haoyu1,2, LI Zhenhua1,2, ZHANG Shaozhe3, CHENG Jiangzhou1,2, LI Zhenxing2, QIU Li2

1. Hubei Provincial Key Laboratory for Operation and Control of Cascaded Hydropower Station, China Three Gorges University, Yichang 443002, China
2. College of Electrical Engineering & New Energy, China Three Gorges University, Yichang 443002, China
3. National High Magnetic Field Center, Huazhong University of Science and Technology, Wuhan 430074, China
Abstract: To improve the accuracy of instrument transformer error prediction, an antagonistic search operator strategy and a nonlinear convergence control factor are first introduced to improve the traditional seagull optimization algorithm. The improved seagull optimization algorithm (ISOA) is then used to optimize the key parameters of variational mode decomposition (VMD), realizing adaptive decomposition of the error data. Next, the multi-head attention (MHA) mechanism cross-processes the error-influencing features to mine the correlations among them; the deep relationship between weakly correlated features and the error is established through the relationship between strongly correlated features and the error, avoiding the loss of prediction accuracy that would result from discarding data. Considering the relationship between the training set and the test set, a long short-term memory network that accounts for sample similarity (SLSTM) is proposed to dynamically adjust the network weights and biases. On this basis, the MHA-CNN-SLSTM prediction model is constructed: the error between the predicted and actual values is fed back into the model as a new training set, generating compensation data that corrects the preliminary prediction and further improves its accuracy. Finally, the model is verified with measured data from a transformer, and the results show that the proposed model achieves higher prediction accuracy and better performance.
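The error-compensation step described in the abstract can be sketched in isolation: a preliminary forecast is made, the residual (predicted minus actual) on the training set is modeled by a second pass, and the predicted residual is added back to correct the preliminary forecast. The sketch below is a minimal illustration only; it substitutes a naive persistence forecast and an ordinary least-squares residual model for the paper's MHA-CNN-SLSTM network, and the synthetic error series, lag setup, and variable names are all assumptions, not the authors' implementation.

```python
import numpy as np

# synthetic "transformer error" series: a slow oscillation plus noise (illustrative)
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * rng.standard_normal(400)

# lagged values as input features for one-step-ahead forecasting
lags = 4
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]
split = 300
X_tr, y_tr = X[:split], y[:split]
X_te, y_te = X[split:], y[split:]

# stage 1: preliminary forecast (persistence: repeat the last observed value,
# standing in for the full MHA-CNN-SLSTM predictor)
prelim_tr = X_tr[:, -1]
prelim_te = X_te[:, -1]

# stage 2: fit a model to the stage-1 training residuals, then add the
# predicted residual back onto the preliminary test forecast (compensation)
resid_tr = y_tr - prelim_tr
A_tr = np.c_[X_tr, np.ones(len(X_tr))]          # design matrix with bias column
w, *_ = np.linalg.lstsq(A_tr, resid_tr, rcond=None)
final_te = prelim_te + np.c_[X_te, np.ones(len(X_te))] @ w

rmse = lambda p: float(np.sqrt(np.mean((p - y_te) ** 2)))
print(f"preliminary RMSE:  {rmse(prelim_te):.4f}")
print(f"compensated RMSE: {rmse(final_te):.4f}")
```

On this synthetic series the compensated forecast reduces the test RMSE relative to the preliminary one, which is the mechanism the abstract relies on: any systematic structure left in the stage-1 error is learnable and can be subtracted out.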