Research on primary equipment defect diagnosis method based on the BERT model
DOI: 10.19783/j.cnki.pspc.240485
Key Words: defect diagnosis; large language model; BERT; prompt learning; classification method
Authors: YANG Hong, MENG Xiaokai, YU Hua, BAI Yang, HAN Yu, LIU Yongxin (all: State Grid Shanxi Electric Power Research Institute, Taiyuan 030002, China)
|
Abstract: Primary equipment defect diagnosis aims to promptly locate and address abnormal conditions in the power grid, and is a foundation for the stable operation of the power system. Traditional methods rely heavily on manual effort, leading to low efficiency, high diagnostic cost, and dependence on expert experience. To overcome these limitations, this paper proposes a primary equipment defect diagnosis method based on language models such as BERT. First, the BERT model is employed to preliminarily comprehend the input text and obtain embedded representations, which are then used in a defect level classification task to assess the severity of the defect. Subsequently, a large language model consolidates the input information and the classification result, and prompt learning is applied to improve the accuracy and reasoning reliability of the knowledge-based question answering, so that correct and effective answers are produced. Finally, potential applications of large language models in the power industry are explored. Experimental results show that the method performs well in both defect level classification and question-answering tasks, generating high-quality classification evidence and guidance information.
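The abstract describes a three-stage pipeline: encode the defect report with BERT, classify the defect severity level from the embedding, then assemble the report and the predicted level into a prompt for a large language model. The sketch below illustrates that data flow only; the encoder is a self-contained stand-in (not a real BERT), and the severity labels, function names, and prompt wording are all assumptions for illustration. In practice the embedding would come from a pretrained BERT (e.g. via the `transformers` library) and the head would be trained on labeled defect records.

```python
# Illustrative sketch of the pipeline from the abstract:
# embed -> classify severity -> build LLM prompt.
# The embedding function is a hash-based stand-in so the example
# runs without model weights; it is NOT the paper's actual model.
import hashlib

SEVERITY_LEVELS = ["general", "serious", "critical"]  # assumed labels


def embed(text: str, dim: int = 8) -> list:
    """Stand-in for a BERT pooled embedding of the defect report."""
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [b / 255.0 for b in digest[:dim]]


def classify_severity(embedding: list) -> str:
    """Toy linear head mapping the embedding to a severity level."""
    score = sum(embedding) / len(embedding)
    index = min(int(score * len(SEVERITY_LEVELS)), len(SEVERITY_LEVELS) - 1)
    return SEVERITY_LEVELS[index]


def build_prompt(report: str, level: str) -> str:
    """Consolidate report and predicted level into a prompt for the LLM
    (the prompt-learning step; wording here is purely illustrative)."""
    return (
        f"Defect report: {report}\n"
        f"Predicted severity level: {level}\n"
        "Explain the classification evidence and suggest handling guidance."
    )


report = "Oil leakage observed at the transformer bushing flange."
level = classify_severity(embed(report))
prompt = build_prompt(report, level)
print(prompt)
```

The key design point the abstract highlights is that the classifier's output is fed back into the LLM prompt, so the question-answering stage can justify the predicted level rather than classify from scratch.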
|
|
|