Citation: QI Yong, SHEN Wei. Knowledge Graph Representation Based on Aggregated Local Neighbor Triples and Feature-Enhanced Attention [J]. Software Engineering, 2024, 27(12): 56-62.
CLC number: TP391    Document code: A
Funding: Local Service Special Program of the Shaanxi Provincial Department of Education (22JC019)
Knowledge Graph Representation Based on Aggregated Local Neighbor Triples and Feature-Enhanced Attention
QI Yong, SHEN Wei
(School of Electronic Information and Artificial Intelligence, Shaanxi University of Science & Technology, Xi'an 710021, China)
qiyong@sust.edu.cn; 221612163@sust.edu.cn
Abstract: To address the loss of semantic and structural association information among triples in Transformer-based knowledge representation models, a new knowledge graph representation framework, CNAR, is proposed. First, an aggregated local neighbor triples technique is designed on top of Transformer, which effectively enriches the diversity of semantic and structural information; feature-enhanced attention is then employed to design weights that accurately characterize the degree of association among triples. The method has been successfully applied to downstream tasks and validated on four datasets, including the self-built robot dataset ROBOT. Experimental results show that on the WN18RR and FB15K-237 datasets, the MRR achieved by the method improves on the baseline average by 10.9 and 17.1 percentage points respectively, while Hits@10 improves by 13.1 percentage points on WN18RR and remains balanced on FB15K-237. Furthermore, on the UMLS and ROBOT datasets, both metrics also reach or approach optimal performance, demonstrating the effectiveness and applicability of the method.
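The abstract reports results in terms of MRR and Hits@10, the standard link-prediction metrics for knowledge graph embedding. As a quick reference, both can be computed from the rank of the correct entity among all candidates for each test triple (the `ranks` values below are hypothetical, not the paper's data):

```python
def mrr(ranks):
    """Mean Reciprocal Rank: average of 1/rank over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k=10):
    """Hits@k: fraction of test triples whose correct entity ranks in the top k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 12, 2, 7]           # hypothetical ranks for five test triples
print(round(mrr(ranks), 4))        # 0.4119
print(hits_at_k(ranks, k=10))      # 0.8
```

A "percentage point" improvement in the abstract refers to the absolute difference between two such metric values expressed as percentages.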
Keywords: language models; Transformer; aggregated local neighbor triples; feature-enhanced attention
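The abstract does not give CNAR's equations, but the core idea of aggregating local neighbor triples with attention weights can be sketched as follows. Everything here is an illustrative assumption: the embedding dimension, the plain dot-product scoring (standing in for the paper's feature-enhanced attention), and the additive fusion are placeholders, not the published formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                   # embedding dimension (assumed)
center = rng.normal(size=d)             # embedding of the center triple
neighbors = rng.normal(size=(5, d))     # embeddings of 5 local neighbor triples

def softmax(x):
    e = np.exp(x - x.max())             # shift for numerical stability
    return e / e.sum()

# Attention weights from similarity with the center triple; a plain dot
# product stands in for the feature-enhanced scoring function.
weights = softmax(neighbors @ center)

# Aggregate: attention-weighted sum of neighbor features, fused with the
# center representation to enrich its semantic/structural context.
aggregated = weights @ neighbors
enriched = center + aggregated
print(enriched.shape)                   # (8,)
```

The weights sum to 1, so stronger-scoring neighbors contribute proportionally more to the enriched representation, which is the role the abstract assigns to feature-enhanced attention.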


Copyright: Software Engineering Magazine
Address: No. 2 Xinxiu Street, Hunnan District, Shenyang, Liaoning Province  Postal code: 110179
Tel: 0411-84767887  Fax: 0411-84835089  Email: semagazine@neusoft.edu.cn
ICP filing number: 辽ICP备17007376号-1
Technical support: Beijing Qinyun Technology Development Co., Ltd.
