Representation learning on a knowledge graph aims to capture the graph's patterns as low-dimensional, dense, distributed vectors in a continuous semantic space, and is a powerful technique for predicting missing links in knowledge bases. Knowledge base completion can be viewed as predicting new triples from existing ones, and embedding models are among the most prominent approaches to it. However, most existing knowledge graph embedding models cannot handle unbalanced entities and relations. This paper proposes a new embedding model that offers a general solution without relying on an additional corpus. First, a triple-based neural network is presented that learns a low-dimensional embedding space by maximizing the likelihood of the knowledge base. Second, two procedures for generating positive triples are proposed; they produce positive triples and add them to the training data. These policies can capture rare triples while remaining efficient to compute. Experiments show that the proposed embedding model achieves superior performance.
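The abstract does not specify the network's architecture, so the following is only a minimal sketch of what a triple-based scoring network with positive-sample augmentation might look like: each triple (head, relation, tail) is embedded, concatenated, and scored by a small feed-forward network, and the training set is augmented with extra copies of positive triples. All names, dimensions, and the augmentation rule here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: entity/relation ids (hypothetical data, not from the paper).
n_entities, n_relations, dim = 5, 2, 8

# Randomly initialized embedding tables.
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

# One hidden layer scoring the concatenated (head, relation, tail) embeddings.
W1 = rng.normal(scale=0.1, size=(3 * dim, dim))
w2 = rng.normal(scale=0.1, size=dim)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def score(h, r, t):
    """Plausibility of triple (h, r, t), a value in (0, 1)."""
    x = np.concatenate([E[h], R[r], E[t]])
    hidden = np.tanh(x @ W1)
    return sigmoid(hidden @ w2)

# Observed positive triples, plus extra positives produced by a stand-in
# augmentation policy (here simply re-weighting a rare triple by duplication;
# the paper's two generation procedures are not specified in the abstract).
triples = [(0, 0, 1), (1, 1, 2), (3, 0, 4)]
augmented = triples + [(3, 0, 4)]

# Log-likelihood of the (augmented) knowledge base under the current model;
# training would adjust E, R, W1, w2 to maximize this quantity.
log_likelihood = sum(np.log(score(h, r, t)) for h, r, t in augmented)
```

In a full implementation the parameters would be trained by gradient ascent on this likelihood (with negative sampling or a closed-world assumption supplying non-triples), but the sketch above only shows the scoring and augmentation structure.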
Haghani, S., & Keyvanpour, M. R. (2022). Embedding Knowledge Graph through Triple Base Neural Network and Positive Samples. Computer and Knowledge Engineering, 5(2), 11-20. doi: 10.22067/cke.2022.73802.1038