Differentiating Concepts and Instances for Knowledge Graph Embedding

November 12, 2018 · Entered Twilight · 🏛 Conference on Empirical Methods in Natural Language Processing

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 5.0 years ago (β‰₯5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .DS_Store, README.md, data, py_version, src, vector

Authors: Xin Lv, Lei Hou, Juanzi Li, Zhiyuan Liu
arXiv ID: 1811.04588
Category: cs.AI: Artificial Intelligence
Cross-listed: cs.CL
Citations: 100
Venue: Conference on Empirical Methods in Natural Language Processing
Repository: https://github.com/davidlvxin/TransC ⭐ 79
Last Checked: 1 month ago
Abstract
Concepts, which represent a group of different instances sharing common properties, are essential information in knowledge representation. Most conventional knowledge embedding methods encode both entities (concepts and instances) and relations as vectors in a low-dimensional semantic space equally, ignoring the difference between concepts and instances. In this paper, we propose a novel knowledge graph embedding model named TransC by differentiating concepts and instances. Specifically, TransC encodes each concept in a knowledge graph as a sphere and each instance as a vector in the same semantic space. We use the relative positions to model the relations between concepts and instances (i.e., instanceOf), and the relations between concepts and sub-concepts (i.e., subClassOf). We evaluate our model on both link prediction and triple classification tasks on a dataset based on YAGO. Experimental results show that TransC outperforms state-of-the-art methods, and captures the semantic transitivity of the instanceOf and subClassOf relations. Our code and datasets can be obtained from https://github.com/davidlvxin/TransC.
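The sphere-based geometry described in the abstract can be sketched in a few lines. This is a minimal illustration of the intuition, not the authors' implementation: the function names, toy vectors, and the exact containment formulas are assumptions; TransC's actual training uses margin-based losses over these distances.

```python
import numpy as np

def instanceof_score(i, p, m):
    """Distance by which instance point i falls outside the concept
    sphere with center p and radius m; 0 means i lies inside the
    sphere, i.e. the instanceOf triple holds geometrically."""
    return max(np.linalg.norm(i - p) - m, 0.0)

def subclassof_score(p_i, m_i, p_j, m_j):
    """Distance by which concept sphere (p_i, m_i) fails to be
    contained in sphere (p_j, m_j); 0 means full containment,
    i.e. the subClassOf triple holds geometrically."""
    return max(np.linalg.norm(p_i - p_j) + m_i - m_j, 0.0)

# Toy 2-D example (hypothetical embeddings): "dog" subClassOf "animal",
# instance "lassie" instanceOf "dog".
animal_c, animal_r = np.zeros(2), 2.0
dog_c, dog_r = np.array([0.5, 0.0]), 1.0
lassie = np.array([0.8, 0.2])

print(instanceof_score(lassie, dog_c, dog_r))              # 0.0
print(subclassof_score(dog_c, dog_r, animal_c, animal_r))  # 0.0
# Transitivity falls out of the geometry: a point inside the inner
# sphere is necessarily inside the outer one.
print(instanceof_score(lassie, animal_c, animal_r))        # 0.0
```

Sphere containment is what gives the model the semantic transitivity the abstract highlights: nested spheres automatically make every instance of a sub-concept an instance of its super-concept, which flat vector embeddings do not guarantee.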
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Artificial Intelligence