JD AI Research (2020)
Position
Research Intern (05/2020 - 08/2020), JD AI Research, Mountain View, USA.
Supervised by Dr. Jing Huang & Dr. Guangtao Wang (JD AI Research), in collaboration with Stanford University.
Research
During my internship at JD AI Research, I focused on commonsense knowledge graph completion and inductive learning for knowledge representation.
The research addressed a critical challenge in commonsense knowledge graphs (CKGs): completing the graph for entities unseen during training (the inductive setting). Traditional transductive methods embed only the entities seen during training and must be retrained whenever new entities are added, which is impractical for large-scale CKGs.
We developed InductivE, a novel learning framework for inductive knowledge graph completion. The framework consists of three key components: a free-text encoder for processing entity descriptions, a graph encoder for capturing structural information, and a KG completion decoder for predicting missing relations.
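The three-component pipeline can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: the hashing bag-of-words `text_encoder` stands in for the pre-trained language model, the neighbor-averaging `graph_encoder` stands in for the graph neural network, and a DistMult-style bilinear score stands in for the actual completion decoder; all names and the dimension are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # illustrative embedding dimension

def text_encoder(description: str) -> np.ndarray:
    """Stand-in for the pre-trained language model: maps an entity's
    free-text description to a fixed-size, unit-norm embedding by
    hashing tokens into a bag-of-words vector (illustration only)."""
    vec = np.zeros(DIM)
    for token in description.lower().split():
        vec[hash(token) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def graph_encoder(entity_vec: np.ndarray, neighbor_vecs: list) -> np.ndarray:
    """Stand-in for a GNN layer: mixes the entity's own embedding with
    the mean of its neighbors' embeddings to inject structure."""
    if not neighbor_vecs:
        return entity_vec
    return 0.5 * entity_vec + 0.5 * np.mean(neighbor_vecs, axis=0)

def decoder_score(head: np.ndarray, rel: np.ndarray, tail: np.ndarray) -> float:
    """DistMult-style bilinear scoring as a simple stand-in decoder:
    higher scores mean a more plausible (head, relation, tail) triple."""
    return float(np.sum(head * rel * tail))

# Toy usage: score one candidate triple.
head = graph_encoder(text_encoder("person drinks coffee"),
                     [text_encoder("coffee is a beverage")])
tail = text_encoder("person feels awake")
rel = rng.standard_normal(DIM)  # illustrative relation embedding
score = decoder_score(head, rel, tail)
```

Because the entity representation is derived from its description rather than a lookup table, the same pipeline can produce embeddings for entities that were never seen during training.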
Technical Work
- Designed and implemented the InductivE framework for inductive commonsense knowledge graph completion
- Developed a free-text encoder using pre-trained language models to encode entity descriptions
- Built a graph neural network encoder to capture structural information from the knowledge graph
- Implemented a knowledge graph completion decoder for relation prediction
- Conducted extensive experiments on ATOMIC and ConceptNet benchmarks
Key Achievements
- Achieved over 48% improvement over existing methods in inductive scenarios
- Achieved state-of-the-art results on the ATOMIC and ConceptNet benchmarks
- Published a research paper at IJCNN 2021 (International Joint Conference on Neural Networks)
- The framework handles unseen entities without model retraining
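The no-retraining property can be illustrated with a small sketch: an unseen head entity is embedded from its text description by a frozen encoder, and candidate tails are ranked immediately. The hashing `embed_description` encoder, the relation vector, and the candidate set are all hypothetical stand-ins for illustration, not the paper's components.

```python
import numpy as np

DIM = 8  # illustrative embedding dimension

def embed_description(description: str) -> np.ndarray:
    """Hypothetical frozen text encoder: hashing bag-of-words stand-in
    for the pre-trained language model."""
    vec = np.zeros(DIM)
    for token in description.lower().split():
        vec[hash(token) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def rank_tails(head_desc: str, rel_vec: np.ndarray, candidates: dict) -> list:
    """Rank candidate tails for an UNSEEN head entity. Because the head
    embedding comes from its description, no retraining is needed when
    the entity is new to the graph."""
    h = embed_description(head_desc)
    scores = {name: float(np.sum(h * rel_vec * embed_description(desc)))
              for name, desc in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)

rel = np.ones(DIM)  # illustrative relation embedding
candidates = {"feel awake": "person feels awake",
              "feel sleepy": "person feels sleepy"}
ranking = rank_tails("person drinks strong coffee", rel, candidates)
```

In a transductive model the new head entity would have no embedding at all; here the description-based encoder supplies one on the fly, which is the essence of the inductive setting described above.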
Publication
Inductive Learning on Commonsense Knowledge Graph Completion
Bin Wang, Guangtao Wang, Jing Huang, Jiaxuan You, Jure Leskovec, C.-C. Jay Kuo
IJCNN (International Joint Conference on Neural Networks), 2021 · arXiv