May 2020 – Aug 2020

Research Intern

JD AI Research, Mountain View, USA · Supervisors: Dr. Jing Huang & Dr. Guangtao Wang

Research Focus

During my internship at JD AI Research, I focused on commonsense knowledge graph completion and inductive learning for knowledge representation, in collaboration with Stanford University.


Project Overview

The research addressed a critical challenge in commonsense knowledge graphs (CKGs): performing knowledge graph completion for entities unseen during training (the inductive setting). Traditional methods require retraining whenever new entities are added, which is impractical for large-scale CKGs.

We developed InductivE, a novel learning framework for inductive knowledge graph completion. The framework consists of three key components: a free-text encoder for processing entity descriptions, a graph encoder for capturing structural information, and a KG completion decoder for predicting missing relations.
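The three-component pipeline can be sketched in miniature. This is a toy illustration, not the actual implementation: the real framework uses a pre-trained language model as the text encoder and a graph neural network as the graph encoder, and the function names, dimensions, and DistMult-style decoder below are all illustrative assumptions.

```python
import hashlib

DIM = 8  # toy embedding dimension (the real model uses far larger vectors)

def text_encode(description: str) -> list[float]:
    # Stand-in for a pre-trained language model: hash the entity
    # description into a fixed-size vector, so even an entity never
    # seen during training still receives an embedding.
    digest = hashlib.sha256(description.encode()).digest()
    return [b / 255.0 for b in digest[:DIM]]

def graph_encode(entity: str,
                 neighbors: dict[str, list[str]],
                 feats: dict[str, list[float]]) -> list[float]:
    # One round of mean aggregation over graph neighbors,
    # mimicking a single GNN message-passing layer.
    vecs = [feats[entity]] + [feats[n] for n in neighbors.get(entity, [])]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]

def score(head: list[float], rel: list[float], tail: list[float]) -> float:
    # Hypothetical DistMult-style decoder: elementwise product, summed.
    # Higher scores indicate a more plausible (head, relation, tail) triple.
    return sum(h * r * t for h, r, t in zip(head, rel, tail))

# Usage: an unseen entity gets an embedding from its description alone,
# without retraining any model parameters.
feats = {"coffee": text_encode("a brewed caffeinated drink"),
         "alertness": text_encode("the state of being watchful")}
graph = {"coffee": ["alertness"]}
head_vec = graph_encode("coffee", graph, feats)
causes = [1.0] * DIM  # toy relation embedding
plausibility = score(head_vec, causes, feats["alertness"])
```

The key property this sketch demonstrates is the inductive one: because entity embeddings are derived from free text rather than looked up in a fixed table, new entities can be scored immediately.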

Technical Work

  • Designed and implemented the InductivE framework for inductive commonsense knowledge graph completion
  • Developed free-text encoder using pre-trained language models to encode entity descriptions
  • Built graph neural network encoder to capture structural information from the knowledge graph
  • Implemented knowledge graph completion decoder for relation prediction
  • Conducted extensive experiments on ATOMIC and ConceptNet benchmarks

Achievements

  • Achieved over 48% improvement in inductive scenarios compared to existing methods
  • State-of-the-art results on ATOMIC and ConceptNet benchmarks
  • Published research paper at IJCNN 2021 (International Joint Conference on Neural Networks)
  • Framework enables handling of unseen entities without model retraining

Publication