Hi, welcome to my GitHub 👋
I am Xiao Liu, a second-year master's student in Computer Science at Tsinghua University, in the Knowledge Engineering Group (KEG).
- 🔭 Interested in Machine Learning, Data Mining, NLP, and Knowledge Graphs.
- 🌱 Find my up-to-date publication list on Google Scholar! Some of my proud works:

Large Language Model (LLM) Pre-training and Transfer Learning
- P-tuning and P-tuning v2 (ACL'22): pioneering works on prompt tuning
- GLM-130B (ICLR'23): an open bilingual (English & Chinese) pre-trained model with 130 billion parameters based on GLM (ACL'22), and arguably the best open-source LLM so far; it outperforms GPT-3 175B on LAMBADA and MMLU.
- ChatGLM-6B: an open bilingual dialogue language model that requires only 6 GB of GPU memory to run.
Knowledge Graph Construction and Reasoning
- SelfKG (WWW'22): shows that self-supervised entity alignment can be comparable to supervised methods; Best Paper Nominee at WWW 2022
- kgTransformer (KDD'22): pre-training knowledge graph transformers for complex logical reasoning
Self-supervised Learning and its Applications
- Self-supervised Learning: Generative or Contrastive (TKDE'21): one of the most cited surveys on self-supervised learning
- 🤔 Dedicated to building web-scale knowledge systems via both Large Pre-trained Models and Symbolic Graph Reasoning.
- 💬 Feel free to drop me an email for:
  - Any form of collaboration
  - Any issues about my work or code
  - Interesting ideas to discuss, or just a chat





