pinned: false
---
The Knowledge Engineering Group (**KEG**) & Data Mining (THUDM) at Tsinghua University.

We build LLMs:

* **[ChatGLM](https://github.com/THUDM/ChatGLM3)**: Open Bilingual Chat LLMs; the ChatGLM-6B series has attracted over 10,000,000 downloads on Hugging Face.
* **[CodeGeeX](https://github.com/THUDM/CodeGeeX2)**: A Multilingual Code Generation Model (KDD 2023)
* **[CogVLM (VisualGLM)](https://github.com/THUDM/CogVLM)**: An Open Visual Language Model
* **[WebGLM](https://github.com/THUDM/WebGLM)**: An Efficient Web-Enhanced Question Answering System (KDD 2023)
* **[GLM-130B](https://github.com/THUDM/GLM-130B)**: An Open Bilingual Pre-Trained Model (ICLR 2023)
* **[CogView](https://github.com/THUDM/CogView)**: An Open Text-to-Image Generation Model (NeurIPS 2021)
* **[CogVideo](https://github.com/THUDM/CogVideo)**: An Open Text-to-Video Generation Model (ICLR 2023)
* **[AgentTuning](https://github.com/THUDM/AgentTuning)**: Enabling Generalized Agent Abilities for LLMs
We also work on LLM evaluations:

* **[AgentBench](https://github.com/THUDM/AgentBench)**: A Benchmark to Evaluate LLMs as Agents
* **[LongBench](https://github.com/THUDM/LongBench)**: A Bilingual, Multitask Benchmark for Long Context Understanding
We also pre-train graph neural networks:

* **[CogDL](https://github.com/THUDM/CogDL)**: A Library for Graph Deep Learning (WWW 2023)
* **[GraphMAE](https://github.com/THUDM/GraphMAE)**: (Generative) Masked Graph Neural Network Pre-Training (KDD 2022 & [WWW 2023](https://github.com/THUDM/GraphMAE2))
* **[GPT-GNN](https://github.com/acbull/GPT-GNN)**: Generative Graph Neural Network Pre-Training (KDD 2020, MSR, UCLA)
* **[GCC](https://github.com/THUDM/CogDL)**: Contrastive Graph Neural Network Pre-Training (KDD 2020)
* **[SelfKG](https://github.com/THUDM/SelfKG)**: Self-Supervised Learning for Knowledge Graphs (WWW 2022)
We also work on graph embedding theory and systems:

* **[SketchNE](https://github.com/THU-numbda/SketchNE)**: Embedding Billion-Scale Networks Accurately in One Hour (TKDE 2023)
* **[ProNE](https://github.com/THUDM/ProNE)**: Embedding Networks of 100 Million Nodes with a 10-400x Speedup (IJCAI 2019)
* **[NetSMF](https://github.com/xptree/NetSMF)**: Embedding Networks of 100 Million Nodes (WWW 2019)
* **[NetMF](https://github.com/xptree/NetMF)**: Understanding DeepWalk, LINE, PTE, and node2vec as Matrix Factorization (WSDM 2018)
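The NetMF result above states that DeepWalk-style embeddings implicitly factorize a closed-form matrix. As a rough illustration (not the repository's implementation; the function name, toy graph, and parameter defaults here are invented for the sketch), the window-T DeepWalk matrix can be built explicitly and factorized with a truncated SVD:

```python
import numpy as np

def netmf_embed(A, dim=2, window=3, neg=1):
    """Sketch of the NetMF closed form (WSDM 2018): factorize
    log(max(vol(G)/(b*T) * (sum_{r=1}^{T} P^r) D^{-1}, 1))
    with a rank-`dim` SVD, where P = D^{-1} A is the random-walk
    transition matrix, b the number of negative samples,
    and T the context window size."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    D_inv = np.diag(1.0 / deg)
    P = D_inv @ A                      # random-walk transition matrix
    vol = A.sum()                      # volume of the graph
    S = np.zeros_like(P)
    Pr = np.eye(n)
    for _ in range(window):            # accumulate P^1 + ... + P^T
        Pr = Pr @ P
        S += Pr
    M = (vol / (neg * window)) * S @ D_inv
    M = np.log(np.maximum(M, 1.0))     # element-wise truncated log
    U, s, _ = np.linalg.svd(M)
    return U[:, :dim] * np.sqrt(s[:dim])

# Toy 4-node cycle graph (adjacency matrix)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
emb = netmf_embed(A, dim=2)
print(emb.shape)  # (4, 2)
```

The dense formulation is only feasible for small graphs; NetSMF and SketchNE above exist precisely to scale this kind of factorization to hundreds of millions of nodes.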
We started with graphs and networks, and we have loved them ever since:

* **[AMiner](https://www.aminer.cn/)**: An Academic Search and Mining System Since 2006 (KDD 2008, ACM SIGKDD Test of Time Award)