Update README.md
README.md CHANGED
---
license: cc-by-nc-4.0
---

<h1 align="center">HIGHT: Hierarchical Graph Tokenization for Graph-Language Alignment</h1>
<p align="center">
<a href="https://arxiv.org/abs/2406.14021"><img src="https://img.shields.io/badge/arXiv-2406.14021-b31b1b.svg" alt="Paper"></a>
<a href="https://github.com/LFhase/HIGHT"><img src="https://img.shields.io/badge/-Github-grey?logo=github" alt="Github"></a>
<!-- <a href="https://colab.research.google.com/drive/1t0_4BxEJ0XncyYvn_VyEQhxwNMvtSUNx?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Colab"></a> -->
<a href="https://arxiv.org/abs/2406.14021"> <img alt="Pub" src="https://img.shields.io/static/v1?label=Pub&message=ICML%2725&color=blue"> </a>
<!-- <a href="https://github.com/LFhase/HIGHT/blob/main/LICENSE"> <img alt="License" src="https://img.shields.io/github/license/LFhase/CIGA?color=blue"> </a> -->
<!-- <a href="https://icml.cc/virtual/2024/poster/3455"> <img src="https://img.shields.io/badge/Video-grey?logo=Kuaishou&logoColor=white" alt="Video"></a> -->
<!-- <a href="https://lfhase.win/files/slides/HIGHT.pdf"> <img src="https://img.shields.io/badge/Slides-grey?&logo=MicrosoftPowerPoint&logoColor=white" alt="Slides"></a> -->
<!-- <a href="https://icml.cc/media/PosterPDFs/ICML%202022/a8acc28734d4fe90ea24353d901ae678.png"> <img src="https://img.shields.io/badge/Poster-grey?logo=airplayvideo&logoColor=white" alt="Poster"></a> -->
</p>

This repo contains the model checkpoints of our ICML 2025 paper *[Hierarchical Graph Tokenization for Molecule-Language Alignment](https://arxiv.org/abs/2406.14021)*, which was also presented at the ICML 2024 workshop on [Foundation Models in the Wild](https://icml.cc/virtual/2024/workshop/29954). 😆😆😆

## File Structures

The pretrained Hierarchical VQ-VAE model is stored in `hivqvae.pth`.
The checkpoints of the graph-language models based on llama2-7b-chat and vicuna-v1-3-7b are contained in `/llama2` and `/vicuna`, respectively.
Inside each directory, the checkpoints are organized as follows (using vicuna as an example; minimal loading sketches are given after the list):
- `llava-hvqvae2-vicuna-v1-3-7b-pretrain`: the model after stage-1 pretraining;
- `graph-text-molgen`: models finetuned on Mol-Instructions data for different tasks, e.g., forward reaction prediction;
- `molcap-llava-hvqvae2-vicuna-v1-3-7b-finetune_lora-50ep`: the model finetuned on the CHEBI-20 dataset for molecular captioning;
- `MoleculeNet-llava-hvqvae2-vicuna-v1-3-7b-finetune_lora-large*`: models finetuned on different classification-based molecular property prediction tasks from MoleculeNet.
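
To peek at the checkpoints without the training code, the snippet below is a minimal sketch (not part of the official codebase): it downloads the files from the Hugging Face Hub and loads `hivqvae.pth` with PyTorch. The `repo_id` is a placeholder, and whether the file holds a bare state dict or a wrapper dict is an assumption to check.

```python
# Hypothetical sketch: download this repo and inspect the hierarchical VQ-VAE checkpoint.
# "LFhase/HIGHT-checkpoints" is a placeholder repo id; substitute the actual one.
import torch
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="LFhase/HIGHT-checkpoints")  # placeholder id
ckpt = torch.load(f"{local_dir}/hivqvae.pth", map_location="cpu")

# The checkpoint may be a plain state dict or a dict wrapping one
# (e.g., under a "state_dict" or "model" key); list the top-level keys to find out.
keys = list(ckpt.keys()) if isinstance(ckpt, dict) else []
print(type(ckpt).__name__, keys[:10])
```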
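
The `*-finetune_lora-*` directories suggest LoRA-style adapters on top of the base LLM. Purely as an illustration, and assuming a standard PEFT adapter layout (the LLaVA-style HIGHT models add graph-specific components on top of the LLM, so use the [official repo](https://github.com/LFhase/HIGHT) for real inference), such an adapter could be attached to the vicuna base model roughly like this:

```python
# Hypothetical sketch: attach a LoRA checkpoint to the vicuna-v1-3-7b base model.
# Assumes the adapter directory follows the standard PEFT layout; the graph-encoder
# components of HIGHT are not covered here.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "lmsys/vicuna-7b-v1.3"  # assumed base model id
adapter_dir = "vicuna/molcap-llava-hvqvae2-vicuna-v1-3-7b-finetune_lora-50ep"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, adapter_dir)  # loads the LoRA weights
model = model.merge_and_unload()  # optionally merge the adapter into the base weights
```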

## Citation

If you find our model, paper and repo useful, please cite our paper:

```bibtex
@inproceedings{chen2025hierarchical,
  title={Hierarchical Graph Tokenization for Molecule-Language Alignment},
  author={Yongqiang Chen and Quanming Yao and Juzheng Zhang and James Cheng and Yatao Bian},
  booktitle={Forty-second International Conference on Machine Learning},
  year={2025},
  url={https://openreview.net/forum?id=wpbNczwAwV}
}
```