---
license: mit
---

# Model Card for pcqm4mv1_graphormer_base

Graphormer is a graph classification model.

# Model Details

## Model Description

Graphormer is a graph Transformer model pretrained on PCQM4M-LSC; it won 1st place in the KDD Cup 2021 quantum prediction track.

- **Developed by:** Microsoft
- **Model type:** Graphormer
- **License:** MIT

## Model Sources

- **Repository:** https://github.com/microsoft/Graphormer
- **Paper:** https://arxiv.org/abs/2106.05234
- **Documentation:** https://graphormer.readthedocs.io/en/latest/

# Uses

## Direct Use

This model is intended for graph classification and graph representation tasks; the most likely application is molecule modeling. It can be used as is, or fine-tuned on downstream tasks.

# Bias, Risks, and Limitations

Graphormer is resource-intensive on large graphs and may cause out-of-memory (OOM) errors.

## How to Get Started with the Model

See the Graph Classification with Transformers tutorial; a minimal loading sketch also appears after the citation below.

# Citation

**BibTeX:**

```bibtex
@article{DBLP:journals/corr/abs-2106-05234,
  author     = {Chengxuan Ying and Tianle Cai and Shengjie Luo and Shuxin Zheng and Guolin Ke and Di He and Yanming Shen and Tie{-}Yan Liu},
  title      = {Do Transformers Really Perform Bad for Graph Representation?},
  journal    = {CoRR},
  volume     = {abs/2106.05234},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.05234},
  eprinttype = {arXiv},
  eprint     = {2106.05234},
  timestamp  = {Tue, 15 Jun 2021 16:35:15 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-05234.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
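As a companion to the tutorial, here is a minimal sketch of loading this checkpoint through the `transformers` library. It assumes a `transformers` release that ships Graphormer support; the hub id `clefourrier/pcqm4mv1_graphormer_base` and the `num_classes=2` setting are illustrative assumptions for a binary downstream task, not part of this card.

```python
# Minimal sketch, assuming a transformers release that includes Graphormer.
# The hub id and num_classes below are illustrative assumptions.
from transformers import GraphormerForGraphClassification
from transformers.models.graphormer.collating_graphormer import (
    GraphormerDataCollator,
    preprocess_item,
)

# Load the pretrained checkpoint and re-head it for a 2-class downstream task.
model = GraphormerForGraphClassification.from_pretrained(
    "clefourrier/pcqm4mv1_graphormer_base",  # assumed hub id for this card
    num_classes=2,                           # set to your task's label count
    ignore_mismatched_sizes=True,            # re-initialize the classification head
)

# Graph datasets must first be converted into Graphormer's input format,
# then batched with the dedicated collator, e.g.:
#   dataset = dataset.map(preprocess_item, batched=False)
#   collator = GraphormerDataCollator()
```

Passing `ignore_mismatched_sizes=True` re-initializes the classification head when the downstream label count differs from pretraining, which is the usual pattern when fine-tuning this checkpoint on a new task.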