SmartBERT V1 RoBERTa (2022)
Overview
This pre-trained model converts function-level smart contract code into embeddings.
It was trained by Sen Fang in 2022 on over 40,000 smart contracts and is initialized from RoBERTa (FacebookAI/roberta-base).
Please use the newer SmartBERT V2 where possible.
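The snippet below is a minimal sketch of how function-level embeddings might be obtained with the Hugging Face transformers library, assuming the checkpoint is published as web3se/SmartBERT and loads with the standard RoBERTa classes; the Solidity snippet and the mean-pooling step are illustrative choices, not something this model card prescribes.

# Minimal sketch: embed a smart contract function with SmartBERT.
# Assumes the checkpoint loads as "web3se/SmartBERT" via the standard classes.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("web3se/SmartBERT")
model = AutoModel.from_pretrained("web3se/SmartBERT")
model.eval()

# A function-level Solidity snippet to embed (illustrative).
code = "function transfer(address to, uint256 amount) public returns (bool) { ... }"

inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states (one common choice) to get a single
# fixed-size embedding for the whole function.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # e.g. torch.Size([1, 768]) for a roberta-base backbone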
Citations
@article{huang2025smart,
  title={Smart Contract Intent Detection with Pre-trained Programming Language Model},
  author={Huang, Youwei and Li, Jianwen and Fang, Sen and Li, Yao and Yang, Peng and Hu, Bin and Zhang, Tao},
  journal={arXiv preprint arXiv:2508.20086},
  year={2025}
}
Thanks