- Attention Is All You Need (arXiv:1706.03762)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (arXiv:1810.04805)
- RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv:1907.11692)
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (arXiv:1910.01108)
Collections including paper arXiv:2303.17564:

- FinTral: A Family of GPT-4 Level Multimodal Financial Large Language Models (arXiv:2402.10986)
- BloombergGPT: A Large Language Model for Finance (arXiv:2303.17564)
- GPT-InvestAR: Enhancing Stock Investment Strategies through Annual Report Analysis with Large Language Models (arXiv:2309.03079)
- FinVis-GPT: A Multimodal Large Language Model for Financial Chart Analysis (arXiv:2308.01430)