Hiring 💼

Firstname Lastname

DrHouseFan-315

AI & ML interests

None yet

Recent Activity

new activity 6 days ago
TuringsSolutions/Trading-Card-Trainer:Still starting
reacted to marksverdhei's post with 😔 9 days ago
Poll: Will 2026 be the year of subquadratic attention?

The transformer architecture is cursed by its computational complexity. It is why you run out of tokens and have to compact. But some would argue that this is a feature, not a bug, and that it is also why these models are so good. We've been doing a lot of research on trying to make equally good models that are computationally cheaper, but so far none of the approaches have stood the test of time. Or so it seems.

Please vote, don't be shy. Remember that the Dunning-Kruger effect is very real, so the person who knows less about transformers than you is going to vote. We want everyone's opinion, no matter their confidence.

👍 if you think at least one frontier model* will have no O(n^2) attention by the end of 2026
🔥 if you disagree

*Frontier models: models that match or outperform the flagship Claude, Gemini, or ChatGPT at the time on multiple popular benchmarks
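The O(n^2) cost the post refers to comes from the n-by-n score matrix that standard scaled dot-product attention materializes: doubling the sequence length quadruples that matrix. A minimal NumPy sketch (not any specific library's implementation, function names are illustrative):

```python
import numpy as np

def attention(q, k, v):
    """Standard scaled dot-product attention. q, k, v: (n, d) arrays."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)  # (n, n) matrix -- the quadratic term
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v             # (n, d) output

n, d = 1024, 64
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = attention(q, k, v)
# The score matrix holds n*n entries: ~1M at n=1024, ~4M at n=2048.
```

Subquadratic proposals (linear attention, state-space models, sliding windows) all aim to avoid materializing that (n, n) matrix; the poll asks whether any of them will ship in a frontier model.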
reacted to Alexander1337's post with 🔥 10 days ago
host website

Organizations

None yet

DrHouseFan-315's models

None public yet