Can you provide WikiText PPL and C4 PPL separately?
#11 opened 9 months ago by sheropen-2
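
For readers who want to reproduce such numbers themselves, below is a minimal sliding-window perplexity sketch for WikiText-2 and a C4 slice. The repo id, the 2048 context length, the 512 stride, and the 500-document C4 slice are assumptions of mine, not values confirmed by the maintainers or the model card.

```python
import torch
from itertools import islice
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "1bitLLM/bitnet_b1_58-3B"   # hypothetical repo id, substitute the checkpoint you evaluate
tok = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True).eval()

@torch.no_grad()
def perplexity(text, ctx_len=2048, stride=512):
    """Sliding-window perplexity over one long concatenated string."""
    ids = tok(text, return_tensors="pt").input_ids
    nlls, prev_end = [], 0
    for begin in range(0, ids.size(1), stride):
        end = min(begin + ctx_len, ids.size(1))
        trg_len = end - prev_end                 # only score tokens new to this window
        input_ids = ids[:, begin:end]
        labels = input_ids.clone()
        labels[:, :-trg_len] = -100              # mask the overlapping prefix
        nlls.append(model(input_ids, labels=labels).loss * trg_len)
        prev_end = end
        if end == ids.size(1):
            break
    return torch.exp(torch.stack(nlls).sum() / prev_end).item()

wikitext = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
print("WikiText-2 PPL:", perplexity(wikitext))

# C4 is huge, so a fixed slice of the streamed validation split is a common shortcut.
c4_stream = load_dataset("allenai/c4", "en", split="validation", streaming=True)
c4 = " ".join(ex["text"] for ex in islice(c4_stream, 500))
print("C4 (500-doc slice) PPL:", perplexity(c4))
```
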
Can you provide more details on the training?
#10 opened 9 months ago by dequ777 · 1 reply

Any plans to use MQA (multi-query attention) or GQA (grouped-query attention) in the future?
#9 opened 10 months ago by graefics
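
As context for the question, here is a minimal sketch of what grouped-query attention changes relative to standard multi-head attention: several query heads share one key/value head, which shrinks the KV cache. The head counts and shapes below are illustrative, not taken from this repo.

```python
import torch

def grouped_query_attention(q, k, v, n_q_heads=32, n_kv_heads=8):
    """q: (batch, seq, n_q_heads, head_dim); k, v: (batch, seq, n_kv_heads, head_dim).
    MQA is the special case n_kv_heads = 1; standard MHA is n_kv_heads = n_q_heads.
    No causal mask, for brevity."""
    group = n_q_heads // n_kv_heads
    # Repeat each KV head so every group of query heads attends to the same K/V.
    k = k.repeat_interleave(group, dim=2)
    v = v.repeat_interleave(group, dim=2)
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))      # (batch, heads, seq, head_dim)
    scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5
    return (scores.softmax(dim=-1) @ v).transpose(1, 2)   # back to (batch, seq, heads, head_dim)

b, s, d = 2, 16, 64
out = grouped_query_attention(torch.randn(b, s, 32, d), torch.randn(b, s, 8, d), torch.randn(b, s, 8, d))
print(out.shape)  # torch.Size([2, 16, 32, 64])
```
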
Efficient Inference Kernel Support for 1.58bit.
#8 opened 10 months ago by LeiWang1999
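
On the kernel question, the storage side of the idea can be sketched in plain PyTorch: four ternary weights fit in one byte once they are offset to {0, 1, 2}. A real fused kernel (CUDA/Triton) would decode these on the fly inside the matmul; the sketch below only shows the packing and is not code from this repo.

```python
import torch

def pack_ternary(w_ternary):
    """Pack a flat int8 tensor of {-1, 0, +1} values into 2 bits each (4 per byte)."""
    codes = (w_ternary + 1).to(torch.uint8)              # {-1,0,1} -> {0,1,2}
    codes = codes.reshape(-1, 4)
    return codes[:, 0] | (codes[:, 1] << 2) | (codes[:, 2] << 4) | (codes[:, 3] << 6)

def unpack_ternary(packed):
    """Inverse of pack_ternary; returns int8 values in {-1, 0, +1}."""
    codes = torch.stack([(packed >> s) & 0b11 for s in (0, 2, 4, 6)], dim=1)
    return codes.to(torch.int8).reshape(-1) - 1

w = torch.randint(-1, 2, (1024,), dtype=torch.int8)
packed = pack_ternary(w)
assert torch.equal(unpack_ternary(packed), w)
print(w.numel(), "ternary weights ->", packed.numel(), "bytes")  # 1024 -> 256
```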

This code from BitLinear doesn't make sense
#7 opened 10 months ago by qmsoqm · 1 reply
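
Without seeing which lines the thread points at, the part of BitLinear that most often looks wrong at first glance is the straight-through-estimator trick: the quantized value is used in the forward pass while gradients flow to the latent full-precision weight. The sketch below is my paraphrase of that pattern, not the repo's actual code, and it omits the int8 activation quantization that the full BitLinear also performs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def weight_quant(w):
    """Absmean ternarization from the BitNet b1.58 paper: scale, round, clip to {-1,0,1}."""
    scale = 1.0 / w.abs().mean().clamp(min=1e-5)
    return (w * scale).round().clamp(-1, 1) / scale

class BitLinearSketch(nn.Linear):
    def forward(self, x):
        w = self.weight
        # Straight-through estimator: the forward pass sees weight_quant(w), but the
        # .detach() hides the rounding from autograd, so gradients reach w directly.
        w_q = w + (weight_quant(w) - w).detach()
        return F.linear(x, w_q, self.bias)

layer = BitLinearSketch(64, 64, bias=False)
y = layer(torch.randn(2, 64))
y.sum().backward()
print(layer.weight.grad.shape)  # gradients land on the latent full-precision weight
```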

Is it BitNet {-1, 0, 1}?
#6 opened 11 months ago by Remek · 4 replies
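
A quick numeric check of the absmean rule sketched above under #7 makes the {-1, 0, +1} claim concrete; the tensor values here are made up for illustration.

```python
import torch

w = torch.tensor([0.40, -0.05, -0.90, 0.10])
scale = 1.0 / w.abs().mean()          # mean(|w|) = 0.3625, scale ≈ 2.76
q = (w * scale).round().clamp(-1, 1)  # tensor([ 1., -0., -1.,  0.])
print(q)                              # every weight lands in {-1, 0, +1}
print(q / scale)                      # what the matmul effectively sees: ±0.3625 and 0
```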

ValueError: Tokenizer class BitnetTokenizer does not exist or is not currently imported.
#5 opened 11 months ago by RZJournal · 4 replies
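
For readers hitting the same ValueError, the usual suspects are an outdated transformers install or loading without the repo's custom code. Below is a hedged workaround sketch; the repo id is hypothetical, and the assumption that the checkpoint falls back cleanly to a LLaMA-style tokenizer is mine, so prefer whatever the maintainers confirm in the thread.

```python
from transformers import AutoTokenizer

repo = "1bitLLM/bitnet_b1_58-3B"   # hypothetical repo id, substitute your checkpoint

try:
    # Let the repo's own code register the BitnetTokenizer class, if it ships one.
    tok = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
except ValueError:
    # Fallback assumption: the checkpoint reuses a LLaMA/SentencePiece tokenizer,
    # so loading the same files with LlamaTokenizer often works.
    from transformers import LlamaTokenizer
    tok = LlamaTokenizer.from_pretrained(repo)

print(tok("1.58-bit models").input_ids)
```
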
Longer inference time
#4 opened 11 months ago by dittops · 2 replies

Why are these models fp32?
#2 opened 11 months ago by supercharge19 · 5 replies
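
Regarding the fp32 question, the checkpoints appear to store the full-precision latent weights that 1.58-bit quantization-aware training keeps around, with ternarization applied on the fly inside BitLinear; that reading is my inference, not a maintainer statement. A quick way to see what is actually stored (hypothetical repo id again):

```python
import torch
from transformers import AutoModelForCausalLM

repo = "1bitLLM/bitnet_b1_58-3B"   # hypothetical repo id, substitute your checkpoint
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

# Count parameters by stored dtype: fp32 latent weights are what drives the download size.
counts = {}
for p in model.parameters():
    counts[p.dtype] = counts.get(p.dtype, 0) + p.numel()
for dtype, n in counts.items():
    gib = n * torch.empty((), dtype=dtype).element_size() / 2**30
    print(f"{dtype}: {n / 1e9:.2f}B params, ~{gib:.1f} GiB")
```
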
Is there a chat/instruct model in the plans?
#1 opened 11 months ago by MrVodnik · 2 replies