Commit History
5760099  fix toc
5aa5097  Pretrain multipack v2 (#1470)
cae608f  Added pip install ninja to accelerate installation of flash-attn (#1461)
586bd8d  fix pretraining_ on odd datasets (#1463)
86b7d22  Reorganize Docs (#1468)
0b10377  reduce verbosity of the special tokens (#1472)
89134f2  make sure to install causal_conv1d in docker (#1459)
6086be8  qwen2_moe support w multipack (#1455)
4a92a3b  Nightlies fix v4 (#1458) [skip ci]
46a73e3  fix yaml parsing for workflow (#1457) [skip ci]
da3415b  fix how nightly tag is generated (#1456) [skip ci]
8cb127a  configure nightly docker builds (#1454) [skip ci]
05b398a  fix some of the edge cases for Jamba (#1452)
e634118  Support loading datasets saved via save_to_disk (#1432)
02af082  Jamba (#1451)
4155e99  fix layer_replication arg to peft (#1446)
25afd35  support layer replication for peft and fix rslora integration (#1445)
da265dd  fix for accelerate env var for auto bf16, add new base image and expand torch_cuda_arch_list support (#1413)
bcdc9b1  Fix falcon tokenization step (#1441) [skip ci]
c19d060  turn sample_packing on for training (#1438) [skip ci]
601b77b  make sure to capture non-null defaults from config validation (#1415)
ff939d8  fix(dataset): normalize tokenizer config and change hash from tokenizer class to tokenizer path (#1298)
324d59e  docs: update link to docs of advance topic in README.md (#1437)
f1ebaa0  chore(config): refactor old mistral config (#1435)
34ba634  Fix ORPO multi gpu (#1433)
4e69aa4  Update docs.yml
629450c  Bootstrap Hosted Axolotl Docs w/Quarto (#1429)
2a1589f  strip out hacky qlora-fsdp workarounds now that qlora-fsdp fixes are upstreamed (#1428)
7d55607  HF / FEAT: Optimize HF tags (#1425) [skip ci]
7803f09  fixes for dpo and orpo template loading (#1424)
dd449c5  support galore once upstreamed into transformers (#1409)
40a88e8  Feat: Add sharegpt multirole (#1137)
b1e3e1b  fix(config): passing gradient_checkpoint_kwargs (#1412)
2ea70eb  ORPO (#1419)
e8c8ea6  Update README.md (#1418)