Commit History
Merge pull request #187 from OpenAccess-AI-Collective/strip-peft-device-map (93dacba)
Merge pull request #177 from NanoCode012/fix/landmark-patch (8002ffb)
Merge pull request #192 from OpenAccess-AI-Collective/sharegpt-custom-prompt (74ef5cc)
Merge branch 'main' into strip-peft-device-map (5e616d9)
Merge pull request #193 from OpenAccess-AI-Collective/config-fixes-20230612 (94f310c)
Merge pull request #159 from AngainorDev/patch-1 (8e568bb)
Merge pull request #194 from NanoCode012/fix/config-path (e21dab4)
Fix config path after config moved (52cde69)
config fixes (9a58e99)
add typehints (c7dee56)
add new sharegpt, refactor prompt so it can be customized later, add exception if no data is processed (aac4b76)
Merge pull request #191 from OpenAccess-AI-Collective/NanoCode012-patch-1 (f31a338)
Add save_steps and eval_steps to Readme (4cd1dee)
Merge pull request #190 from OpenAccess-AI-Collective/fixes-20230711-v2 (9ac16ed)
forgot to add this file (6b3f509)
gptq lora llama is obviously good (336aa3f)
update openllama and clean up paths (d0d7eaa)
fix table formatting (a6ebf57)
more matrix updates (280832c)
update the support matrix (a43bae9)
more pruning (effbbf6)
more config pruning and migrating (c530e4b)
Merge pull request #189 from OpenAccess-AI-Collective/fixes-20230711 (f620706)
get rid of some configs, formalize pythioa lora config (77762a5)
new validation for mpt w grad checkpoints (14668fa)
Fix strict and Lint (b565ecf)
match up gradient checkpointing when using lora w config (fe0b768)
Merge pull request #186 from akj2018/main (e944311)
Update FAQS.md (e3e7b52)
Fix set mem_id for inference and refactor (974dc00)
Set mem cache args on inference (572d114)
Clean up landmark patching (a6190c8)
Fix undefined LlamaForCausalLM and del try except (563b6d8)
peft no longer needs device_map (cd0a6f6)
Update FAQS.md (dd7d16d)
Refactor landmark attention patch (919727b)
Update FAQS.md (5ffefee)
Merge pull request #183 from OpenAccess-AI-Collective/inference-from-stdin (d9f713e)
pass a prompt in from stdin for inference (c4e4f81)
Fix missing cfg. (a808bf9, committed by Angainor Development)
Merge pull request #182 from OpenAccess-AI-Collective/fix-llama-ref (0124825)
fix for local variable 'LlamaForCausalLM' referenced before assignment (14163c1)
Merge pull request #181 from OpenAccess-AI-Collective/xpos-rope (41e4f6c)
Merge branch 'main' into patch-1 (79e2a6f, committed by Angainor Development)
Remove explicit definition of cfg.inference (c250898, committed by Angainor Development)