src/axolotl/utils/lora_embeddings.py
"""
helpers for lora embeddings
"""
def get_linear_embedding_layers(model_type: str) -> list[str]:
"""
returns the linear embedding layers needed for loras, dependent on the model arch
"""
if model_type == "phi-msft":
return ["embd", "lm_head.linear"]
return ["lm_head", "embed_tokens"]