mllm-dev committed (verified) · Commit ca0ea92 · Parent: 6e6ca62

Upload folder using huggingface_hub
README.md CHANGED
@@ -1,10 +1,10 @@
  ---
  base_model:
- - mllm-dev/gpt2_f_experiment_4_drug_data_new_run
- - mllm-dev/gpt2_f_experiment_0_drug_data_new_run
- - mllm-dev/gpt2_f_experiment_1_drug_data_new_run
  - mllm-dev/gpt2_f_experiment_2_drug_data_new_run
+ - mllm-dev/gpt2_f_experiment_0_drug_data_new_run
+ - mllm-dev/gpt2_f_experiment_4_drug_data_new_run
  - mllm-dev/gpt2_f_experiment_3_drug_data_new_run
+ - mllm-dev/gpt2_f_experiment_1_drug_data_new_run
  library_name: transformers
  tags:
  - mergekit
@@ -23,10 +23,10 @@ This model was merged using the linear [DARE](https://arxiv.org/abs/2311.03099)
  ### Models Merged

  The following models were included in the merge:
- * [mllm-dev/gpt2_f_experiment_4_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_4_drug_data_new_run)
- * [mllm-dev/gpt2_f_experiment_1_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_1_drug_data_new_run)
  * [mllm-dev/gpt2_f_experiment_2_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_2_drug_data_new_run)
+ * [mllm-dev/gpt2_f_experiment_4_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_4_drug_data_new_run)
  * [mllm-dev/gpt2_f_experiment_3_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_3_drug_data_new_run)
+ * [mllm-dev/gpt2_f_experiment_1_drug_data_new_run](https://huggingface.co/mllm-dev/gpt2_f_experiment_1_drug_data_new_run)

  ### Configuration

@@ -51,7 +51,7 @@ slices:
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_2_drug_data_new_run
    parameters:
-     weight: 0.4
+     weight: 0.35
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_3_drug_data_new_run
    parameters:
@@ -59,5 +59,5 @@ slices:
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_4_drug_data_new_run
    parameters:
-     weight: 0.15
+     weight: 0.2
  ```
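The README now embeds the updated merge configuration (the experiment_2 weight drops from 0.4 to 0.35 and the experiment_4 weight rises from 0.15 to 0.2). As a minimal sketch of how such a merge could be re-run locally, the snippet below invokes the `mergekit-yaml` entry point on the repository's `mergekit_config.yml`; the output directory is a placeholder, and exact CLI flags may differ across mergekit versions.

```python
# Sketch: re-run the linear DARE merge from the committed config.
# Assumes the `mergekit` package (and its `mergekit-yaml` CLI) is installed.
# "./merged-gpt2" is a placeholder output directory, not part of this repo.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "mergekit_config.yml",  # config updated in this commit
        "./merged-gpt2",        # placeholder output path
    ],
    check=True,
)
```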
mergekit_config.yml CHANGED
@@ -16,7 +16,7 @@ slices:
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_2_drug_data_new_run
    parameters:
-     weight: 0.4
+     weight: 0.35
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_3_drug_data_new_run
    parameters:
@@ -24,4 +24,4 @@ slices:
  - layer_range: [0, 12]
    model: mllm-dev/gpt2_f_experiment_4_drug_data_new_run
    parameters:
-     weight: 0.15
+     weight: 0.2
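After a reweighted merge like this, a quick sanity check is to load the resulting GPT-2 checkpoint with transformers and generate a few tokens. This is only an illustrative sketch: the local path is a placeholder for wherever the merged weights were written or downloaded.

```python
# Sketch: load the merged checkpoint and run a short greedy generation.
# "./merged-gpt2" is a placeholder path to the merged model directory.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./merged-gpt2")
model = AutoModelForCausalLM.from_pretrained("./merged-gpt2")

inputs = tokenizer("This medication", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```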
model-00001-of-00001.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c249927d7fcf4cc8528bc39a2df9881dbe5182c9f36045d32cff1a8fb530dcdb
+ oid sha256:851a042313d93551fb5a98fdcf23071e1c60d8d7d9e5859ce532dab1eac535a3
  size 248909944
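The safetensors entry is a Git LFS pointer, so the `oid sha256:` value is the SHA-256 digest of the actual weight file. As a small sketch (the expected digest is taken from this commit; the local path is an assumption), a downloaded copy could be verified like this:

```python
# Sketch: check that a locally downloaded safetensors file matches the
# committed LFS pointer (oid = SHA-256 of the file contents).
import hashlib
from pathlib import Path

EXPECTED = "851a042313d93551fb5a98fdcf23071e1c60d8d7d9e5859ce532dab1eac535a3"
path = Path("model-00001-of-00001.safetensors")  # assumed local download location

h = hashlib.sha256()
with path.open("rb") as fp:
    for chunk in iter(lambda: fp.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

digest = h.hexdigest()
print("match" if digest == EXPECTED else f"mismatch: {digest}")
```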