---
base_model:
- ByteDance-Seed/Seed-Coder-8B-Instruct
---

GSM8K and MMLU accuracy, measured with lm-evaluation-harness (vLLM backend), for the base Seed-Coder-8B-Instruct checkpoint and for the local 80-128, 80-256, and 80-512 checkpoints. Each block below shows the evaluation configuration followed by its results.
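
The configuration lines below are lm-evaluation-harness run headers. As a minimal sketch, the first GSM8K run could be reproduced through the harness's Python API roughly as follows (assuming lm-eval ≥ 0.4 with vLLM installed; the local checkpoint path is the one recorded in the logs and would need to be replaced with your own copy of ByteDance-Seed/Seed-Coder-8B-Instruct):

```python
import lm_eval

# Sketch of the first GSM8K run below: 5-shot, limit 250, bfloat16, vLLM backend.
# The path and settings mirror the run header; adjust them to your environment.
results = lm_eval.simple_evaluate(
    model="vllm",
    model_args=(
        "pretrained=/root/autodl-tmp/Seed-Coder-8B-Instruct,"
        "add_bos_token=true,max_model_len=3096,dtype=bfloat16"
    ),
    tasks=["gsm8k"],
    num_fewshot=5,
    limit=250,
    batch_size="auto",
)
print(results["results"]["gsm8k"])  # exact_match under both extraction filters
```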
vllm (pretrained=/root/autodl-tmp/Seed-Coder-8B-Instruct,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.576|± |0.0313|
| | |strict-match | 5|exact_match|↑ |0.576|± |0.0313|

vllm (pretrained=/root/autodl-tmp/Seed-Coder-8B-Instruct,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.602|± |0.0219|
| | |strict-match | 5|exact_match|↑ |0.598|± |0.0219|

vllm (pretrained=/root/autodl-tmp/Seed-Coder-8B-Instruct,add_bos_token=true,max_model_len=3048,dtype=bfloat16), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: auto

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.4386|± |0.0167|
| - humanities | 2|none | |acc |↑ |0.4000|± |0.0343|
| - other | 2|none | |acc |↑ |0.4872|± |0.0356|
| - social sciences| 2|none | |acc |↑ |0.4389|± |0.0364|
| - stem | 2|none | |acc |↑ |0.4316|± |0.0288|
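
All runs in this card go through vLLM. A minimal loading sketch under the same settings (dtype and max_model_len mirror the run headers above; the path and prompt are illustrative only):

```python
from vllm import LLM, SamplingParams

# Illustrative: load one of the evaluated checkpoints with the same dtype and
# context length used in the runs above.
llm = LLM(
    model="/root/autodl-tmp/Seed-Coder-8B-Instruct",  # or a local 80-128 / 80-256 / 80-512 path
    dtype="bfloat16",
    max_model_len=3096,
)
params = SamplingParams(temperature=0.0, max_tokens=256)
outputs = llm.generate(["Write a Python function that checks whether a string is a palindrome."], params)
print(outputs[0].outputs[0].text)
```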
vllm (pretrained=/root/autodl-tmp/80-128,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ | 0.56|± |0.0315|
| | |strict-match | 5|exact_match|↑ | 0.56|± |0.0315|

vllm (pretrained=/root/autodl-tmp/80-128,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.590|± |0.0220|
| | |strict-match | 5|exact_match|↑ |0.584|± |0.0221|

vllm (pretrained=/root/autodl-tmp/80-128,add_bos_token=true,max_model_len=3048,dtype=bfloat16), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: auto

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.4339|± |0.0166|
| - humanities | 2|none | |acc |↑ |0.3949|± |0.0338|
| - other | 2|none | |acc |↑ |0.4769|± |0.0355|
| - social sciences| 2|none | |acc |↑ |0.4333|± |0.0361|
| - stem | 2|none | |acc |↑ |0.4316|± |0.0290|

vllm (pretrained=/root/autodl-tmp/80-256,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.584|± |0.0312|
| | |strict-match | 5|exact_match|↑ |0.584|± |0.0312|

vllm (pretrained=/root/autodl-tmp/80-256,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.590|± | 0.022|
| | |strict-match | 5|exact_match|↑ |0.586|± | 0.022|

vllm (pretrained=/root/autodl-tmp/80-256,add_bos_token=true,max_model_len=3048,dtype=bfloat16), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: auto

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.4246|± |0.0165|
| - humanities | 2|none | |acc |↑ |0.3795|± |0.0336|
| - other | 2|none | |acc |↑ |0.4872|± |0.0356|
| - social sciences| 2|none | |acc |↑ |0.4333|± |0.0360|
| - stem | 2|none | |acc |↑ |0.4070|± |0.0282|

vllm (pretrained=/root/autodl-tmp/80-512,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.604|± | 0.031|
| | |strict-match | 5|exact_match|↑ |0.600|± | 0.031|

vllm (pretrained=/root/autodl-tmp/80-512,add_bos_token=true,max_model_len=3096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.594|± | 0.022|
| | |strict-match | 5|exact_match|↑ |0.586|± | 0.022|

vllm (pretrained=/root/autodl-tmp/80-512,add_bos_token=true,max_model_len=3048,dtype=bfloat16), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: auto

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.4316|± |0.0166|
| - humanities | 2|none | |acc |↑ |0.4000|± |0.0341|
| - other | 2|none | |acc |↑ |0.4821|± |0.0355|
| - social sciences| 2|none | |acc |↑ |0.4278|± |0.0356|
| - stem | 2|none | |acc |↑ |0.4211|± |0.0289|