smcleish committed · Commit 764b073 · verified · 1 Parent(s): 34cade8

Update README.md

Files changed (1)
1. README.md +78 -39
README.md CHANGED
@@ -1,39 +1,78 @@
- ---
- dataset_info:
-   features:
-   - name: depth
-     dtype: int64
-   - name: width
-     dtype: int64
-   - name: tokens
-     dtype: int64
-   - name: FLOPs_per_token
-     dtype: float64
-   - name: FLOPs
-     dtype: float64
-   - name: params
-     dtype: float64
-   - name: params_with_embeds
-     dtype: float64
-   - name: FLOPs_6N
-     dtype: float64
-   - name: params_pred_loss
-     dtype: float64
-   - name: wd_ratio
-     dtype: float64
-   - name: wd_pred_loss
-     dtype: float64
-   - name: bucket
-     dtype: string
-   splits:
-   - name: train
-     num_bytes: 1772
-     num_examples: 13
-   download_size: 6825
-   dataset_size: 1772
- configs:
- - config_name: default
-   data_files:
-   - split: train
-     path: mins_1e-3/mins_lr_ablation_hot_width_depth_params_relaxed_params/train-*
- ---
+ ---
+ dataset_info:
+   features:
+   - name: depth
+     dtype: int64
+   - name: width
+     dtype: int64
+   - name: tokens
+     dtype: int64
+   - name: FLOPs_per_token
+     dtype: float64
+   - name: FLOPs
+     dtype: float64
+   - name: params
+     dtype: float64
+   - name: params_with_embeds
+     dtype: float64
+   - name: FLOPs_6N
+     dtype: float64
+   - name: params_pred_loss
+     dtype: float64
+   - name: wd_ratio
+     dtype: float64
+   - name: wd_pred_loss
+     dtype: float64
+   - name: bucket
+     dtype: string
+   splits:
+   - name: train
+     num_bytes: 1772
+     num_examples: 13
+   download_size: 6825
+   dataset_size: 1772
+ configs:
+ - config_name: default
+   data_files:
+   - split: train
+     path: mins_1e-3/mins_lr_ablation_hot_width_depth_params_relaxed_params/train-*
+ license: mit
+ ---
+ This dataset is my cache for the [scaling-laws](https://github.com/mcleish7/gemstone-scaling-laws) code for the [gemstone models](https://huggingface.co/collections/tomg-group-umd/gemstone-models-679408ee3f19f1d4d00e8b10).
+
+ The approach 3 data cache, with the minima for `delta=1e-4`, is in `data_cache`; the minima for `delta=1e-3` are in `mins_1e-3`.
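If you only need a few of these tables, it can help to see which parquet shards actually exist under `data_cache` and `mins_1e-3` before downloading anything. A minimal sketch, assuming the repo id `smcleish/scaling-laws-cache` used in the upload script below:

```
from huggingface_hub import list_repo_files

# List every parquet shard in the dataset repo so you can pick out
# the specific cache directories you actually need.
# (Pass token=... if the repo is still private.)
files = list_repo_files("smcleish/scaling-laws-cache", repo_type="dataset")
for f in sorted(files):
    if f.endswith(".parquet"):
        print(f)
```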
+
+ This is the code I used to upload it:
+ ```
+ import pandas as pd
+ from datasets import Dataset
+ import os
+ import gc
+
+
+ def get_data_dict(path):
+     contents = os.listdir(path)
+
+     ds_store = {}
+     for file in contents:
+         gc.collect()
+         # Cast interval-typed columns to strings before converting to a Dataset.
+         df = pd.read_parquet(f"{path}{file}")
+         for col in df.columns:
+             if pd.api.types.is_interval_dtype(df[col]):
+                 df[col] = df[col].astype(str)
+
+         # Push each table to its own directory in the dataset repo.
+         hf_dataset = Dataset.from_pandas(df)
+         ds_store[file.replace(".parquet", "")] = hf_dataset
+         hf_dataset.push_to_hub(
+             "smcleish/scaling-laws-cache",
+             private=True,
+             data_dir=path.split("/")[1] + "/" + file.replace(".parquet", ""),
+         )
+         gc.collect()
+     return ds_store
+
+
+ ds_1 = get_data_dict("plotters/data_cache/")
+ ds_2 = get_data_dict("plotters/mins_1e-3/")
+ ```
+ To download it, do the opposite of this. The cache is very large, so you may want to target only the specific files you need; the approach 3 code expects pandas `.parquet` files.
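For example, a minimal sketch of pulling a single cached table back down and saving it as the kind of pandas `.parquet` file the approach 3 code reads. It reuses the `mins_1e-3` path from the YAML header above; swap in whichever directory you need:

```
from datasets import load_dataset

# Download one cached table rather than the whole (large) repo.
ds = load_dataset(
    "smcleish/scaling-laws-cache",
    data_files="mins_1e-3/mins_lr_ablation_hot_width_depth_params_relaxed_params/train-*",
    split="train",
)

# Convert back to pandas and write a local parquet file.
# Note: interval columns were cast to strings on upload, so they come back as strings.
df = ds.to_pandas()
df.to_parquet("mins_lr_ablation_hot_width_depth_params_relaxed_params.parquet")
```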
+ Please open a discussion with any questions, as this is currently very experimental.