Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed
Error code: DatasetGenerationError
Exception: ArrowInvalid
Message: JSON parse error: Column() changed from object to array in row 0
Traceback:

    Traceback (most recent call last):
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 160, in _generate_tables
        df = pandas_read_json(f)
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 38, in pandas_read_json
        return pd.read_json(path_or_buf, **kwargs)
      File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 815, in read_json
        return json_reader.read()
      File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1025, in read
        obj = self._get_object_parser(self.data)
      File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1051, in _get_object_parser
        obj = FrameParser(json, **kwargs).parse()
      File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1187, in parse
        self._parse()
      File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1403, in _parse
        ujson_loads(json, precise_float=self.precise_float), dtype=None
    ValueError: Trailing data

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1855, in _prepare_split_single
        for _, table in generator:
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 163, in _generate_tables
        raise e
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 137, in _generate_tables
        pa_table = paj.read_json(
      File "pyarrow/_json.pyx", line 308, in pyarrow._json.read_json
      File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
      File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
    pyarrow.lib.ArrowInvalid: JSON parse error: Column() changed from object to array in row 0

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1436, in compute_config_parquet_and_info_response
        parquet_operations = convert_to_parquet(builder)
      File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1053, in convert_to_parquet
        builder.download_and_prepare(
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 925, in download_and_prepare
        self._download_and_prepare(
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
        self._prepare_split(split_generator, **prepare_split_kwargs)
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1742, in _prepare_split
        for job_id, done, content in self._prepare_split_single(
      File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1898, in _prepare_split_single
        raise DatasetGenerationError("An error occurred while generating the dataset") from e
    datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
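The traceback above reports two distinct problems: pandas' `ValueError: Trailing data` (which `pd.read_json` raises when a JSON Lines file is parsed as a single JSON document), and pyarrow's `Column() changed from object to array`, which means the same field holds a JSON object in some rows and a JSON array in others, so Arrow cannot infer one column type. A minimal sketch of the second failure mode, using hypothetical rows rather than this dataset's real contents:

```python
import json

# Hypothetical JSONL rows illustrating the schema inconsistency: Arrow
# infers a column's type from early rows, so a field that is an object
# in one row and an array in another cannot become one typed column.
rows = [
    '{"metrics": {"qkv": 2.57}}',   # "metrics" is a JSON object here...
    '{"metrics": [2.57, 0.003]}',   # ...and a JSON array here
]

types = [type(json.loads(r)["metrics"]).__name__ for r in rows]
print(types)  # -> ['dict', 'list']: inconsistent across rows
```

Making the field's shape uniform across every row (always an object, or always an array) would let Arrow infer a single schema.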
| 0 (string) | 1 (float64) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 235.090561 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003136 |
| megatron.core.transformer.attention.forward.core_attention | 836.380249 |
| megatron.core.transformer.attention.forward.linear_proj | 1.484064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,074.752808 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,103.562866 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 8.837568 |
| megatron.core.transformer.mlp.forward.activation | 469.043427 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.065408 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 485.610168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452608 |
| megatron.core.transformer.attention.forward.qkv | 2.578528 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.0032 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 6.068768 |
| megatron.core.transformer.attention.forward.linear_proj | 1.432608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.107264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.725888 |
| megatron.core.transformer.mlp.forward.activation | 0.6752 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.672992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.08784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.460928 |
| megatron.core.transformer.attention.forward.qkv | 2.585184 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 6.164992 |
| megatron.core.transformer.attention.forward.linear_proj | 1.437152 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.214112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.460704 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.723744 |
| megatron.core.transformer.mlp.forward.activation | 0.675328 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.692224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.105696 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.460992 |
| megatron.core.transformer.attention.forward.qkv | 2.593152 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003136 |
| megatron.core.transformer.attention.forward.core_attention | 6.173216 |
| megatron.core.transformer.attention.forward.linear_proj | 1.43792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.231136 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.46224 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.733536 |
| megatron.core.transformer.mlp.forward.activation | 0.673728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.688384 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.109696 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.473184 |
| megatron.core.transformer.attention.forward.qkv | 2.618784 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 6.668256 |
| megatron.core.transformer.attention.forward.linear_proj | 1.566112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.8792 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.556992 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.146048 |
| megatron.core.transformer.mlp.forward.activation | 0.800864 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.009824 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.969952 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.556416 |
| megatron.core.transformer.attention.forward.qkv | 2.75232 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 7.050624 |
| megatron.core.transformer.attention.forward.linear_proj | 1.555424 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.384224 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.552256 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.082848 |
| megatron.core.transformer.mlp.forward.activation | 0.794048 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.820096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.709856 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.53696 |
| megatron.core.transformer.attention.forward.qkv | 2.678496 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 6.881536 |
| megatron.core.transformer.attention.forward.linear_proj | 1.504448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.089344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.536864 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.90688 |
| megatron.core.transformer.mlp.forward.activation | 0.77472 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.771968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.466016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.536768 |
| megatron.core.transformer.attention.forward.qkv | 2.654976 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003168 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
| megatron.core.transformer.attention.forward.core_attention | 6.728128 |
| megatron.core.transformer.attention.forward.linear_proj | 1.505856 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.91408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.55184 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.082816 |
| megatron.core.transformer.mlp.forward.activation | 0.79424 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.948224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.837376 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.551712 |
| megatron.core.transformer.attention.forward.qkv | 2.751488 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 7.142624 |
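The preview repeats the same per-module measurement cycle (qkv, core_attention, linear_proj, the MLP blocks, and so on), which suggests one row per call per iteration. Since the viewer cannot render the file, a sketch of aggregating such rows by hand; the JSON Lines layout and the `"0"`/`"1"` keys are assumptions taken from the preview's column names, not confirmed by the dataset:

```python
import json
from collections import defaultdict

# Hypothetical JSONL rows mirroring the preview's two columns:
# "0" = module path (string), "1" = a timing value (float64).
lines = [
    '{"0": "megatron.core.transformer.attention.forward.qkv", "1": 2.578528}',
    '{"0": "megatron.core.transformer.attention.forward.qkv", "1": 2.585184}',
    '{"0": "megatron.core.transformer.mlp.forward.activation", "1": 0.6752}',
]

# Group repeated measurements by module path.
timings = defaultdict(list)
for line in lines:
    row = json.loads(line)
    timings[row["0"]].append(row["1"])

# Average each module's measurements across iterations.
means = {name: sum(vals) / len(vals) for name, vals in timings.items()}
for name, mean in sorted(means.items()):
    print(f"{name}: {mean:.6f} (n={len(timings[name])})")
```

Reading the file line by line with `json.loads` also sidesteps the whole-file parse that produced the `Trailing data` error above.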
End of preview.
No dataset card yet
Downloads last month: 178