Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed
Error code: DatasetGenerationError
Exception: ArrowInvalid
Message: JSON parse error: Column() changed from object to array in row 0
Traceback:
Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 160, in _generate_tables
    df = pandas_read_json(f)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 38, in pandas_read_json
    return pd.read_json(path_or_buf, **kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 815, in read_json
    return json_reader.read()
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1025, in read
    obj = self._get_object_parser(self.data)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1051, in _get_object_parser
    obj = FrameParser(json, **kwargs).parse()
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1187, in parse
    self._parse()
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pandas/io/json/_json.py", line 1403, in _parse
    ujson_loads(json, precise_float=self.precise_float), dtype=None
ValueError: Trailing data

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1855, in _prepare_split_single
    for _, table in generator:
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 163, in _generate_tables
    raise e
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/json/json.py", line 137, in _generate_tables
    pa_table = paj.read_json(
  File "pyarrow/_json.pyx", line 308, in pyarrow._json.read_json
  File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: JSON parse error: Column() changed from object to array in row 0

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1436, in compute_config_parquet_and_info_response
    parquet_operations = convert_to_parquet(builder)
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1053, in convert_to_parquet
    builder.download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 925, in download_and_prepare
    self._download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1742, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1898, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
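The traceback shows two failed attempts to infer a single columnar schema: pyarrow rejects a field whose value is an object in some rows and an array in others, and the pandas fallback rejects the file because it contains more than one top-level JSON value ("Trailing data"), which suggests a JSON Lines layout. If you only need the raw records, a workaround is to parse the file line by line yourself. The sketch below is a minimal example under those assumptions; `timings.jsonl` is a placeholder name, not the actual file in this repository.

```python
import json
from pathlib import Path

# Placeholder path: substitute the actual JSON file from this dataset repo.
DATA_FILE = Path("timings.jsonl")

records = []
with DATA_FILE.open(encoding="utf-8") as f:
    for line_no, line in enumerate(f, start=1):
        line = line.strip()
        if not line:
            continue  # skip blank lines
        try:
            # json.loads tolerates per-line structural differences that
            # break pyarrow's columnar schema inference.
            records.append(json.loads(line))
        except json.JSONDecodeError as err:
            print(f"Skipping malformed line {line_no}: {err}")

print(f"Loaded {len(records)} records")
```

If the file turns out to be one large JSON document rather than JSON Lines, a single `json.load` on the whole file sidesteps the schema-inference step in the same way.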
| 0 (string) | 1 (float64) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 197.073761 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.1096 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.087072 |
| megatron.core.transformer.attention.forward.core_attention | 860.149475 |
| megatron.core.transformer.attention.forward.linear_proj | 1.501952 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,060.835693 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1,103.405151 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 15.728672 |
| megatron.core.transformer.mlp.forward.activation | 474.251373 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 10.712512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 502.245209 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.45184 |
| megatron.core.transformer.attention.forward.qkv | 7.838624 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.07984 |
| megatron.core.transformer.attention.forward.core_attention | 16.827616 |
| megatron.core.transformer.attention.forward.linear_proj | 1.4552 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 26.358528 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452288 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.807456 |
| megatron.core.transformer.mlp.forward.activation | 0.662016 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.723648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.206848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451584 |
| megatron.core.transformer.attention.forward.qkv | 2.606688 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 6.24656 |
| megatron.core.transformer.attention.forward.linear_proj | 1.493664 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.373056 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.522976 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.837856 |
| megatron.core.transformer.mlp.forward.activation | 0.751808 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.739264 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.342464 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.519968 |
| megatron.core.transformer.attention.forward.qkv | 2.660864 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 6.968032 |
| megatron.core.transformer.attention.forward.linear_proj | 1.538496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.192064 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.543008 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.032384 |
| megatron.core.transformer.mlp.forward.activation | 0.78288 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.911232 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.74048 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.557792 |
| megatron.core.transformer.attention.forward.qkv | 2.783616 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 7.114752 |
| megatron.core.transformer.attention.forward.linear_proj | 1.567936 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.491584 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.557856 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.152032 |
| megatron.core.transformer.mlp.forward.activation | 0.804224 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.015296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.984736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.55696 |
| megatron.core.transformer.attention.forward.qkv | 2.795712 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 7.113344 |
| megatron.core.transformer.attention.forward.linear_proj | 1.570464 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.504032 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.556896 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.144864 |
| megatron.core.transformer.mlp.forward.activation | 0.801184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.009728 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.967648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.557248 |
| megatron.core.transformer.attention.forward.qkv | 2.777248 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 7.061664 |
| megatron.core.transformer.attention.forward.linear_proj | 1.539584 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.4024 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.54208 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.023904 |
| megatron.core.transformer.mlp.forward.activation | 0.781344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.890464 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.707456 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.541952 |
| megatron.core.transformer.attention.forward.qkv | 2.729856 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 7.215072 |
| megatron.core.transformer.attention.forward.linear_proj | 1.605056 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.573824 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.567872 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.269952 |
| megatron.core.transformer.mlp.forward.activation | 0.815392 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.065568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 13.162688 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.556928 |
| megatron.core.transformer.attention.forward.qkv | 2.782912 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00272 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 7.097888 |
End of preview.
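Each preview row pairs a Megatron-Core timer name (column 0) with an elapsed-time measurement (column 1); the preview does not state the unit, and milliseconds is only an assumption here. Given records parsed as (name, value) pairs, a minimal sketch for aggregating per-timer totals and means might look like this:

```python
from collections import defaultdict

# A few (timer_name, elapsed) pairs copied from the preview above; the
# millisecond unit is an assumption, not something the dataset declares.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 197.073761),
    ("megatron.core.transformer.attention.forward.core_attention", 860.149475),
    ("megatron.core.transformer.attention.forward.qkv", 7.838624),
    ("megatron.core.transformer.attention.forward.core_attention", 16.827616),
]

totals = defaultdict(float)
counts = defaultdict(int)
for name, elapsed in rows:
    totals[name] += elapsed
    counts[name] += 1

# Report the most expensive timers first.
for name in sorted(totals, key=totals.get, reverse=True):
    mean = totals[name] / counts[name]
    print(f"{name}: total={totals[name]:.3f}, mean={mean:.3f}, calls={counts[name]}")
```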
No dataset card yet
Downloads last month: 159