Per-module forward-pass measurements recorded for Megatron-core transformer-layer components. The original export describes two columns: a string column holding the module path (12 distinct values) and a float64 `values` column ranging from 0 to roughly 26.5k (units not stated in the export).

| Module | Value |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 1.05952 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,001.77063 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 361.209351 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 4.451392 |
| megatron.core.transformer.mlp.forward.activation | 199.325821 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.696448 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 207.356537 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.234368 |
| megatron.core.transformer.attention.forward.qkv | 0.701088 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 19.307585 |
| megatron.core.transformer.attention.forward.linear_proj | 0.825376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 20.859648 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.231328 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.556608 |
| megatron.core.transformer.mlp.forward.activation | 0.169984 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.903744 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.64256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.231456 |
| megatron.core.transformer.attention.forward.qkv | 198.480545 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.12144 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.108544 |
| megatron.core.transformer.attention.forward.core_attention | 5,362.491699 |
| megatron.core.transformer.attention.forward.linear_proj | 8.091328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,571.005859 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 204.137054 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 8.612896 |
| megatron.core.transformer.mlp.forward.activation | 180.249573 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.731168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 191.787872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.12272 |
| megatron.core.transformer.attention.forward.qkv | 0.608 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.0752 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.085024 |
| megatron.core.transformer.attention.forward.core_attention | 11.303648 |
| megatron.core.transformer.attention.forward.linear_proj | 0.433984 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 12.786912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.123136 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.772608 |
| megatron.core.transformer.mlp.forward.activation | 0.087488 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.950464 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.82208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.121344 |
| megatron.core.transformer.attention.forward.qkv | 167.737885 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.13808 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08512 |
| megatron.core.transformer.attention.forward.core_attention | 4,867.79834 |
| megatron.core.transformer.attention.forward.linear_proj | 3.582272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,040.692871 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 365.269836 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.786272 |
| megatron.core.transformer.mlp.forward.activation | 607.99884 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.489312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 611.920288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.808512 |
| megatron.core.transformer.attention.forward.qkv | 0.681504 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11568 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.109088 |
| megatron.core.transformer.attention.forward.core_attention | 1,806.52356 |
| megatron.core.transformer.attention.forward.linear_proj | 0.23536 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,808.196045 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06464 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.386848 |
| megatron.core.transformer.mlp.forward.activation | 0.047648 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.494016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.940096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.064736 |
| megatron.core.transformer.attention.forward.qkv | 977.254028 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.146816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.098304 |
| megatron.core.transformer.attention.forward.core_attention | 4,669.521973 |
| megatron.core.transformer.attention.forward.linear_proj | 3.76528 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,654.317383 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 443.520386 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.075584 |
| megatron.core.transformer.mlp.forward.activation | 237.695068 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.688064 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 241.513947 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.818528 |
| megatron.core.transformer.attention.forward.qkv | 1.01952 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.284736 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.11552 |
| megatron.core.transformer.attention.forward.core_attention | 2,391.431885 |
| megatron.core.transformer.attention.forward.linear_proj | 0.126848 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,393.420898 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.037792 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.197248 |
| megatron.core.transformer.mlp.forward.activation | 0.027424 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.252896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.490112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037408 |
| megatron.core.transformer.attention.forward.qkv | 201.529694 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11344 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.087232 |
| megatron.core.transformer.attention.forward.core_attention | 1,884.968994 |
| megatron.core.transformer.attention.forward.linear_proj | 1.143936 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,088.739502 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 289.283569 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 7.671904 |
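Each of the 12 module paths appears repeatedly above, once per recorded sample. If the table is exported as a plain two-column file, per-module totals and averages can be recovered with a short pandas aggregation. The sketch below is illustrative only, not part of the original pipeline: the file name `module_timings.csv` and the column names `module` / `value` are assumptions, and it expects plain numeric values without thousands separators.

```python
import pandas as pd

# Load the per-module measurements: one row per (module path, value) pair.
# "module_timings.csv" and the column names are assumed, not from the source.
df = pd.read_csv("module_timings.csv", names=["module", "value"])

# Aggregate over the repeated occurrences of each of the 12 module paths.
summary = (
    df.groupby("module")["value"]
      .agg(["count", "mean", "max", "sum"])
      .sort_values("sum", ascending=False)
)

print(summary)
```

On the values shown here, such an aggregation would rank core_attention and self_attention far above the other modules, consistent with the largest individual entries in the table.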