| 0 (string, 12 classes) | 1 (float64, 0 to 9.97k) |
|---|---|
| megatron.core.transformer.mlp.forward.activation | 723.698669 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 334.342438 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1,060.637329 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.124896 |
| megatron.core.transformer.attention.forward.qkv | 0.204192 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 2.692672 |
| megatron.core.transformer.attention.forward.linear_proj | 3.492096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.412704 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121888 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.403232 |
| megatron.core.transformer.mlp.forward.activation | 0.047744 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 12.934752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 13.397536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.121408 |
| megatron.core.transformer.attention.forward.qkv | 236.917953 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.111616 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.10448 |
| megatron.core.transformer.attention.forward.core_attention | 7,554.841309 |
| megatron.core.transformer.attention.forward.linear_proj | 520.17218 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8,313.357422 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 343.596069 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.307712 |
| megatron.core.transformer.mlp.forward.activation | 351.280487 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 154.350494 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 508.503723 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.457536 |
| megatron.core.transformer.attention.forward.qkv | 0.779776 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
| megatron.core.transformer.attention.forward.core_attention | 19.948416 |
| megatron.core.transformer.attention.forward.linear_proj | 2.295584 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 23.047615 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.454752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.655424 |
| megatron.core.transformer.mlp.forward.activation | 0.169376 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 4.604 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 6.440576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.455232 |
| megatron.core.transformer.attention.forward.qkv | 295.874176 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 7,889.149414 |
| megatron.core.transformer.attention.forward.linear_proj | 13.5712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8,204.09668 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 467.064117 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.845184 |
| megatron.core.transformer.mlp.forward.activation | 260.533386 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 18.527712 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 285.920166 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.791776 |
| megatron.core.transformer.attention.forward.qkv | 3.138016 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 188.628006 |
| megatron.core.transformer.attention.forward.linear_proj | 6.653088 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 198.443741 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.787136 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.4536 |
| megatron.core.transformer.mlp.forward.activation | 0.659264 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 10.476352 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 17.60096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.7968 |
| megatron.core.transformer.attention.forward.qkv | 271.862793 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.115456 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.093184 |
| megatron.core.transformer.attention.forward.core_attention | 5,140.265625 |
| megatron.core.transformer.attention.forward.linear_proj | 4.419168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,418.236328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 318.652832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.624192 |
| megatron.core.transformer.mlp.forward.activation | 293.888489 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.755104 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 298.581329 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.466496 |
| megatron.core.transformer.attention.forward.qkv | 0.514144 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.078752 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.091264 |
| megatron.core.transformer.attention.forward.core_attention | 1,425.202759 |
| megatron.core.transformer.attention.forward.linear_proj | 0.516192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,426.713379 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.022976 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.057056 |
| megatron.core.transformer.mlp.forward.activation | 0.01168 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.115168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.196256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.02304 |
| megatron.core.transformer.attention.forward.qkv | 344.760101 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.116576 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.096384 |
| megatron.core.transformer.attention.forward.core_attention | 5,359.247559 |
| megatron.core.transformer.attention.forward.linear_proj | 5.330976 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,710.852539 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 406.839569 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.398592 |
| megatron.core.transformer.mlp.forward.activation | 583.384949 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.941408 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 592.393555 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06592 |
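Since each of the 12 Megatron-Core scopes recurs many times in column `0`, one way to read this dump is to aggregate column `1` per scope. The sketch below is only an illustration under stated assumptions: it keeps the schema's column names `0` and `1`, treats the float values as per-record measurements whose unit the table does not state, and uses a placeholder file name `timings.csv` for a local export of the rows above.

```python
import pandas as pd

# Assumption: the table above has been exported to a local CSV with the
# same two columns and no header row; "timings.csv" is a placeholder name.
df = pd.read_csv("timings.csv", header=None, names=["0", "1"])

# Total, count, and mean of the recorded values for each Megatron-Core scope,
# sorted by total so the heaviest scopes (e.g. core_attention) appear first.
per_scope = (
    df.groupby("0")["1"]
      .agg(total="sum", count="count", mean="mean")
      .sort_values("total", ascending=False)
)
print(per_scope)
```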