| 0 (stringclasses, 12 values) | 1 (float64, 0–120k) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.555744 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.658785 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.32688 |
| megatron.core.transformer.mlp.forward.activation | 0.037696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.730336 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.10576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176032 |
| megatron.core.transformer.attention.forward.qkv | 0.186848 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002752 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.894432 |
| megatron.core.transformer.attention.forward.linear_proj | 1.189632 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 25.293247 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176096 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325728 |
| megatron.core.transformer.mlp.forward.activation | 0.037248 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.733376 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.107168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176704 |
| megatron.core.transformer.attention.forward.qkv | 0.185408 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 23.736481 |
| megatron.core.transformer.attention.forward.linear_proj | 0.614976 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.559551 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176032 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.3264 |
| megatron.core.transformer.mlp.forward.activation | 0.037184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.729536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.104192 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176544 |
| megatron.core.transformer.attention.forward.qkv | 0.1864 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 23.546721 |
| megatron.core.transformer.attention.forward.linear_proj | 0.572736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.328672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176704 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327232 |
| megatron.core.transformer.mlp.forward.activation | 0.037632 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.725568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.101248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176256 |
| megatron.core.transformer.attention.forward.qkv | 0.184928 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.555073 |
| megatron.core.transformer.attention.forward.linear_proj | 0.562592 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.325249 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176608 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.324352 |
| megatron.core.transformer.mlp.forward.activation | 0.03744 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.737184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.109824 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175936 |
| megatron.core.transformer.attention.forward.qkv | 0.185376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.554111 |
| megatron.core.transformer.attention.forward.linear_proj | 0.573344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.335712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325856 |
| megatron.core.transformer.mlp.forward.activation | 0.037408 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.740032 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.114272 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176672 |
| megatron.core.transformer.attention.forward.qkv | 0.186624 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.542303 |
| megatron.core.transformer.attention.forward.linear_proj | 0.58 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.331455 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176448 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328448 |
| megatron.core.transformer.mlp.forward.activation | 0.037824 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.72672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.104224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176448 |
| megatron.core.transformer.attention.forward.qkv | 0.186432 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.553823 |
| megatron.core.transformer.attention.forward.linear_proj | 0.563136 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.326784 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17728 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.326144 |
| megatron.core.transformer.mlp.forward.activation | 0.03728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.735168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.1096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176448 |
| megatron.core.transformer.attention.forward.qkv | 0.185376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 23.546753 |
| megatron.core.transformer.attention.forward.linear_proj | 0.587008 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.341633 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.1768 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327808 |
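The rows cycle through the same twelve operation names, one group per transformer layer, so an aggregation makes the table easier to read. Below is a minimal sketch, assuming the table has been exported to a two-column CSV; the filename `layer_timings.csv` and the column labels `operation` and `time` are illustrative choices, not part of the dataset, and the time units are whatever the profiler reported.

```python
# Minimal sketch (illustrative only): aggregate the per-operation timings above.
# Assumptions: the table was exported to "layer_timings.csv" with no header row;
# the labels "operation" and "time" are chosen here purely for readability.
import pandas as pd

df = pd.read_csv("layer_timings.csv", header=None, names=["operation", "time"])

# Mean, total, and count per operation, sorted so the most expensive
# operations (e.g. core_attention) appear first.
summary = (
    df.groupby("operation")["time"]
      .agg(["mean", "sum", "count"])
      .sort_values("sum", ascending=False)
)
print(summary)
```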