| name | values |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.187232 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.584641 |
| megatron.core.transformer.attention.forward.linear_proj | 0.604288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.398272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175616 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325376 |
| megatron.core.transformer.mlp.forward.activation | 0.03728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.740096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.114176 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175488 |
| megatron.core.transformer.attention.forward.qkv | 0.186048 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.557344 |
| megatron.core.transformer.attention.forward.linear_proj | 0.538144 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.30336 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17568 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.326496 |
| megatron.core.transformer.mlp.forward.activation | 0.037504 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.729792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.104864 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175136 |
| megatron.core.transformer.attention.forward.qkv | 0.186976 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.530592 |
| megatron.core.transformer.attention.forward.linear_proj | 0.592512 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.331873 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175744 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.326464 |
| megatron.core.transformer.mlp.forward.activation | 0.037056 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.733184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.108 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176 |
| megatron.core.transformer.attention.forward.qkv | 0.186464 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002752 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 23.517729 |
| megatron.core.transformer.attention.forward.linear_proj | 0.555968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.282207 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17552 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325408 |
| megatron.core.transformer.mlp.forward.activation | 0.036896 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.108 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.18192 |
| megatron.core.transformer.attention.forward.qkv | 0.187552 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003136 |
| megatron.core.transformer.attention.forward.core_attention | 23.519712 |
| megatron.core.transformer.attention.forward.linear_proj | 0.528032 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.258305 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176992 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327552 |
| megatron.core.transformer.mlp.forward.activation | 0.03824 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.72944 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.106848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176256 |
| megatron.core.transformer.attention.forward.qkv | 0.18912 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002784 |
| megatron.core.transformer.attention.forward.core_attention | 51554.652344 |
| megatron.core.transformer.attention.forward.linear_proj | 23.022911 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 51577.886719 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.323136 |
| megatron.core.transformer.mlp.forward.activation | 0.036704 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73136 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.102304 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176224 |
| megatron.core.transformer.attention.forward.qkv | 0.185184 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.884192 |
| megatron.core.transformer.attention.forward.linear_proj | 2.107488 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 26.19952 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175936 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.324224 |
| megatron.core.transformer.mlp.forward.activation | 0.037344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73104 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.104192 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176768 |
| megatron.core.transformer.attention.forward.qkv | 0.183808 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.394239 |
| megatron.core.transformer.attention.forward.linear_proj | 0.562496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.163328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17696 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325376 |
| megatron.core.transformer.mlp.forward.activation | 0.037312 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.732928 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.10672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176736 |
| megatron.core.transformer.attention.forward.qkv | 0.184896 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002784 |
| megatron.core.transformer.attention.forward.core_attention | 23.45504 |
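Most values above are stable across iterations; the notable exception is the single `core_attention` spike near 51,554, roughly three orders of magnitude above its typical ~23.5. A minimal sketch of how timings like these could be scanned for such outliers is below; the shortened step names, the subset of values, and the 100.0 threshold are illustrative assumptions, not part of the dataset.

```python
# Hypothetical sketch: flag per-module timings that are far above the norm.
# Step names are abbreviated and values are copied from a few rows of the
# table; the 100.0 cutoff is an arbitrary threshold chosen for illustration.
timings = {
    "attention.forward.qkv": 0.18912,
    "attention.forward.core_attention": 51554.652344,
    "attention.forward.linear_proj": 23.022911,
    "transformer_layer._forward_mlp.mlp": 1.102304,
}

# Keep only the steps whose timing exceeds the cutoff; in the table above,
# only the one-off core_attention spike would cross this bound.
outliers = {name: v for name, v in timings.items() if v > 100.0}
print(outliers)
```

The same filter applied over the full 120k-row dump would isolate the anomalous iterations without assuming anything about which module is slow.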