Columns: `values` (string, 12 unique classes) paired with a float64 timing column; ~120k rows total.

| values | float64 |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.187808 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 23.421408 |
| megatron.core.transformer.attention.forward.linear_proj | 1.029632 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.660801 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.177376 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328352 |
| megatron.core.transformer.mlp.forward.activation | 0.037472 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.737312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.11456 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176928 |
| megatron.core.transformer.attention.forward.qkv | 0.188768 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 23.344929 |
| megatron.core.transformer.attention.forward.linear_proj | 0.567968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.123903 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176704 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.326336 |
| megatron.core.transformer.mlp.forward.activation | 0.037312 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.110432 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176544 |
| megatron.core.transformer.attention.forward.qkv | 0.186304 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002752 |
| megatron.core.transformer.attention.forward.core_attention | 23.337761 |
| megatron.core.transformer.attention.forward.linear_proj | 0.552192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.098175 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17632 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327168 |
| megatron.core.transformer.mlp.forward.activation | 0.037696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.732384 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.10832 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175616 |
| megatron.core.transformer.attention.forward.qkv | 0.186528 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.332001 |
| megatron.core.transformer.attention.forward.linear_proj | 0.580896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.121344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175968 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327904 |
| megatron.core.transformer.mlp.forward.activation | 0.037376 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.725184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.101952 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175872 |
| megatron.core.transformer.attention.forward.qkv | 0.185728 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.328768 |
| megatron.core.transformer.attention.forward.linear_proj | 1.117984 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.654465 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175584 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325984 |
| megatron.core.transformer.mlp.forward.activation | 0.037024 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.74112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.115584 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175808 |
| megatron.core.transformer.attention.forward.qkv | 0.18608 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 23.292608 |
| megatron.core.transformer.attention.forward.linear_proj | 0.57648 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.076992 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175264 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327392 |
| megatron.core.transformer.mlp.forward.activation | 0.03712 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.732896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.1088 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175392 |
| megatron.core.transformer.attention.forward.qkv | 0.185856 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 23.302719 |
| megatron.core.transformer.attention.forward.linear_proj | 0.598208 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.108608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176672 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327936 |
| megatron.core.transformer.mlp.forward.activation | 0.037728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.730752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.10768 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176384 |
| megatron.core.transformer.attention.forward.qkv | 0.188672 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.345345 |
| megatron.core.transformer.attention.forward.linear_proj | 0.527776 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.084448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17648 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.329568 |
| megatron.core.transformer.mlp.forward.activation | 0.037344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.113792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.17744 |
| megatron.core.transformer.attention.forward.qkv | 0.188704 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 50720.035156 |
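Rows like these repeat the same module names once per layer, so a common next step is to aggregate the timings per module. Below is a minimal sketch of that aggregation; the pipe-delimited sample text and the `mean_timings` helper are illustrative, not part of the dataset itself:

```python
from collections import defaultdict

# Illustrative sample rows in the same "name | value" shape as the table above.
sample = """\
megatron.core.transformer.attention.forward.qkv | 0.187808
megatron.core.transformer.attention.forward.core_attention | 23.421408
megatron.core.transformer.attention.forward.qkv | 0.188768
megatron.core.transformer.attention.forward.core_attention | 23.344929
"""

def mean_timings(text):
    """Group timing rows by module name and return the mean timing per module."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for line in text.strip().splitlines():
        name, value = (part.strip() for part in line.split("|"))
        sums[name] += float(value)
        counts[name] += 1
    return {name: sums[name] / counts[name] for name in sums}

means = mean_timings(sample)
print(means["megatron.core.transformer.attention.forward.qkv"])  # 0.188288
```

Grouping by the full dotted module path keeps identically named sub-ops (e.g. `linear_fc1` vs `linear_fc2`) distinct while averaging away per-layer variation.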