| Module (column `0`, stringclasses, 12 values) | Value (column `1`, float64, 0 – 2.17k) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.464864 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 0.591648 |
| megatron.core.transformer.attention.forward.linear_proj | 1.280256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.360224 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.189248 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.847104 |
| megatron.core.transformer.mlp.forward.activation | 0.08704 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.841344 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.787456 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.190464 |
| megatron.core.transformer.attention.forward.qkv | 0.466784 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 0.589632 |
| megatron.core.transformer.attention.forward.linear_proj | 1.275808 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.355776 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.190048 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.849568 |
| megatron.core.transformer.mlp.forward.activation | 0.087232 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.834496 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.783616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.189088 |
| megatron.core.transformer.attention.forward.qkv | 0.466528 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
| megatron.core.transformer.attention.forward.core_attention | 0.591648 |
| megatron.core.transformer.attention.forward.linear_proj | 1.284096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.366208 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.191232 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.845248 |
| megatron.core.transformer.mlp.forward.activation | 0.087584 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.838816 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.783328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.188384 |
| megatron.core.transformer.attention.forward.qkv | 0.462016 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 0.589664 |
| megatron.core.transformer.attention.forward.linear_proj | 1.283232 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.358528 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.190688 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.848256 |
| megatron.core.transformer.mlp.forward.activation | 0.086848 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.833024 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.779968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.190144 |
| megatron.core.transformer.attention.forward.qkv | 0.461696 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 0.59008 |
| megatron.core.transformer.attention.forward.linear_proj | 1.289376 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.365472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.189312 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.846368 |
| megatron.core.transformer.mlp.forward.activation | 0.087776 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.836576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.78288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.19056 |
| megatron.core.transformer.attention.forward.qkv | 0.466912 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 0.587328 |
| megatron.core.transformer.attention.forward.linear_proj | 1.28512 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.36352 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.190688 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.84384 |
| megatron.core.transformer.mlp.forward.activation | 0.087456 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.83872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.781632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.190272 |
| megatron.core.transformer.attention.forward.qkv | 0.464832 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 0.590784 |
| megatron.core.transformer.attention.forward.linear_proj | 1.284448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.363904 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.1904 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.841344 |
| megatron.core.transformer.mlp.forward.activation | 0.08752 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.860512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.80112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.190432 |
| megatron.core.transformer.attention.forward.qkv | 0.465344 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 0.589888 |
| megatron.core.transformer.attention.forward.linear_proj | 1.282464 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2.361536 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.189824 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.84288 |
| megatron.core.transformer.mlp.forward.activation | 0.087616 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.841216 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.78368 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.189312 |
| megatron.core.transformer.attention.forward.qkv | 0.47232 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 5.013888 |
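The rows repeat the same twelve module names once per transformer layer, so an aggregation is usually more useful than the raw dump. Below is a minimal sketch, assuming the two columns are exported as a CSV with headers `module` and `value`; the file name `layer_timings.csv` and the column names are hypothetical, and the unit of the value column is not stated in the table.

```python
# Minimal sketch: summarize per-module timings across repeated per-layer rows.
# Assumption: the table above was exported as "layer_timings.csv" with columns
# "module" and "value" (both names hypothetical; unit unspecified).
import pandas as pd

df = pd.read_csv("layer_timings.csv")

# Mean, total, and row count per module, largest total first.
summary = (
    df.groupby("module")["value"]
      .agg(["mean", "sum", "count"])
      .sort_values("sum", ascending=False)
)
print(summary)
```

Grouping by module keeps the per-layer repetition out of the way and makes the heaviest leaf operations (e.g. `linear_fc2` and `linear_proj` in the rows above) easy to spot next to the aggregate `self_attention` and `mlp` entries.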