| name | values |
| --- | --- |
| megatron.core.transformer.attention.forward.qkv | 0.187136 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.633888 |
| megatron.core.transformer.attention.forward.linear_proj | 0.585408 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.428801 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17632 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.32592 |
| megatron.core.transformer.mlp.forward.activation | 0.036896 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.106944 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175488 |
| megatron.core.transformer.attention.forward.qkv | 0.186464 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.603104 |
| megatron.core.transformer.attention.forward.linear_proj | 0.581632 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.3936 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175392 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.326848 |
| megatron.core.transformer.mlp.forward.activation | 0.037664 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.731104 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.106752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.174944 |
| megatron.core.transformer.attention.forward.qkv | 0.185568 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002784 |
| megatron.core.transformer.attention.forward.core_attention | 23.591137 |
| megatron.core.transformer.attention.forward.linear_proj | 1.062272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.8608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328448 |
| megatron.core.transformer.mlp.forward.activation | 0.03776 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.729952 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.107424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175392 |
| megatron.core.transformer.attention.forward.qkv | 0.184576 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.489056 |
| megatron.core.transformer.attention.forward.linear_proj | 0.565056 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.260769 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.174944 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.32544 |
| megatron.core.transformer.mlp.forward.activation | 0.037792 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.730528 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.104736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175136 |
| megatron.core.transformer.attention.forward.qkv | 0.186144 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.515167 |
| megatron.core.transformer.attention.forward.linear_proj | 0.559616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.283072 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17696 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328992 |
| megatron.core.transformer.mlp.forward.activation | 0.037696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.72848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.106432 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176864 |
| megatron.core.transformer.attention.forward.qkv | 0.186336 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.533472 |
| megatron.core.transformer.attention.forward.linear_proj | 0.531456 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.273184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.177344 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327328 |
| megatron.core.transformer.mlp.forward.activation | 0.037824 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.735648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.112224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.17648 |
| megatron.core.transformer.attention.forward.qkv | 0.187392 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.535936 |
| megatron.core.transformer.attention.forward.linear_proj | 0.5296 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.2752 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.17744 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328192 |
| megatron.core.transformer.mlp.forward.activation | 0.036864 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.7288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.10512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176576 |
| megatron.core.transformer.attention.forward.qkv | 0.186688 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002752 |
| megatron.core.transformer.attention.forward.core_attention | 51303.050781 |
| megatron.core.transformer.attention.forward.linear_proj | 7.948448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 51311.210938 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175936 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.323424 |
| megatron.core.transformer.mlp.forward.activation | 0.03696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.737216 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.108416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175904 |
| megatron.core.transformer.attention.forward.qkv | 0.186656 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.893921 |
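
The rows above look like per-call forward latencies for individual Megatron-Core submodules. As a rough illustration of how such per-module timings can be gathered, below is a minimal sketch that brackets each submodule's forward pass with CUDA events via PyTorch hooks. This is not the instrumentation that produced the table; the helper names (`attach_timers`, `timings_ms`) and the toy model are made up for the example, and the units happen to be milliseconds only because that is what `Event.elapsed_time` returns.

```python
# Minimal sketch: time each submodule's forward pass with CUDA events.
# Hypothetical helper, not Megatron's own timer API.
from collections import defaultdict

import torch
import torch.nn as nn

timings_ms = defaultdict(list)  # module name -> list of forward times (ms)
_events = {}                    # module name -> (start_event, end_event)


def attach_timers(model: nn.Module) -> None:
    """Register pre/post forward hooks that bracket each submodule with CUDA events."""
    for name, module in model.named_modules():
        if name == "":
            continue  # skip the root module itself

        def pre_hook(mod, inputs, name=name):
            start = torch.cuda.Event(enable_timing=True)
            end = torch.cuda.Event(enable_timing=True)
            start.record()
            _events[name] = (start, end)

        def post_hook(mod, inputs, output, name=name):
            start, end = _events[name]
            end.record()
            end.synchronize()  # elapsed_time needs both events completed on the GPU
            timings_ms[name].append(start.elapsed_time(end))

        module.register_forward_pre_hook(pre_hook)
        module.register_forward_hook(post_hook)


if __name__ == "__main__":
    assert torch.cuda.is_available(), "CUDA events require a GPU"
    # Toy stand-in for a transformer block, purely for illustration.
    model = nn.Sequential(nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024)).cuda()
    attach_timers(model)
    x = torch.randn(8, 1024, device="cuda")
    for _ in range(5):
        model(x)
    for name, values in timings_ms.items():
        print(f"{name}: {sum(values) / len(values):.6f} ms (mean over {len(values)} calls)")
```

Collecting one row per submodule per call, as this sketch does, is what makes occasional outliers (such as the much larger core_attention value in the table) visible rather than averaged away.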