The dataset preview has 55.9k rows with two columns: `name` (string, 12 unique values) and `values` (float64).

| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 4.936736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4100.558594 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1268.267578 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 8.466432 |
| megatron.core.transformer.mlp.forward.activation | 613.271729 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 8.484992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 631.306152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 5.255232 |
| megatron.core.transformer.attention.forward.qkv | 7.75696 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.082784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.104576 |
| megatron.core.transformer.attention.forward.core_attention | 30.752512 |
| megatron.core.transformer.attention.forward.linear_proj | 1.645984 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 40.569794 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.454912 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.100384 |
| megatron.core.transformer.mlp.forward.activation | 0.335968 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.828032 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 7.276832 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.455712 |
| megatron.core.transformer.attention.forward.qkv | 228.561279 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.122144 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.121728 |
| megatron.core.transformer.attention.forward.core_attention | 7559.032227 |
| megatron.core.transformer.attention.forward.linear_proj | 98.999168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7888.053711 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 253.020798 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.016224 |
| megatron.core.transformer.mlp.forward.activation | 411.065735 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 77.893089 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 495.89743 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.014592 |
| megatron.core.transformer.attention.forward.qkv | 0.029152 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003264 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003232 |
| megatron.core.transformer.attention.forward.core_attention | 1687.914185 |
| megatron.core.transformer.attention.forward.linear_proj | 0.048 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1688.015747 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.012416 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.037088 |
| megatron.core.transformer.mlp.forward.activation | 0.008256 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.071616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.12912 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.01232 |
| megatron.core.transformer.attention.forward.qkv | 213.179611 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 8656.235352 |
| megatron.core.transformer.attention.forward.linear_proj | 6.59632 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8878.753906 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1625.942627 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 9.467584 |
| megatron.core.transformer.mlp.forward.activation | 639.750549 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 13.471424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 663.076843 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.89552 |
| megatron.core.transformer.attention.forward.qkv | 2.804832 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 2261.858643 |
| megatron.core.transformer.attention.forward.linear_proj | 3.068928 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2267.758057 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.891712 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.06816 |
| megatron.core.transformer.mlp.forward.activation | 0.660608 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 7.492288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.232704 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.892096 |
| megatron.core.transformer.attention.forward.qkv | 226.31424 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11968 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.095296 |
| megatron.core.transformer.attention.forward.core_attention | 6809.733398 |
| megatron.core.transformer.attention.forward.linear_proj | 4.079392 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7041.491699 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1567.616211 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.063328 |
| megatron.core.transformer.mlp.forward.activation | 582.60321 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.097376 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 585.843384 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.453984 |
| megatron.core.transformer.attention.forward.qkv | 0.504544 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.070272 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.083296 |
| megatron.core.transformer.attention.forward.core_attention | 1899.908813 |
| megatron.core.transformer.attention.forward.linear_proj | 0.083008 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1900.917114 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.024 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.103968 |
| megatron.core.transformer.mlp.forward.activation | 0.0168 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.141568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.274624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.023296 |
| megatron.core.transformer.attention.forward.qkv | 261.141083 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 8608.158203 |
| megatron.core.transformer.attention.forward.linear_proj | 15.364288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8886.300781 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1749.772827 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 24.398272 |
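The rows cycle through a fixed set of 12 Megatron-Core timer names, each paired with a float measurement. A minimal sketch of how such `(name, values)` pairs could be summed per timer follows; the sample rows are copied from the table above, and the assumption that the values are per-measurement timings to be accumulated is mine, not stated by the dataset:

```python
from collections import defaultdict

# Sample (name, values) pairs taken from the rows above.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 228.561279),
    ("megatron.core.transformer.attention.forward.core_attention", 7559.032227),
    ("megatron.core.transformer.attention.forward.qkv", 213.179611),
    ("megatron.core.transformer.attention.forward.core_attention", 8656.235352),
]

def aggregate(rows):
    """Sum recorded values per timer, keyed by the last dotted path component."""
    totals = defaultdict(float)
    for name, value in rows:
        totals[name.rsplit(".", 1)[-1]] += value
    return dict(totals)

for timer, total in sorted(aggregate(rows).items()):
    print(f"{timer}: {total:.6f}")
```

Grouping on the last path component collapses the long module prefixes into the 12 short timer names, which is usually what a per-component breakdown needs.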