| name (stringclasses, 12 unique values) | values (float64, 0 to 9.97k) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 10.023456 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7,788.802246 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 618.533325 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 14.023296 |
| megatron.core.transformer.mlp.forward.activation | 191.194534 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 14.884736 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 220.11705 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.7744 |
| megatron.core.transformer.attention.forward.qkv | 5.55376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 2,273.336182 |
| megatron.core.transformer.attention.forward.linear_proj | 5.860608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,284.775635 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.772768 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 12.058848 |
| megatron.core.transformer.mlp.forward.activation | 1.31728 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 14.856416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 28.244961 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.774016 |
| megatron.core.transformer.attention.forward.qkv | 260.614014 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.120384 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.096352 |
| megatron.core.transformer.attention.forward.core_attention | 6,195.922852 |
| megatron.core.transformer.attention.forward.linear_proj | 3.6712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6,461.907227 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 446.85434 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.903328 |
| megatron.core.transformer.mlp.forward.activation | 635.178162 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.105408 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 637.955811 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.460576 |
| megatron.core.transformer.attention.forward.qkv | 0.644576 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.078688 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.087392 |
| megatron.core.transformer.attention.forward.core_attention | 2,364.176758 |
| megatron.core.transformer.attention.forward.linear_proj | 0.560288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,365.848633 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.037888 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.19808 |
| megatron.core.transformer.mlp.forward.activation | 0.027232 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.254784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.492672 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037024 |
| megatron.core.transformer.attention.forward.qkv | 203.440475 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.039136 |
| megatron.core.transformer.attention.forward.core_attention | 7,832.186035 |
| megatron.core.transformer.attention.forward.linear_proj | 112.108574 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8,148.770508 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 338.625244 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.408992 |
| megatron.core.transformer.mlp.forward.activation | 229.173401 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 189.987106 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 426.305756 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452256 |
| megatron.core.transformer.attention.forward.qkv | 1.40304 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 1,305.200195 |
| megatron.core.transformer.attention.forward.linear_proj | 1.794976 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,308.422241 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452096 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.063264 |
| megatron.core.transformer.mlp.forward.activation | 0.33344 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.785696 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 7.19424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452896 |
| megatron.core.transformer.attention.forward.qkv | 260.679291 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.142304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.101408 |
| megatron.core.transformer.attention.forward.core_attention | 7,621.394531 |
| megatron.core.transformer.attention.forward.linear_proj | 285.35025 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 8,171.305664 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 19.858368 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.081088 |
| megatron.core.transformer.mlp.forward.activation | 283.546021 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 191.603165 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 477.161285 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.038944 |
| megatron.core.transformer.attention.forward.qkv | 0.059328 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 0.49952 |
| megatron.core.transformer.attention.forward.linear_proj | 3.957472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.540096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.036832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.108608 |
| megatron.core.transformer.mlp.forward.activation | 0.018016 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.058048 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.1968 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037664 |
| megatron.core.transformer.attention.forward.qkv | 274.513458 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.108896 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.097312 |
| megatron.core.transformer.attention.forward.core_attention | 7,282.393066 |
| megatron.core.transformer.attention.forward.linear_proj | 412.493652 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 7,970.705078 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 265.70871 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.584864 |
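The rows above pair Megatron-Core module labels with per-call forward timing values; the table itself states neither the unit nor how the numbers were collected. As a minimal sketch of how per-submodule forward timings of this shape could be gathered with plain PyTorch forward hooks and CUDA events, the helper below may be useful. The name `attach_forward_timers`, the microsecond conversion, and the hook-based approach are illustrative assumptions, not Megatron-Core's own instrumentation (which produces the dotted labels above).

```python
import torch
import torch.nn as nn


def attach_forward_timers(model: nn.Module, records: dict):
    """Time every named submodule's forward pass with CUDA events.

    Hypothetical helper for collecting per-module timings similar in shape to
    the table above. Assumes the model runs on a CUDA device.
    """
    handles = []

    def make_hooks(name: str):
        start = torch.cuda.Event(enable_timing=True)
        end = torch.cuda.Event(enable_timing=True)

        def pre_hook(module, args):
            start.record()

        def post_hook(module, args, output):
            end.record()
            # elapsed_time() requires the events to have completed; a full sync
            # is acceptable for coarse profiling, though it suppresses overlap.
            torch.cuda.synchronize()
            # elapsed_time() returns milliseconds; store microseconds here on
            # the assumption (unverified) that the table uses microseconds.
            records.setdefault(name, []).append(start.elapsed_time(end) * 1000.0)

        return pre_hook, post_hook

    for name, module in model.named_modules():
        if name == "":
            continue  # skip the root module itself
        pre, post = make_hooks(name)
        handles.append(module.register_forward_pre_hook(pre))
        handles.append(module.register_forward_hook(post))
    return handles  # call .remove() on each handle to detach the hooks
```

Averaging `records[name]` after a few forward passes would yield one value per label; note that composite modules (e.g. `self_attention`) would naturally cover their children (`qkv`, `core_attention`, `linear_proj`), mirroring the nesting visible in the table.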