| 0 (string, 12 classes) | 1 (float64, range 0–55.9k) |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 12.27344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 325.768982 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 3.53136 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.730592 |
| megatron.core.transformer.mlp.forward.activation | 0.665312 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 14.188576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 21.596992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 3.559072 |
| megatron.core.transformer.attention.forward.qkv | 0.04176 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 2.86912 |
| megatron.core.transformer.attention.forward.linear_proj | 2.365152 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.299264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.03696 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.060224 |
| megatron.core.transformer.mlp.forward.activation | 0.011104 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.33344 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.41648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.03728 |
| megatron.core.transformer.attention.forward.qkv | 0.04288 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 3.769504 |
| megatron.core.transformer.attention.forward.linear_proj | 2.56976 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.405184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.036928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.062944 |
| megatron.core.transformer.mlp.forward.activation | 0.011264 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.172608 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.258112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037088 |
| megatron.core.transformer.attention.forward.qkv | 0.128384 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 2.744288 |
| megatron.core.transformer.attention.forward.linear_proj | 1.714336 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.610816 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121568 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.222656 |
| megatron.core.transformer.mlp.forward.activation | 0.027488 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.812288 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.074624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.12208 |
| megatron.core.transformer.attention.forward.qkv | 0.13184 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 2.768768 |
| megatron.core.transformer.attention.forward.linear_proj | 2.016544 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4.941216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.12144 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.223488 |
| megatron.core.transformer.mlp.forward.activation | 0.028128 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.516448 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.780544 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.1208 |
| megatron.core.transformer.attention.forward.qkv | 0.467328 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 66.49382 |
| megatron.core.transformer.attention.forward.linear_proj | 1.460288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 68.445564 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45136 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.849216 |
| megatron.core.transformer.mlp.forward.activation | 0.088576 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.83504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.784608 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451616 |
| megatron.core.transformer.attention.forward.qkv | 0.472704 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 20.636608 |
| megatron.core.transformer.attention.forward.linear_proj | 1.331872 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 22.464993 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451232 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.865056 |
| megatron.core.transformer.mlp.forward.activation | 0.088672 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.825536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.791008 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450304 |
| megatron.core.transformer.attention.forward.qkv | 1.89904 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 160.377151 |
| megatron.core.transformer.attention.forward.linear_proj | 5.016032 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 167.316574 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.776064 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.59904 |
| megatron.core.transformer.mlp.forward.activation | 0.33632 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 7.013248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 10.960832 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.773504 |
| megatron.core.transformer.attention.forward.qkv | 1.89392 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
| megatron.core.transformer.attention.forward.core_attention | 160.025604 |
| megatron.core.transformer.attention.forward.linear_proj | 5.5 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 167.444672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1.772512 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.595168 |
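The extract preserves only the two raw columns (a module path and a float measurement); the units and any row grouping are not stated here. As a minimal sketch only, the snippet below shows one way to summarize such a table per module with pandas, assuming the rows have been exported to a CSV file named `timings.csv` with column names `module` and `value` (the file name and both column names are assumptions for illustration, not part of the dataset).

```python
import pandas as pd

# Load the two-column dump. "timings.csv", "module", and "value" are assumed
# names for illustration; the original viewer only exposes columns "0" and "1".
df = pd.read_csv("timings.csv", header=None, names=["module", "value"])

# The same 12 module paths repeat across rows, so a per-module summary
# (row count, mean, and max of the float column) is a natural first view.
summary = (
    df.groupby("module")["value"]
    .agg(["count", "mean", "max"])
    .sort_values("mean", ascending=False)
)
print(summary)
```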