Per-module timing table (dataset preview). Column `0` holds the module path (string, 12 distinct names); column `values` is float64; the full dataset has 26.5k rows. The unit of `values` is not stated in the dump.
| module | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.426272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 43.310783 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121696 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.791776 |
| megatron.core.transformer.mlp.forward.activation | 0.088096 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.96992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.862112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.12192 |
| megatron.core.transformer.attention.forward.qkv | 1.572544 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 146.612762 |
| megatron.core.transformer.attention.forward.linear_proj | 3.129568 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 151.339111 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.896384 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.194272 |
| megatron.core.transformer.mlp.forward.activation | 0.333088 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.246784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 8.786528 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.89376 |
| megatron.core.transformer.attention.forward.qkv | 1.580288 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.0032 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 146.665283 |
| megatron.core.transformer.attention.forward.linear_proj | 3.379168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 151.649094 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.896544 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.235584 |
| megatron.core.transformer.mlp.forward.activation | 0.336064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.32304 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 8.907008 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.897472 |
| megatron.core.transformer.attention.forward.qkv | 0.791616 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 10212.026367 |
| megatron.core.transformer.attention.forward.linear_proj | 1.595616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10214.436523 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451712 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.658368 |
| megatron.core.transformer.mlp.forward.activation | 0.170112 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.678016 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 4.518048 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452064 |
| megatron.core.transformer.attention.forward.qkv | 0.78704 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 82.588036 |
| megatron.core.transformer.attention.forward.linear_proj | 1.648768 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 85.046944 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451296 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.660416 |
| megatron.core.transformer.mlp.forward.activation | 0.169888 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.683424 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 4.5256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451776 |
| megatron.core.transformer.attention.forward.qkv | 0.39856 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 687.190735 |
| megatron.core.transformer.attention.forward.linear_proj | 0.901024 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 688.51355 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.233152 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.803744 |
| megatron.core.transformer.mlp.forward.activation | 0.08688 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.340544 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.24304 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.23264 |
| megatron.core.transformer.attention.forward.qkv | 0.393312 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00336 |
| megatron.core.transformer.attention.forward.core_attention | 40.882366 |
| megatron.core.transformer.attention.forward.linear_proj | 0.908992 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 42.208481 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.232576 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.813568 |
| megatron.core.transformer.mlp.forward.activation | 0.087808 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.346752 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.259552 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.232832 |
| megatron.core.transformer.attention.forward.qkv | 0.20576 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 348.464905 |
| megatron.core.transformer.attention.forward.linear_proj | 0.419392 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 349.11261 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121952 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.392576 |
| megatron.core.transformer.mlp.forward.activation | 0.04752 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.673536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.125312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.120864 |
| megatron.core.transformer.attention.forward.qkv | 0.20096 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.302273 |
| megatron.core.transformer.attention.forward.linear_proj | 0.450912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 23.976736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121536 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.395456 |
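The rows above repeat the same per-layer block of timers. A minimal Python sketch for regrouping such flattened (module, value) rows into one record per block is shown below. Two assumptions are read off the dump and are not stated in the source: each block starts at the `...attention.forward.qkv` row, and the unit of `values` is unspecified, so only relative shares are computed. The `rows` sample here is a small excerpt of the table for illustration.

```python
# Regroup flattened (module path, value) rows into per-block records.
# Assumption (read off the dump, not stated in it): each block of timers
# begins at the "...attention.forward.qkv" row.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 1.572544),
    ("megatron.core.transformer.attention.forward.adjust_key_value", 0.002976),
    ("megatron.core.transformer.attention.forward.rotary_pos_emb", 0.003008),
    ("megatron.core.transformer.attention.forward.core_attention", 146.612762),
    ("megatron.core.transformer.attention.forward.qkv", 1.580288),
    ("megatron.core.transformer.attention.forward.core_attention", 146.665283),
]

blocks = []
for name, value in rows:
    leaf = name.rsplit(".", 1)[-1]   # e.g. "core_attention"
    if leaf == "qkv":                # a new block starts at the qkv timer
        blocks.append({})
    if blocks:                       # ignore rows before the first qkv
        blocks[-1][leaf] = value

# Relative share of core_attention within each block (unit-free).
for i, block in enumerate(blocks):
    share = block.get("core_attention", 0.0) / sum(block.values())
    print(f"block {i}: core_attention share = {share:.1%}")
```

The dominant-term pattern this exposes is already visible in the raw table: `core_attention` dwarfs the other timers in every block.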