Normalization layers

Customized normalization layers used to support various models in 🤗 Diffusers.

AdaLayerNorm

class diffusers.models.normalization.AdaLayerNorm


( embedding_dim: int, num_embeddings: int )

Parameters

  • embedding_dim (int) — The size of each embedding vector.
  • num_embeddings (int) — The size of the embeddings dictionary.

Norm layer modified to incorporate timestep embeddings.
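
A minimal usage sketch, assuming this version's forward(x, timestep) interface in which timestep is a scalar index into the embedding table; the sizes below are illustrative:

import torch
from diffusers.models.normalization import AdaLayerNorm

# Illustrative sizes: 16 tokens with 128-dim features, 1000 possible timesteps.
norm = AdaLayerNorm(embedding_dim=128, num_embeddings=1000)

x = torch.randn(2, 16, 128)  # hidden states
timestep = torch.tensor(10)  # scalar timestep index shared across the batch

out = norm(x, timestep)      # LayerNorm(x) scaled and shifted by the timestep embedding
print(out.shape)             # torch.Size([2, 16, 128])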

AdaLayerNormZero

class diffusers.models.normalization.AdaLayerNormZero


( embedding_dim: int, num_embeddings: Optional[int] = None )

Parameters

  • embedding_dim (int) — The size of each embedding vector.
  • num_embeddings (int, optional) — The size of the embeddings dictionary.

Norm layer implementing adaptive layer norm zero (adaLN-Zero).
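
A minimal sketch, assuming the forward(x, timestep, class_labels) interface used by DiT-style transformer blocks; the block applies the returned gate/shift/scale tensors around its attention and feed-forward sublayers:

import torch
from diffusers.models.normalization import AdaLayerNormZero

# Illustrative sizes: batch of 2, 16 tokens, 128-dim features, 1000 classes.
norm = AdaLayerNormZero(embedding_dim=128, num_embeddings=1000)

x = torch.randn(2, 16, 128)
timestep = torch.randint(0, 1000, (2,))      # one diffusion timestep per sample
class_labels = torch.randint(0, 1000, (2,))  # one class label per sample

# x_out is the modulated input; the remaining tensors condition the block's sublayers.
x_out, gate_msa, shift_mlp, scale_mlp, gate_mlp = norm(x, timestep, class_labels)
print(x_out.shape)  # torch.Size([2, 16, 128])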

AdaLayerNormSingle

class diffusers.models.normalization.AdaLayerNormSingle


( embedding_dim: int use_additional_conditions: bool = False )

Parameters

  • embedding_dim (int) — The size of each embedding vector.
  • use_additional_conditions (bool) — Whether to use additional conditions for normalization.

Norm layer implementing adaptive layer norm single (adaLN-single).

As proposed in PixArt-Alpha (see: https://arxiv.org/abs/2310.00426; Section 2.3).
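
A minimal sketch, assuming the forward(timestep, added_cond_kwargs=..., batch_size=..., hidden_dtype=...) call pattern used by the PixArt-Alpha transformer; the exact signature may vary between versions, and the returned conditioning is chunked into shift/scale/gate tensors inside each transformer block:

import torch
from diffusers.models.normalization import AdaLayerNormSingle

norm = AdaLayerNormSingle(embedding_dim=1152)  # use_additional_conditions=False
timestep = torch.randint(0, 1000, (2,))        # one diffusion timestep per sample

conditioning, embedded_timestep = norm(
    timestep,
    added_cond_kwargs={"resolution": None, "aspect_ratio": None},
    batch_size=2,
    hidden_dtype=torch.float32,
)
print(conditioning.shape)       # torch.Size([2, 6912]), i.e. (batch, 6 * embedding_dim)
print(embedded_timestep.shape)  # torch.Size([2, 1152])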

AdaGroupNorm

class diffusers.models.normalization.AdaGroupNorm


( embedding_dim: int, out_dim: int, num_groups: int, act_fn: Optional[str] = None, eps: float = 1e-05 )

Parameters

  • embedding_dim (int) — The size of each embedding vector.
  • out_dim (int) — The number of channels to normalize; the embedding is projected to produce a per-channel scale and shift.
  • num_groups (int) — The number of groups to separate the channels into.
  • act_fn (str, optional, defaults to None) — The activation function to use.
  • eps (float, optional, defaults to 1e-5) — The epsilon value to use for numerical stability.

GroupNorm layer modified to incorporate timestep embeddings.
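
A minimal sketch, assuming the forward(x, emb) interface in which x is a (batch, out_dim, height, width) feature map and emb is a (batch, embedding_dim) conditioning vector such as a timestep embedding:

import torch
from diffusers.models.normalization import AdaGroupNorm

# Illustrative sizes: 64 channels split into 32 groups, conditioned on a 512-dim embedding.
norm = AdaGroupNorm(embedding_dim=512, out_dim=64, num_groups=32, act_fn="silu")

x = torch.randn(2, 64, 32, 32)  # feature map; channel count must equal out_dim
emb = torch.randn(2, 512)       # conditioning embedding

out = norm(x, emb)              # GroupNorm(x) scaled and shifted per channel by the embedding
print(out.shape)                # torch.Size([2, 64, 32, 32])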
