Activation functions

Customized activation functions to support various models in 🤗 Diffusers.

GELU

class diffusers.models.activations.GELU

( dim_in: int, dim_out: int, approximate: str = 'none' )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.
  • approximate (str, optional, defaults to "none") — If "tanh", use the tanh approximation of GELU.

GELU activation function, with optional tanh approximation enabled via approximate="tanh".
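
A minimal usage sketch, assuming a standard Diffusers install. The channel and sequence sizes below are arbitrary illustrations, and note that this module pairs a linear projection with the activation, which is why it takes both dim_in and dim_out:

```py
import torch
from diffusers.models.activations import GELU

# Pair a Linear(320 -> 1280) projection with tanh-approximated GELU.
# The sizes here are arbitrary examples, not values from the docs.
act = GELU(dim_in=320, dim_out=1280, approximate="tanh")

x = torch.randn(2, 77, 320)  # (batch, sequence, channels)
y = act(x)
print(y.shape)               # torch.Size([2, 77, 1280])
```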

GEGLU

class diffusers.models.activations.GEGLU

( dim_in: int, dim_out: int )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.

A variant of the gated linear unit (GLU) activation function in which the gate is computed with GELU: the input is projected to 2 * dim_out channels, split into a value half and a gate half, and the output is value * GELU(gate).
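
A minimal sketch of calling GEGLU, under the same assumptions as above (illustrative sizes, standard Diffusers install):

```py
import torch
from diffusers.models.activations import GEGLU

# GEGLU projects dim_in -> 2 * dim_out internally, splits the result,
# and returns value * GELU(gate); the output has dim_out channels.
act = GEGLU(dim_in=320, dim_out=1280)

x = torch.randn(2, 77, 320)  # (batch, sequence, channels)
y = act(x)
print(y.shape)               # torch.Size([2, 77, 1280])
```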

ApproximateGELU

class diffusers.models.activations.ApproximateGELU

( dim_in: int, dim_out: int )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.

The approximate form of the Gaussian Error Linear Unit (GELU): x * sigmoid(1.702 * x). For more details, see section 2 of the GELU paper (https://arxiv.org/abs/1606.08415).
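
A minimal sketch contrasting the module call with the underlying sigmoid approximation; the tensor shapes are illustrative:

```py
import torch
from diffusers.models.activations import ApproximateGELU

act = ApproximateGELU(dim_in=320, dim_out=1280)  # illustrative sizes

x = torch.randn(2, 77, 320)
y = act(x)
print(y.shape)  # torch.Size([2, 77, 1280])

# The activation itself (applied after the internal linear projection)
# is the sigmoid approximation x * sigmoid(1.702 * x):
z = torch.randn(4)
print(z * torch.sigmoid(1.702 * z))
```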