Diffusers documentation
Activation functions
Customized activation functions for supporting various models in 🤗 Diffusers.
GELU
class diffusers.models.activations.GELU
( dim_in: int, dim_out: int, approximate: str = 'none', bias: bool = True )
GELU activation function, with tanh approximation support via approximate="tanh".
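As an illustration, the tanh approximation selected by approximate="tanh" can be sketched in plain Python. This is a minimal elementwise sketch of the formula, not the actual `torch.nn.functional.gelu` implementation the class delegates to:

```python
import math

def gelu_tanh(x: float) -> float:
    # tanh approximation of GELU:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

print(gelu_tanh(0.0))
print(gelu_tanh(1.0))  # close to the exact GELU value of ~0.8413
```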
GEGLU
class diffusers.models.activations.GEGLU
( dim_in: int, dim_out: int, bias: bool = True )
A variant of the gated linear unit activation function.
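GEGLU projects its input to twice the output width, splits the result in half, and multiplies one half by the GELU of the other (the gate). A minimal elementwise sketch of that gating step, assuming the exact erf-based GELU that PyTorch uses by default:

```python
import math

def gelu(x: float) -> float:
    # exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(projected: list[float]) -> list[float]:
    # the projection output is split into a value half and a gate half;
    # the result is value * GELU(gate), elementwise
    half = len(projected) // 2
    value, gate = projected[:half], projected[half:]
    return [v * gelu(g) for v, g in zip(value, gate)]

print(geglu([1.0, -1.0, 2.0, 0.0]))
```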
ApproximateGELU
class diffusers.models.activations.ApproximateGELU
( dim_in: int, dim_out: int, bias: bool = True )
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of the GELU paper (Hendrycks & Gimpel, 2016).
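The sigmoid-based approximation from that paper, x * sigmoid(1.702 * x), can be sketched in plain Python (a minimal sketch of the formula, not the class itself):

```python
import math

def approximate_gelu(x: float) -> float:
    # sigmoid approximation of GELU: x * sigmoid(1.702 * x)
    return x / (1.0 + math.exp(-1.702 * x))

print(approximate_gelu(1.0))  # roughly 0.846, close to the exact GELU's ~0.841
```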