Reference for ultralytics/nn/modules/activation.py

Note

This file is available at https://github.com/ultralytics/ultralytics/blob/main/ultralytics/nn/modules/activation.py. If you spot a problem, please help fix it by contributing a Pull Request 🛠️. Thank you 🙏!


ultralytics.nn.modules.activation.AGLU

AGLU(device=None, dtype=None)

Bases: Module

Unified activation function module from https://github.com/kostas1515/AGLU. It combines two learnable scalar parameters, `lambd` and `kappa`, with a negative-beta Softplus to form a smooth, adaptive activation.

Source code in ultralytics/nn/modules/activation.py
def __init__(self, device=None, dtype=None) -> None:
    """Initialize the Unified activation function."""
    super().__init__()
    self.act = nn.Softplus(beta=-1.0)
    self.lambd = nn.Parameter(nn.init.uniform_(torch.empty(1, device=device, dtype=dtype)))  # lambda parameter
    self.kappa = nn.Parameter(nn.init.uniform_(torch.empty(1, device=device, dtype=dtype)))  # kappa parameter
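Note that `nn.Softplus(beta=-1.0)` is not the usual Softplus: with a negative beta it evaluates `(1/beta) * log(1 + exp(beta * x))`, i.e. `-log(1 + exp(-x))`, which is always non-positive. A minimal pure-Python sketch of this identity (the `softplus` helper here is illustrative, not part of the module):

```python
import math

def softplus(x: float, beta: float = -1.0) -> float:
    # Softplus(x) = (1/beta) * log(1 + exp(beta * x)); with beta = -1 this
    # reduces to -log(1 + exp(-x)), which is <= 0 for all x.
    return (1.0 / beta) * math.log1p(math.exp(beta * x))

print(round(softplus(0.0), 4))  # -log(2) ≈ -0.6931
```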

forward

forward(x: torch.Tensor) -> torch.Tensor

Compute the forward pass of the Unified activation function.

Source code in ultralytics/nn/modules/activation.py
def forward(self, x: torch.Tensor) -> torch.Tensor:
    """Compute the forward pass of the Unified activation function."""
    lam = torch.clamp(self.lambd, min=0.0001)
    return torch.exp((1 / lam) * self.act((self.kappa * x) - torch.log(lam)))
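Putting the pieces together, the forward pass evaluates to the generalized sigmoid `(1 + lambd * exp(-kappa * x)) ** (-1 / lambd)`, which reduces to the standard sigmoid when `lambd = kappa = 1`. A pure-Python sketch mirroring the computation above (the `aglu` function and its default values are illustrative assumptions, not part of the Ultralytics API):

```python
import math

def aglu(x: float, lambd: float = 1.0, kappa: float = 1.0) -> float:
    # Mirrors AGLU.forward: Softplus with beta = -1 is -log(1 + exp(-z)),
    # applied to z = kappa * x - log(lambda), then scaled by 1/lambda and
    # exponentiated.
    lam = max(lambd, 0.0001)  # matches torch.clamp(self.lambd, min=0.0001)
    softplus = -math.log1p(math.exp(-(kappa * x - math.log(lam))))
    return math.exp(softplus / lam)

# With lambd = kappa = 1 this is the standard sigmoid.
print(round(aglu(0.0), 4))  # sigmoid(0) = 0.5
```

In the module itself, `lambd` and `kappa` are initialized from a uniform distribution via `nn.init.uniform_` and learned during training, so the activation's shape adapts per use.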



📅 Created 4 months ago ✏️ Updated 3 months ago