AttentionModule

class AttentionModule(name, in_channels, reduction=16)[source]

Attention module that applies an attention mechanism to feature maps.

Initialize the Attention module.

Parameters:
  • name (str | None) – Name of the attention mechanism. Only “scse” is implemented; if None, the identity mapping is used.

  • in_channels (int) – Number of input channels.

  • reduction (int) – Reduction ratio for channel attention.
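The “scse” option refers to concurrent spatial and channel squeeze-and-excitation. A minimal sketch of such a block is shown below; the class name `SCSEAttention` and its internals are illustrative assumptions, not this library’s implementation:

```python
import torch
import torch.nn as nn


class SCSEAttention(nn.Module):
    """Hypothetical sketch of an scSE block (concurrent spatial and
    channel squeeze-and-excitation)."""

    def __init__(self, in_channels: int, reduction: int = 16):
        super().__init__()
        # Channel attention: global average pool -> bottleneck MLP -> sigmoid gate.
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels // reduction, in_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: 1x1 conv to a single-channel sigmoid map.
        self.sse = nn.Sequential(
            nn.Conv2d(in_channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Recalibrate the input along channels and along spatial positions,
        # then sum the two gated maps; the output shape matches the input.
        return x * self.cse(x) + x * self.sse(x)


attn = SCSEAttention(in_channels=32, reduction=16)
out = attn(torch.randn(2, 32, 8, 8))  # shape is preserved: (2, 32, 8, 8)
```

The `reduction` parameter controls how aggressively the channel dimension is compressed in the bottleneck (here 32 channels down to 2 before re-expansion).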

Methods

forward

Forward pass of the Attention module.

Attributes

training

forward(x)[source]

Forward pass of the Attention module.

Parameters:
  • x (torch.Tensor) – Input feature map.

Returns:

Output feature map after applying attention.

Return type:

torch.Tensor
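Since the constructor falls back to the identity when `name` is None, the forward pass always preserves the input shape. The dispatch logic described above could be sketched as follows; `AttentionDispatch` and `_SpatialGate` are hypothetical names, and the spatial gate is a simplified stand-in for a full scSE block:

```python
import torch
import torch.nn as nn


class _SpatialGate(nn.Module):
    """Simplified spatial gate standing in for a full scSE block."""

    def __init__(self, in_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(self.conv(x))


class AttentionDispatch(nn.Module):
    """Hypothetical sketch of the constructor behavior documented above:
    "scse" selects an attention block, None selects the identity, and any
    other name is rejected."""

    def __init__(self, name, in_channels: int, reduction: int = 16):
        super().__init__()
        if name is None:
            self.attention = nn.Identity()
        elif name == "scse":
            self.attention = _SpatialGate(in_channels)
        else:
            raise ValueError(f"Attention {name!r} is not implemented")

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.attention(x)


x = torch.randn(1, 16, 4, 4)
identity = AttentionDispatch(None, in_channels=16)
assert torch.equal(identity(x), x)  # None: input passes through unchanged
scse = AttentionDispatch("scse", in_channels=16)
assert scse(x).shape == x.shape  # attention preserves the shape
```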