Bernstein_Poster_2024/basis_mlp_x16/L1NormLayer.py

import torch


class L1NormLayer(torch.nn.Module):
    """Normalize each row of the input to unit sum (the L1 norm for
    non-negative inputs). epsilon guards against division by zero."""

    epsilon: float

    def __init__(self, epsilon: float = 10e-20) -> None:
        super().__init__()
        self.epsilon = epsilon

    def forward(self, input: torch.Tensor) -> torch.Tensor:
        # Sum along dim 1 with keepdim=True so the per-row sum
        # broadcasts back over the feature dimension.
        return input / (input.sum(dim=1, keepdim=True) + self.epsilon)
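A quick usage sketch of the layer above (the class body is reproduced here so the snippet is self-contained; the example tensor and its shape are illustrative, not from the repository):

```python
import torch


class L1NormLayer(torch.nn.Module):
    """Normalize each row of the input to unit sum."""

    epsilon: float

    def __init__(self, epsilon: float = 10e-20) -> None:
        super().__init__()
        self.epsilon = epsilon

    def forward(self, input: torch.Tensor) -> torch.Tensor:
        # keepdim=True keeps the summed axis so division broadcasts.
        return input / (input.sum(dim=1, keepdim=True) + self.epsilon)


layer = L1NormLayer()
x = torch.tensor([[1.0, 3.0], [2.0, 2.0]])  # shape (batch, features)
y = layer(x)
print(y)  # each row now sums to 1: [[0.25, 0.75], [0.5, 0.5]]
```

Note that `input.sum(...)` is a plain sum, not a sum of absolute values, so the result is a true L1 normalization only when the inputs are non-negative (as is typical after a ReLU or similar non-negative activation).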