Commit bb09024

minor changes

1 parent 5995d9a

File tree

2 files changed (+12, -28 lines)


README.md

Lines changed: 11 additions & 10 deletions
@@ -9,16 +9,17 @@
 1) Sigmoid
 2) Tanh
 3) Softmax
-4) Softplus
-5) Softsign
-6) Swish
-7) Mish
-8) TanhExp
-9) ReLU
-10) LeakyReLU
-11) ELU
-12) SELU
-13) GELU
+4) LogSoftmax
+5) Softplus
+6) Softsign
+7) Swish
+8) Mish
+9) TanhExp
+10) ReLU
+11) LeakyReLU
+12) ELU
+13) SELU
+14) GELU
 
 *[See Activation Functions...](neunet/nn/activations.py)*
 
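The new entry, `LogSoftmax`, computes the logarithm of the softmax. A minimal numerically stable sketch in plain Python (the function name and list-based interface here are illustrative, not the library's actual implementation, which operates on `Tensor` objects):

```python
import math

def log_softmax(x):
    # Stable form: log(softmax(x)) = (x - max(x)) - log(sum(exp(x - max(x))))
    # Subtracting the max avoids overflow in exp() for large inputs.
    m = max(x)
    log_sum = math.log(sum(math.exp(v - m) for v in x))
    return [v - m - log_sum for v in x]
```

Exponentiating the outputs recovers the softmax probabilities, which sum to 1.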

neunet/nn/activations.py

Lines changed: 1 addition & 18 deletions
@@ -499,21 +499,4 @@ def forward(self, x: Tensor):
         return _LogSoftmax(f_x, [x, f_x, self.axis], "log_softmax", device=x.device)
 
     def __call__(self, x):
-        return self.forward(x)
-
-
-activations = {
-    "sigmoid": Sigmoid(),
-    "tanh": Tanh(),
-    "softmax": Softmax(),
-    "softplus": Softplus(),
-    "softsign": Softsign(),
-    "swish": Swish(),
-    "mish": Mish(),
-    "tanh_exp": TanhExp(),
-    "relu": ReLU(),
-    "leaky_relu": LeakyReLU(),
-    "elu": ELU(),
-    "selu": SELU(),
-    "gelu": GELU(),
-}
+        return self.forward(x)
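The `__call__` that survives the deletion is the usual callable-module pattern: calling an activation instance delegates to its `forward` method, so instances can be used like functions. A self-contained sketch of the idea (class names and the list-based `forward` are hypothetical stand-ins for the library's `Tensor`-based classes):

```python
class Module:
    """Minimal callable-module pattern."""
    def forward(self, x):
        raise NotImplementedError

    def __call__(self, x):
        # Delegate so an instance can be applied like a function: act(x)
        return self.forward(x)

class ReLU(Module):
    def forward(self, x):
        # Elementwise max(0, v) over a plain list, for illustration only
        return [max(0.0, v) for v in x]

act = ReLU()
result = act([-1.0, 2.0])  # [0.0, 2.0]
```

This is why removing the module-level `activations` dict loses nothing functional: each activation class remains directly instantiable and callable.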
