nn_nl
Arguments:
- out,_in,_activation
Description:
Add a nonlinearity (nl) layer to the current network. 'activation' can be { elu | gelu | leakyrelu | linear | relu | sigmoid | sin | sinc | softmax | sqr | sqrt | swish | tanh }.
Default values:
'in=.' (previous layer) and 'activation=leakyrelu'.
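Example of use:
A minimal sketch, assuming a network has already been started with other nn_* commands of the library (the surrounding layers are omitted here, and the layer names C1, NL1 and NL2 are illustrative placeholders); only the nn_nl calls below follow the argument list documented above.

  # Add a ReLU nonlinearity layer 'NL1' taking layer 'C1' as input.
  nn_nl NL1,C1,relu

  # Rely on the defaults: the input is the previous layer and the activation is leakyrelu.
  nn_nl NL2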