This is a domain-specific language (DSL) function, similar in spirit to
ggplot2::aes(), for specifying activation functions for neural network
layers. It validates that each activation function exists in torch and that
any supplied parameters match the function's formal arguments.
Arguments
- ...
  Activation function specifications. Each can be one of:
  - Bare symbols: relu, tanh
  - Character strings (simple): "relu", "tanh"
  - Character strings (with params): "softshrink(lambda = 0.1)", "rrelu(lower = 1/5, upper = 1/4)"
  - Named with parameters: softmax = args(dim = 2L)
  - Indexed syntax (named): softshrink[lambd = 0.2], rrelu[lower = 1/5, upper = 1/4]
  - Indexed syntax (unnamed): softshrink[0.5], elu[0.5]
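A sketch of how these specification forms might be combined in one call. The
function name activations() is a placeholder, since the source does not name
the function being documented:

```r
# Hypothetical call; activations() stands in for the documented DSL function.
# Each argument shows one of the accepted specification forms.
activations(
  relu,                         # bare symbol
  "tanh",                       # character string (simple)
  "softshrink(lambda = 0.1)",   # character string with parameters
  softmax = args(dim = 2L),     # named with parameters
  softshrink[lambd = 0.2],      # indexed syntax, named parameter
  elu[0.5]                      # indexed syntax, unnamed parameter
)
```

As with ggplot2::aes(), the bare-symbol and indexed forms rely on the
arguments being quoted rather than evaluated, so relu here refers to the
torch activation by name rather than to an object in the calling environment.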
