# Package org.nd4j.linalg.api.ops.impl.transforms.strict

## Class Summary

| Class | Description |
|---|---|
| ACos | Arc cosine (inverse cosine) elementwise function |
| ACosh | Inverse hyperbolic cosine elementwise function |
| ASin | Arc sine (inverse sine) elementwise function |
| ASinh | Inverse hyperbolic sine elementwise function |
| ATan | Arc tangent (inverse tangent) elementwise function |
| ATanh | Inverse hyperbolic tangent elementwise function |
| Cos | Cosine elementwise function |
| Cosh | Hyperbolic cosine elementwise function |
| ELU | ELU: Exponential Linear Unit (alpha = 1.0). Introduced in: "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)", Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015), https://arxiv.org/abs/1511.07289 |
| Erf | Gaussian error function (erf), defined as erf(x) = 1/sqrt(pi) * integral over (-x, x) of exp(-t^2) dt |
| Erfc | Complementary Gaussian error function (erfc), defined as erfc(x) = 1 - erf(x), where erf is the Gaussian error function |
| Exp | Element-wise exponential function |
| Expm1 | Element-wise exponential function minus 1, i.e. for each element x computes exp(x) - 1 |
| GELU | GELU activation function (Gaussian Error Linear Units). Implements both the sigmoid- and tanh-based approximations: precise = false (recommended) selects the sigmoid approximation; precise = true selects the slower but marginally more accurate tanh version. See https://arxiv.org/abs/1606.08415 |
| GELUDerivative | GELU derivative |
| HardSigmoid | Hard sigmoid elementwise function |
| HardTanh | Hard tanh elementwise function |
| Log | Log elementwise function |
| Log1p | Log1p elementwise function, i.e. log(1 + x) |
| LogSigmoid | LogSigmoid elementwise function |
| Mish | Mish activation function |
| MishDerivative | Mish derivative |
| PreciseGELU | GELU activation function (Gaussian Error Linear Units), the slower but marginally more accurate tanh-based ("precise") variant. See https://arxiv.org/abs/1606.08415 |
| PreciseGELUDerivative | GELU derivative (precise variant) |
| RationalTanh | Rational tanh approximation elementwise function, as described at https://github.com/deeplearning4j/libnd4j/issues/351 |
| RectifiedTanh | Rectified tanh: essentially max(0, tanh(x)) |
| Rint | Rint function (round each element to the nearest integer) |
| SELU | SELU activation function, https://arxiv.org/pdf/1706.02515.pdf |
| SetRange | Set range to a particular set of values |
| Sigmoid | Sigmoid elementwise function |
| SigmoidDerivative | Deprecated |
| Sin | Sine elementwise function |
| Sinh | Hyperbolic sine elementwise function |
| SoftPlus | Softplus element-wise activation function: f(x) = log(1 + exp(x)) |
| SoftSign | Softsign element-wise activation function: f(x) = x / (1 + abs(x)). Similar in shape to tanh but may outperform it due to a "gentler" nonlinearity (smoother asymptotes) |
| Stabilize | Stabilization function; forces values to be within a range |
| Swish | Swish elementwise function |
| SwishDerivative | Swish derivative |
| Tan | Tangent elementwise function |
| TanDerivative | Tan derivative elementwise function |
| Tanh | Tanh elementwise function |
| TanhDerivative | Deprecated |
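The ELU formula from the Clevert et al. paper can be sketched in plain Java (a scalar illustration only; the actual op operates elementwise on INDArrays, and the class name `EluSketch` is invented for this example):

```java
// Plain-Java sketch of the ELU formula with alpha = 1.0, as this op uses.
public class EluSketch {
    static final double ALPHA = 1.0;

    static double elu(double x) {
        // Identity for positive inputs; saturates toward -ALPHA for negative ones.
        // Math.expm1(x) computes exp(x) - 1 without losing precision near zero.
        return x > 0 ? x : ALPHA * Math.expm1(x);
    }

    public static void main(String[] args) {
        System.out.println(elu(2.0));   // positive inputs pass through unchanged
        System.out.println(elu(-5.0));  // large negative inputs approach -ALPHA
    }
}
```

Unlike ReLU, the negative branch is smooth and nonzero, which is the property the paper credits for faster learning.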
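The erf definition above, and the erfc identity, can be checked numerically with a simple trapezoidal integration (a sketch only; the real ops use optimized kernels, and `ErfSketch` is a name invented here):

```java
// Numerically evaluates erf(x) = 1/sqrt(pi) * integral over [-x, x] of exp(-t^2) dt
// and the complementary function erfc(x) = 1 - erf(x).
public class ErfSketch {
    static double erf(double x) {
        int n = 100_000;                 // trapezoid steps
        double a = -x, h = 2 * x / n;
        double sum = 0.5 * (Math.exp(-a * a) + Math.exp(-x * x));
        for (int i = 1; i < n; i++) {
            double t = a + i * h;
            sum += Math.exp(-t * t);
        }
        return sum * h / Math.sqrt(Math.PI);
    }

    static double erfc(double x) {
        return 1.0 - erf(x);
    }

    public static void main(String[] args) {
        System.out.println(erf(1.0));   // ~0.8427
        System.out.println(erfc(1.0));  // ~0.1573
    }
}
```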
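The two GELU approximations mentioned in the GELU/PreciseGELU entries come from the linked paper (https://arxiv.org/abs/1606.08415): a fast sigmoid form and a slightly more accurate tanh form. A scalar sketch (the `GeluSketch` class and method names are invented for illustration; the ops themselves run elementwise on INDArrays):

```java
public class GeluSketch {
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Sigmoid approximation from the paper: x * sigmoid(1.702 * x)
    static double geluSigmoid(double x) {
        return x * sigmoid(1.702 * x);
    }

    // Tanh approximation from the paper:
    // 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    static double geluTanh(double x) {
        double inner = Math.sqrt(2.0 / Math.PI) * (x + 0.044715 * x * x * x);
        return 0.5 * x * (1.0 + Math.tanh(inner));
    }

    public static void main(String[] args) {
        for (double x : new double[]{-2, -1, 0, 1, 2}) {
            System.out.printf("x=%5.1f  sigmoid-form=%.4f  tanh-form=%.4f%n",
                    x, geluSigmoid(x), geluTanh(x));
        }
    }
}
```

The two forms agree closely over typical activation ranges, which is why the faster sigmoid form is the recommended default.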
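The "gentler" asymptotes claimed for SoftSign can be seen by evaluating f(x) = x / (1 + |x|) next to tanh (scalar sketch; `SoftSignSketch` is an illustrative name, not the op class):

```java
public class SoftSignSketch {
    static double softsign(double x) {
        return x / (1.0 + Math.abs(x));
    }

    public static void main(String[] args) {
        for (double x : new double[]{0.5, 2.0, 5.0}) {
            // tanh saturates toward 1 much faster than softsign does
            System.out.printf("x=%3.1f  softsign=%.4f  tanh=%.4f%n",
                    x, softsign(x), Math.tanh(x));
        }
    }
}
```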