## Uses of Class org.nd4j.linalg.api.ops.BaseTransformStrictOp

• Packages that use BaseTransformStrictOp

| Package | Description |
|---|---|
| `org.nd4j.linalg.api.ops.impl.transforms.gradient` | |
| `org.nd4j.linalg.api.ops.impl.transforms.strict` | |
• ### Uses of BaseTransformStrictOp in org.nd4j.linalg.api.ops.impl.transforms.gradient

| Modifier and Type | Class | Description |
|---|---|---|
| `class` | `CubeDerivative` | Deprecated. |
| `class` | `HardSigmoidDerivative` | Deprecated. |
| `class` | `HardTanhDerivative` | Deprecated. |
| `class` | `RationalTanhDerivative` | Deprecated. |
| `class` | `RectifiedTanhDerivative` | Deprecated. |
| `class` | `SELUDerivative` | Deprecated. |
| `class` | `SoftSignDerivative` | Deprecated. |
• ### Uses of BaseTransformStrictOp in org.nd4j.linalg.api.ops.impl.transforms.strict

| Modifier and Type | Class | Description |
|---|---|---|
| `class` | `ACos` | Arc cosine (arccos) elementwise function |
| `class` | `ACosh` | Inverse hyperbolic cosine (acosh) elementwise function |
| `class` | `ASin` | Arcsin elementwise function |
| `class` | `ASinh` | Inverse hyperbolic sine (asinh) elementwise function |
| `class` | `ATan` | Arc tangent elementwise function |
| `class` | `ATanh` | Inverse hyperbolic tangent (atanh) elementwise function |
| `class` | `Cos` | Cosine elementwise function |
| `class` | `Cosh` | Hyperbolic cosine elementwise function |
| `class` | `Erf` | Gaussian error function (erf), defined as erf(x) = 1/sqrt(pi) * integral_(-x, x) exp(-t^2) dt |
| `class` | `Erfc` | Complementary Gaussian error function (erfc), defined as erfc(x) = 1 - erf(x), where erf denotes the regular Gaussian error function |
| `class` | `Exp` | Element-wise exponential function |
| `class` | `Expm1` | Element-wise exponential minus 1: for each element x, computes exp(x) - 1 |
| `class` | `GELU` | GELU activation function - Gaussian Error Linear Units. For more details, see Gaussian Error Linear Units (GELUs) - https://arxiv.org/abs/1606.08415. Note: this op implements both the sigmoid- and tanh-based approximations; use precise = false for the sigmoid approximation (recommended), or precise = true for the slower but marginally more accurate tanh version. |
| `class` | `GELUDerivative` | GELU derivative |
| `class` | `HardSigmoid` | HardSigmoid function |
| `class` | `HardTanh` | Hard tanh elementwise function |
| `class` | `Log` | Log elementwise function |
| `class` | `Log1p` | Log1p elementwise function: log(1 + x) |
| `class` | `LogSigmoid` | LogSigmoid function |
| `class` | `Mish` | Mish activation function |
| `class` | `MishDerivative` | Mish derivative |
| `class` | `PreciseGELU` | GELU activation function - Gaussian Error Linear Units (the tanh-based "precise" variant). For more details, see Gaussian Error Linear Units (GELUs) - https://arxiv.org/abs/1606.08415 |
| `class` | `PreciseGELUDerivative` | GELU derivative |
| `class` | `RationalTanh` | Rational tanh approximation elementwise function, as described at https://github.com/deeplearning4j/libnd4j/issues/351 |
| `class` | `RectifiedTanh` | RectifiedTanh: essentially max(0, tanh(x)) |
| `class` | `Rint` | Rint function: rounds each element to the nearest integer |
| `class` | `SELU` | SELU activation function, https://arxiv.org/pdf/1706.02515.pdf |
| `class` | `SetRange` | Set range to a particular set of values |
| `class` | `Sigmoid` | Sigmoid function |
| `class` | `SigmoidDerivative` | Deprecated. |
| `class` | `Sin` | Sine elementwise function |
| `class` | `Sinh` | Hyperbolic sine elementwise function |
| `class` | `SoftPlus` | SoftPlus elementwise function: f(x) = log(1 + exp(x)) |
| `class` | `SoftSign` | SoftSign element-wise activation function: f(x) = x / (1 + \|x\|). Similar in shape to tanh, but may outperform it due to its 'gentler' nonlinearity (smoother asymptotes). |
| `class` | `Stabilize` | Stabilization function; forces values to be within a range |
| `class` | `Swish` | Swish function: f(x) = x * sigmoid(x) |
| `class` | `SwishDerivative` | Swish derivative |
| `class` | `Tan` | Tangent elementwise function |
| `class` | `TanDerivative` | Tan derivative elementwise function |
| `class` | `Tanh` | Tanh elementwise function |
| `class` | `TanhDerivative` | Deprecated. |