public class RecurrentAttentionLayer extends SameDiffLayer
See Also: LearnedSelfAttentionLayer, SelfAttentionLayer, MultiHeadDotProductAttention, Serialized Form

Modifier and Type | Class and Description |
---|---|
static class | RecurrentAttentionLayer.Builder |
Fields inherited from class SameDiffLayer: paramWeightInit, weightInit
Fields inherited from class AbstractSameDiffLayer: biasUpdater, gradientNormalization, gradientNormalizationThreshold, regularization, regularizationBias, updater
Fields inherited from class Layer: constraints, iDropout, layerName
Modifier | Constructor and Description |
---|---|
protected | RecurrentAttentionLayer(RecurrentAttentionLayer.Builder builder) |
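The constructor is protected, so instances are created through the nested RecurrentAttentionLayer.Builder. Below is a minimal configuration sketch; the builder options shown (nIn, nOut, nHeads, projectInput) are assumptions based on DL4J's other attention layers and should be verified against the Builder in your version:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class AttentionConfigSketch {
    // Config fragment: an LSTM feeding a recurrent-attention step,
    // followed by an RNN output layer.
    static MultiLayerConfiguration build() {
        return new NeuralNetConfiguration.Builder()
                .list()
                .layer(new LSTM.Builder().nIn(8).nOut(16)
                        .activation(Activation.TANH).build())
                .layer(new RecurrentAttentionLayer.Builder()
                        .nIn(16).nOut(16)   // sizes assumed; output keeps nOut units per time step
                        .nHeads(4)          // nOut should be divisible by nHeads (assumption)
                        .projectInput(true) // project input to queries/keys/values (assumption)
                        .build())
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(16).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
    }
}
```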
Modifier and Type | Method and Description |
---|---|
void | applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig) |
SDVariable | defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask): Define the layer |
void | defineParameters(SDLayerParams params): Define the parameters for the network. |
InputType | getOutputType(int layerIndex, InputType inputType): For a given type of input to this layer, what is the type of the output? |
InputPreProcessor | getPreProcessorForInputType(InputType inputType): For the given type of input to this layer, what preprocessor (if any) is required? |
void | initializeParameters(Map<String,INDArray> params): Set the initial parameter values for this layer, if required |
void | setNIn(InputType inputType, boolean override): Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
void | validateInput(INDArray input): Validate input arrays to confirm that they fulfill the assumptions of the layer. |
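Most of these methods are invoked by the framework during network initialization rather than by user code. The following is a small sketch of the setNIn / getOutputType contract for recurrent input, assuming the layer maps InputType.recurrent(nIn, timeSteps) to a recurrent output type sized by nOut, as is usual for DL4J recurrent layers; the builder options used are also assumptions:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;

public class OutputTypeSketch {
    public static void main(String[] args) {
        // Builder options (nOut, nHeads) assumed; verify against your DL4J version.
        RecurrentAttentionLayer layer = new RecurrentAttentionLayer.Builder()
                .nOut(16)
                .nHeads(4)
                .build();

        // 8 features per time step, 20 time steps.
        InputType in = InputType.recurrent(8, 20);

        // override = true: set nIn from the input type even if already configured.
        layer.setNIn(in, true);

        // The framework calls this to infer the next layer's input type.
        InputType out = layer.getOutputType(0, in);
        System.out.println(out); // a recurrent InputType sized by nOut
    }
}
```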
Methods inherited from class SameDiffLayer: feedForwardMaskArray, instantiate
Methods inherited from class AbstractSameDiffLayer: applyGlobalConfig, getLayerParams, getMemoryReport, getRegularizationByParam, getUpdaterByParam, initializer, initWeights, isPretrainParam, onesMaskForInput, paramReshapeOrder
Methods inherited from class Layer: clone, initializeConstraints, resetLayerDefaultConfig, setDataType
Methods inherited from class Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface TrainingConfig: getGradientNormalization, getGradientNormalizationThreshold, getLayerName
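The defineParameters / initializeParameters / defineLayer methods on this page form the core SameDiffLayer contract that RecurrentAttentionLayer implements: defineParameters registers parameter names and shapes, initializeParameters fills the framework-created arrays, and defineLayer builds the symbolic forward pass. A hedged sketch of that contract in a minimal custom layer (the op calls such as math().tanh and the package paths are assumptions to verify against your DL4J version):

```java
import java.util.Map;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.samediff.SDLayerParams;
import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.ndarray.INDArray;

// Minimal custom layer illustrating the SameDiffLayer contract.
public class TinyDenseLayer extends SameDiffLayer {
    private static final int N_IN = 4, N_OUT = 3;

    @Override
    public void defineParameters(SDLayerParams params) {
        // Keys registered here become the keys of paramTable in defineLayer.
        params.addWeightParam("W", N_IN, N_OUT);
        params.addBiasParam("b", 1, N_OUT);
    }

    @Override
    public void initializeParameters(Map<String, INDArray> params) {
        // Arrays are created by the framework; we only fill in values.
        params.get("W").assign(0.01); // constant init, just for the sketch
        params.get("b").assign(0.0);
    }

    @Override
    public SDVariable defineLayer(SameDiff sameDiff, SDVariable layerInput,
                                  Map<String, SDVariable> paramTable, SDVariable mask) {
        // out = tanh(input * W + b); mask handling omitted for brevity
        SDVariable z = layerInput.mmul(paramTable.get("W")).add(paramTable.get("b"));
        return sameDiff.math().tanh(z);
    }

    @Override
    public InputType getOutputType(int layerIndex, InputType inputType) {
        return InputType.feedForward(N_OUT);
    }
}
```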
protected RecurrentAttentionLayer(RecurrentAttentionLayer.Builder builder)
public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns the appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
Overrides: getPreProcessorForInputType in class AbstractSameDiffLayer
Parameters:
inputType - InputType to this layer

public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Overrides: setNIn in class AbstractSameDiffLayer
Parameters:
inputType - Input type for this layer
override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer
For a given type of input to this layer, what is the type of the output?
Specified by: getOutputType in class Layer
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer

public void defineParameters(SDLayerParams params)
Description copied from class: AbstractSameDiffLayer
Define the parameters for the network. Use SDLayerParams.addWeightParam(String, long...) and SDLayerParams.addBiasParam(String, long...) to define the parameters.
Specified by: defineParameters in class AbstractSameDiffLayer
Parameters:
params - Object used to set parameters for this layer

public void initializeParameters(Map<String,INDArray> params)
Description copied from class: AbstractSameDiffLayer
Set the initial parameter values for this layer, if required.
Specified by: initializeParameters in class AbstractSameDiffLayer
Parameters:
params - Parameter arrays that may be initialized

public void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)
Overrides: applyGlobalConfigToLayer in class AbstractSameDiffLayer

public void validateInput(INDArray input)
Description copied from class: SameDiffLayer
Validate input arrays to confirm that they fulfill the assumptions of the layer.
Overrides: validateInput in class SameDiffLayer
Parameters:
input - input to the layer

public SDVariable defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask)
Description copied from class: SameDiffLayer
Define the layer
Specified by: defineLayer in class SameDiffLayer
Parameters:
sameDiff - SameDiff instance
layerInput - Input to the layer
paramTable - Parameter table - keys as defined by AbstractSameDiffLayer.defineParameters(SDLayerParams)
mask - Optional, may be null. Mask to apply if supported

Copyright © 2020. All rights reserved.