FNN 1.0.0
A toolbox for using neural networks (NNs) in Fortran.
Module dedicated to the class dropoutlayer.

Data Types
    type dropoutlayer
        Implements a dropout layer.

Functions/Subroutines
    type(dropoutlayer) function, public dropout_layer_fromfile (batch_size, unit_num)
        Constructor for class dropoutlayer from a file.
    subroutine dropout_tofile (self, unit_num)
        Implements dropoutlayer::tofile.
    subroutine dropout_apply_forward (self, train, member, x, y)
        Implements dropoutlayer::apply_forward.
    subroutine dropout_apply_tangent_linear (self, member, dp, dx, dy)
        Implements dropoutlayer::apply_tangent_linear.
    subroutine dropout_apply_adjoint (self, member, dy, dp, dx)
        Implements dropoutlayer::apply_adjoint.
Module dedicated to the class dropoutlayer.
subroutine fnn_layer_dropout::dropout_apply_adjoint (self, member, dy, dp, dx)
private
Implements dropoutlayer::apply_adjoint.
Applies the adjoint of the layer.
The adjoint operator reads \( dx = r \circ dy \), where \( r \) is the dropout mask stored during the last call to dropoutlayer::apply_forward and \( \circ \) denotes the element-wise product.
Note
This method should only be called after dropoutlayer::apply_forward.
Since there are no (trainable) parameters, the parameter perturbation dp should be an empty array.
The intent of dy is declared inout instead of in because of other subclasses of fnn_layer::layer.
Parameters
    [in,out]  self    The layer.
    [in]      member  The index inside the batch.
    [in,out]  dy      The output perturbation.
    [out]     dp      The parameter perturbation.
    [out]     dx      The state perturbation.
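A minimal usage sketch: the adjoint is applied after a forward pass, through the type-bound procedures dropoutlayer::apply_forward and dropoutlayer::apply_adjoint documented on this page. The layer size, the file name and the use of default real and integer kinds are assumptions made for the illustration; the essential points are that the forward pass comes first and that dp is an empty array.

    ! Sketch: adjoint of a dropout layer after a forward pass (kinds, size and file name are assumed).
    program example_dropout_adjoint
        use fnn_layer_dropout, only: dropoutlayer, dropout_layer_fromfile
        implicit none
        integer, parameter :: n = 3        ! assumed layer size, must match the saved layer
        type(dropoutlayer) :: layer
        real :: x(n), y(n), dy(n), dx(n)
        real :: dp(0)                      ! no trainable parameters: empty perturbation
        integer :: unit_num

        open(newunit=unit_num, file='dropout.sav', action='read')  ! hypothetical saved layer file
        layer = dropout_layer_fromfile(1, unit_num)                ! batch_size = 1
        close(unit_num)

        x = 1.0
        call layer%apply_forward(.true., 1, x, y)  ! stores the dropout mask for member 1
        dy = 1.0
        call layer%apply_adjoint(1, dy, dp, dx)    ! dx receives the adjoint applied to dy
        print *, dx
    end program example_dropout_adjoint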
subroutine fnn_layer_dropout::dropout_apply_forward (self, train, member, x, y)
private
Implements dropoutlayer::apply_forward.
Applies and linearises the layer.
The forward function reads \( y = r \circ x \), where \( \circ \) denotes the element-wise product and \( r \) is the dropout mask.
In training mode, each component of \( r \) is set to \( 0 \) with probability equal to the dropout rate and to \( 1/(1-\mathrm{rate}) \) otherwise (the usual inverted dropout convention).
In testing mode, \( r = 1 \) and the layer reduces to the identity.
Note
Input parameter member should be less than layer::batch_size.
The linearisation is trivial and only requires storing the values of \( r \).
Parameters
    [in,out]  self    The layer.
    [in]      train   Whether the model is used in training mode.
    [in]      member  The index inside the batch.
    [in]      x       The input of the layer.
    [out]     y       The output of the layer.
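A minimal usage sketch: the same input is passed once in training mode (a new random mask is drawn and stored) and once in testing mode (the input is left unchanged). The layer size, file name and default kinds are assumptions made for the illustration.

    ! Sketch: forward pass in training and testing mode (kinds, size and file name are assumed).
    program example_dropout_forward
        use fnn_layer_dropout, only: dropoutlayer, dropout_layer_fromfile
        implicit none
        integer, parameter :: n = 3        ! assumed layer size, must match the saved layer
        type(dropoutlayer) :: layer
        real :: x(n), y_train(n), y_test(n)
        integer :: unit_num

        open(newunit=unit_num, file='dropout.sav', action='read')  ! hypothetical saved layer file
        layer = dropout_layer_fromfile(1, unit_num)                ! batch_size = 1
        close(unit_num)

        x = 1.0
        call layer%apply_forward(.true.,  1, x, y_train)  ! training mode: random mask applied
        call layer%apply_forward(.false., 1, x, y_test)   ! testing mode: identity
        print *, y_train
        print *, y_test
    end program example_dropout_forward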
subroutine fnn_layer_dropout::dropout_apply_tangent_linear (self, member, dp, dx, dy)
private
Implements dropoutlayer::apply_tangent_linear.
Applies the TL (tangent linear) of the layer.
The TL operator reads \( dy = r \circ dx \), where \( r \) is the dropout mask stored during the last call to dropoutlayer::apply_forward and \( \circ \) denotes the element-wise product.
Note
This method should only be called after dropoutlayer::apply_forward.
Since there are no (trainable) parameters, the parameter perturbation dp should be an empty array.
Parameters
    [in]   self    The layer.
    [in]   member  The index inside the batch.
    [in]   dp      The parameter perturbation.
    [in]   dx      The state perturbation.
    [out]  dy      The output perturbation.
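A minimal usage sketch: the tangent linear reuses the mask stored by the last forward call for the same batch member, and dp is empty because the layer has no trainable parameters. The layer size, file name and default kinds are assumptions made for the illustration.

    ! Sketch: tangent linear of a dropout layer after a forward pass (kinds, size and file name are assumed).
    program example_dropout_tl
        use fnn_layer_dropout, only: dropoutlayer, dropout_layer_fromfile
        implicit none
        integer, parameter :: n = 3        ! assumed layer size, must match the saved layer
        type(dropoutlayer) :: layer
        real :: x(n), y(n), dx(n), dy(n)
        real :: dp(0)                      ! no trainable parameters: empty perturbation
        integer :: unit_num

        open(newunit=unit_num, file='dropout.sav', action='read')  ! hypothetical saved layer file
        layer = dropout_layer_fromfile(1, unit_num)                ! batch_size = 1
        close(unit_num)

        x = 1.0
        call layer%apply_forward(.true., 1, x, y)       ! stores the mask for member 1
        dx = 0.1
        call layer%apply_tangent_linear(1, dp, dx, dy)  ! dy receives the TL applied to dx
        print *, dy
    end program example_dropout_tl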
type(dropoutlayer) function, public fnn_layer_dropout::dropout_layer_fromfile (integer(ik), intent(in) batch_size, integer(ik), intent(in) unit_num)
Constructor for class dropoutlayer from a file.
Parameters
    [in]  batch_size  The value for layer::batch_size.
    [in]  unit_num    The unit number for the read statements.
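A minimal usage sketch: the constructor reads the layer definition from a unit that is already open for reading. The file name and the default integer kind are assumptions made for the illustration.

    ! Sketch: constructing a dropoutlayer from a previously saved file (file name and kinds are assumed).
    program example_dropout_fromfile
        use fnn_layer_dropout, only: dropoutlayer, dropout_layer_fromfile
        implicit none
        type(dropoutlayer) :: layer
        integer :: unit_num

        open(newunit=unit_num, file='dropout.sav', action='read')  ! hypothetical saved layer file
        layer = dropout_layer_fromfile(16, unit_num)               ! batch_size = 16
        close(unit_num)
    end program example_dropout_fromfile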
subroutine fnn_layer_dropout::dropout_tofile (self, unit_num)
private
Implements dropoutlayer::tofile.
Saves the layer.
Parameters
    [in]  self      The layer.
    [in]  unit_num  The unit number for the write statement.
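A minimal usage sketch: the layer is written to a unit that is already open for writing, through the type-bound procedure dropoutlayer::tofile. Here a layer read from one file is saved back to another; both file names and the default kinds are assumptions made for the illustration.

    ! Sketch: saving a dropoutlayer with the type-bound tofile (file names and kinds are assumed).
    program example_dropout_tofile
        use fnn_layer_dropout, only: dropoutlayer, dropout_layer_fromfile
        implicit none
        type(dropoutlayer) :: layer
        integer :: unit_in, unit_out

        open(newunit=unit_in, file='dropout.sav', action='read')   ! hypothetical saved layer file
        layer = dropout_layer_fromfile(1, unit_in)
        close(unit_in)

        open(newunit=unit_out, file='dropout_copy.sav', action='write')
        call layer%tofile(unit_out)   ! writes the layer definition to unit_out
        close(unit_out)
    end program example_dropout_tofile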