FNN 1.0.0
Toolbox for using neural networks in Fortran.
fnn_layer_dropout Module Reference

Module dedicated to the class dropoutlayer. More...

Data Types

type  dropoutlayer
 Implements a dropout layer. More...
 

Functions/Subroutines

type(dropoutlayer) function, public dropout_layer_fromfile (batch_size, unit_num)
 Constructor for class dropoutlayer from a file. More...
 
subroutine dropout_tofile (self, unit_num)
 Implements dropoutlayer::tofile. More...
 
subroutine dropout_apply_forward (self, train, member, x, y)
 Implements dropoutlayer::apply_forward. More...
 
subroutine dropout_apply_tangent_linear (self, member, dp, dx, dy)
 Implements dropoutlayer::apply_tangent_linear. More...
 
subroutine dropout_apply_adjoint (self, member, dy, dp, dx)
 Implements dropoutlayer::apply_adjoint. More...
 

Detailed Description

Module dedicated to the class dropoutlayer.

Function/Subroutine Documentation

◆ dropout_apply_adjoint()

subroutine fnn_layer_dropout::dropout_apply_adjoint ( class(dropoutlayer), intent(in)  self,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(inout)  dy,
real(rk), dimension(:), intent(out)  dp,
real(rk), dimension(:), intent(out)  dx 
)
private

Implements dropoutlayer::apply_adjoint.

Applies the adjoint of the layer.

The adjoint operator reads

\[d\mathbf{x} = \mathbf{z}*d\mathbf{y}.\]

Note

This method should only be called after dropoutlayer::apply_forward.

Since there are no (trainable) parameters, the parameter perturbation dp should be an empty array.

The intent of dy is declared inout rather than in for consistency with other subclasses of fnn_layer::layer.

Parameters
    [in]     self    The layer.
    [in]     member  The index inside the batch.
    [in,out] dy      The output perturbation.
    [out]    dp      The parameter perturbation.
    [out]    dx      The state perturbation.
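To illustrate the adjoint relation above, here is a minimal Python sketch (not part of the library; the function names are hypothetical). It applies $d\mathbf{x} = \mathbf{z}*d\mathbf{y}$ for a given mask $\mathbf{z}$ and checks the dot-product identity $<\mathbf{z}*d\mathbf{x}, d\mathbf{y}> = <d\mathbf{x}, \mathbf{z}*d\mathbf{y}>$ that any correct adjoint of the tangent linear must satisfy:

```python
def dropout_adjoint(z, dy):
    """Adjoint of the dropout layer: dx = z * dy.

    z is the mask drawn during the forward pass; because the layer is
    a diagonal linear operator, the adjoint reuses the same mask.
    """
    return [zi * dyi for zi, dyi in zip(z, dy)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

# Dot-product test: <L dx, dy> == <dx, L* dy> with L = diag(z).
z = [2.0, 0.0, 2.0]      # example inverted-dropout mask, rate p = 0.5
dx = [1.0, -1.0, 0.5]
dy = [0.3, 0.7, -0.2]
lhs = dot([zi * dxi for zi, dxi in zip(z, dx)], dy)
rhs = dot(dx, dropout_adjoint(z, dy))
assert abs(lhs - rhs) < 1e-12
```

In the module itself the mask is taken from dropoutlayer::draws rather than passed explicitly, which is why the method must be called after dropoutlayer::apply_forward.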

◆ dropout_apply_forward()

subroutine fnn_layer_dropout::dropout_apply_forward ( class(dropoutlayer), intent(inout)  self,
logical, intent(in)  train,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(in)  x,
real(rk), dimension(:), intent(out)  y 
)
private

Implements dropoutlayer::apply_forward.

Applies and linearises the layer.

The forward function reads

\[\mathbf{y} = \mathbf{z}*\mathbf{x},\]

where $*$ is the element-wise multiplication for vectors.

In training mode, $z_{i}=0$ with probability $p$ and $z_{i}=1/(1-p)$ with probability $1-p$ where $p$ is the dropout rate.

In testing mode, $z_{i}=1$.

Note

Input parameter member should be less than layer::batch_size.

The linearisation is trivial and only requires storing the values of $\mathbf{z}$ in dropoutlayer::draws.

Parameters
    [in,out] self    The layer.
    [in]     train   Whether the model is used in training mode.
    [in]     member  The index inside the batch.
    [in]     x       The input of the layer.
    [out]    y       The output of the layer.
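The forward rule can be sketched as follows in Python (an illustration only; the function name and the returned mask are conventions of this sketch, whereas the module stores the mask internally in dropoutlayer::draws):

```python
import random

def dropout_forward(x, rate, train, seed=0):
    """Forward pass of an inverted-dropout layer: y = z * x.

    In training mode, z_i = 0 with probability `rate` and
    z_i = 1/(1 - rate) otherwise; in testing mode z_i = 1, so the
    layer reduces to the identity.  Returns (y, z): the mask z is
    kept so the tangent-linear and adjoint operators can reuse it.
    """
    rng = random.Random(seed)
    if train:
        z = [0.0 if rng.random() < rate else 1.0 / (1.0 - rate) for _ in x]
    else:
        z = [1.0] * len(x)
    y = [zi * xi for zi, xi in zip(z, x)]
    return y, z
```

The $1/(1-p)$ scaling keeps the expected value of each output equal to its input, so no rescaling is needed at test time.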

◆ dropout_apply_tangent_linear()

subroutine fnn_layer_dropout::dropout_apply_tangent_linear ( class(dropoutlayer), intent(in)  self,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(in)  dp,
real(rk), dimension(:), intent(in)  dx,
real(rk), dimension(:), intent(out)  dy 
)
private

Implements dropoutlayer::apply_tangent_linear.

Applies the TL of the layer.

The TL operator reads

\[d\mathbf{y} = \mathbf{z}*d\mathbf{x}.\]

Note

This method should only be called after dropoutlayer::apply_forward.

Since there are no (trainable) parameters, the parameter perturbation dp should be an empty array.

Parameters
    [in]  self    The layer.
    [in]  member  The index inside the batch.
    [in]  dp      The parameter perturbation.
    [in]  dx      The state perturbation.
    [out] dy      The output perturbation.
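The TL operator is simply the mask applied to the state perturbation; a minimal Python sketch (illustrative only, with a hypothetical function name) reads:

```python
def dropout_tangent_linear(z, dx):
    # Tangent-linear operator: dy = z * dx, reusing the mask z drawn
    # during the forward pass.  The layer has no trainable parameters,
    # so the parameter perturbation dp plays no role here.
    return [zi * dxi for zi, dxi in zip(z, dx)]
```

Note that because the operator is linear in dx, the TL coincides with the forward map for a fixed mask.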

◆ dropout_layer_fromfile()

type(dropoutlayer) function, public fnn_layer_dropout::dropout_layer_fromfile ( integer(ik), intent(in)  batch_size,
integer(ik), intent(in)  unit_num 
)

Constructor for class dropoutlayer from a file.

Parameters
    [in] batch_size  The value for layer::batch_size.
    [in] unit_num    The unit number for the read statements.
Returns
The constructed layer.

◆ dropout_tofile()

subroutine fnn_layer_dropout::dropout_tofile ( class(dropoutlayer), intent(in)  self,
integer(ik), intent(in)  unit_num 
)
private

Implements dropoutlayer::tofile.

Saves the layer.

Parameters
    [in] self      The layer.
    [in] unit_num  The unit number for the write statement.