FNN 1.0.0
Toolbox for using neural networks in Fortran.
fnn_layer Module Reference

Module dedicated to the class layer. More...

Data Types

type  layer
 Base class for all layers. Do not instantiate. More...
 

Functions/Subroutines

integer(ik) function layer_get_num_parameters (self)
 Implements layer::get_num_parameters. More...
 
subroutine layer_set_parameters (self, new_parameters)
 Implements layer::set_parameters. More...
 
subroutine layer_get_parameters (self, parameters)
 Implements layer::get_parameters. More...
 
subroutine layer_tofile (self, unit_num)
 Implements layer::tofile. More...
 
subroutine layer_apply_forward (self, train, member, x, y)
 Implements layer::apply_forward. More...
 
subroutine layer_apply_tangent_linear (self, member, dp, dx, dy)
 Implements layer::apply_tangent_linear. More...
 
subroutine layer_apply_adjoint (self, member, dy, dp, dx)
 Implements layer::apply_adjoint. More...
 

Detailed Description

Module dedicated to the class layer.

Function/Subroutine Documentation

◆ layer_apply_adjoint()

subroutine fnn_layer::layer_apply_adjoint ( class(layer), intent(in)  self,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(inout)  dy,
real(rk), dimension(:), intent(out)  dp,
real(rk), dimension(:), intent(out)  dx 
)
private

Implements layer::apply_adjoint.

Applies the adjoint of the layer.

The adjoint operator reads

\[ d\mathbf{p} = [\mathbf{F}^\mathrm{p}]^\top d\mathbf{y},\]

\[ d\mathbf{x} = [\mathbf{F}^\mathrm{x}]^\top d\mathbf{y},\]

where $\mathbf{F}^\mathrm{p}$ is the TL of $\mathcal{F}$ with respect to the $\mathbf{p}$ component and $\mathbf{F}^\mathrm{x}$ is the TL of $\mathcal{F}$ with respect to the $\mathbf{x}$ component.

Note

This should be overridden by each subclass.

The intent of dy is declared inout instead of in because, for certain subclasses (e.g. fnn_layer_dense::denselayer), the value of dy is overwritten during the computation (an undesirable side effect).

Parameters
[in]      self    The layer.
[in]      member  The index inside the batch.
[in,out]  dy      The output perturbation.
[out]     dp      The parameter perturbation.
[out]     dx      The state perturbation.
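As a generic illustration (not part of the library), an adjoint implementation can be validated with the dot-product test: for any perturbations, $\langle \mathbf{F}^\mathrm{x} d\mathbf{x}, d\mathbf{y}\rangle = \langle d\mathbf{x}, [\mathbf{F}^\mathrm{x}]^\top d\mathbf{y}\rangle$. A minimal sketch in Python, assuming a small hypothetical TL matrix Fx standing in for the linearised layer:

```python
def matvec(A, v):
    """Apply the TL operator A to a perturbation v."""
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def matvec_T(A, v):
    """Apply the adjoint (transpose) of A to v."""
    return [sum(A[i][j] * v[i] for i in range(len(A))) for j in range(len(A[0]))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

Fx = [[0.5, -0.25], [0.1, 0.3]]   # hypothetical TL operator F^x
dx = [1.0, 2.0]                   # state perturbation
dy = [0.3, -0.7]                  # output perturbation

lhs = dot(matvec(Fx, dx), dy)     # <F^x dx, dy>
rhs = dot(dx, matvec_T(Fx, dy))   # <dx, [F^x]^T dy>
```

The two inner products must agree to machine precision; the same test applies to the $\mathbf{F}^\mathrm{p}$ component.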

◆ layer_apply_forward()

subroutine fnn_layer::layer_apply_forward ( class(layer), intent(inout)  self,
logical, intent(in)  train,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(in)  x,
real(rk), dimension(:), intent(out)  y 
)
private

Implements layer::apply_forward.

Applies and linearises the layer.

The forward function reads

\[ \mathbf{y} = \mathcal{F}(\mathbf{p}, \mathbf{x}),\]

where $\mathbf{p}$ is the vector of parameters.

Note

This should be overridden by each subclass.

The intent for self is declared inout instead of in because, for certain subclasses (e.g. fnn_layer_dense::denselayer), the linearisation is stored inside the layer.

Parameters
[in,out]  self    The layer.
[in]      train   Whether the model is used in training mode.
[in]      member  The index inside the batch.
[in]      x       The input of the layer.
[out]     y       The output of the layer.
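For illustration only, here is what a forward function of the form $\mathbf{y} = \mathcal{F}(\mathbf{p}, \mathbf{x})$ looks like for a hypothetical dense layer with a tanh activation (the weights, bias, and activation below are assumptions, not the library's actual implementation):

```python
import math

def dense_forward(W, b, x):
    """Forward pass of a hypothetical dense layer: y = tanh(W x + b).

    Here the parameter vector p consists of the entries of W and b.
    """
    # z_i = sum_j W[i][j] * x[j] + b[i]
    z = [sum(wij * xj for wij, xj in zip(Wi, x)) + bi
         for Wi, bi in zip(W, b)]
    # Elementwise tanh activation
    return [math.tanh(zi) for zi in z]

W = [[0.5, -0.25], [0.1, 0.3]]
b = [0.0, 0.1]
x = [1.0, 2.0]
y = dense_forward(W, b, x)
```

In training mode, a subclass would additionally store the linearisation of $\mathcal{F}$ at $(\mathbf{p}, \mathbf{x})$ for later use by the TL and adjoint operators, which is why self carries intent(inout) here.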

◆ layer_apply_tangent_linear()

subroutine fnn_layer::layer_apply_tangent_linear ( class(layer), intent(in)  self,
integer(ik), intent(in)  member,
real(rk), dimension(:), intent(in)  dp,
real(rk), dimension(:), intent(in)  dx,
real(rk), dimension(:), intent(out)  dy 
)
private

Implements layer::apply_tangent_linear.

Applies the TL of the layer.

The TL operator reads

\[ d\mathbf{y} = \mathbf{F}^\mathrm{p}d\mathbf{p} + 
 \mathbf{F}^\mathrm{x}d\mathbf{x},\]

where $\mathbf{F}^\mathrm{p}$ is the TL of $\mathcal{F}$ with respect to the $\mathbf{p}$ component and $\mathbf{F}^\mathrm{x}$ is the TL of $\mathcal{F}$ with respect to the $\mathbf{x}$ component.

Note

This should be overridden by each subclass.

Parameters
[in]   self    The layer.
[in]   member  The index inside the batch.
[in]   dp      The parameter perturbation.
[in]   dx      The state perturbation.
[out]  dy      The output perturbation.
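A TL implementation is commonly checked against a finite difference of the forward function: $d\mathbf{y} \approx [\mathcal{F}(\mathbf{p} + \epsilon\, d\mathbf{p}, \mathbf{x} + \epsilon\, d\mathbf{x}) - \mathcal{F}(\mathbf{p}, \mathbf{x})] / \epsilon$ for small $\epsilon$. A minimal sketch, assuming a hypothetical scalar layer $y = \tanh(p\,x)$ (not the library's actual forward function):

```python
import math

def forward(p, x):
    # Hypothetical scalar layer: y = tanh(p * x)
    return math.tanh(p * x)

def tangent_linear(p, x, dp, dx):
    # TL of tanh(p * x): d/dp = x * sech^2(p x), d/dx = p * sech^2(p x)
    s = 1.0 - math.tanh(p * x) ** 2   # sech^2(p x)
    return x * s * dp + p * s * dx

p, x, dp, dx = 0.7, 1.3, 0.02, -0.01
eps = 1e-6

# Finite-difference approximation of the directional derivative
fd = (forward(p + eps * dp, x + eps * dx) - forward(p, x)) / eps
# TL prediction dy = F^p dp + F^x dx
tl = tangent_linear(p, x, dp, dx)
```

The discrepancy between fd and tl should shrink linearly with eps, which is the standard gradient test for TL code.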

◆ layer_get_num_parameters()

integer(ik) function fnn_layer::layer_get_num_parameters ( class(layer), intent(in)  self)

Implements layer::get_num_parameters.

Returns the number of parameters.

Parameters
[in]  self  The layer.
Returns
The number of parameters.

◆ layer_get_parameters()

subroutine fnn_layer::layer_get_parameters ( class(layer), intent(in)  self,
real(rk), dimension(:), intent(out)  parameters 
)
private

Implements layer::get_parameters.

Getter for layer::parameters.

Parameters
[in]   self        The layer.
[out]  parameters  The vector of parameters.

◆ layer_set_parameters()

subroutine fnn_layer::layer_set_parameters ( class(layer), intent(inout)  self,
real(rk), dimension(:), intent(in)  new_parameters 
)
private

Implements layer::set_parameters.

Setter for layer::parameters.

Parameters
[in,out]  self            The layer.
[in]      new_parameters  The new values for the parameters.

◆ layer_tofile()

subroutine fnn_layer::layer_tofile ( class(layer), intent(in)  self,
integer(ik), intent(in)  unit_num 
)
private

Implements layer::tofile.

Saves the layer.

Note

This should be overridden by each subclass.

Parameters
[in]  self      The layer.
[in]  unit_num  The unit number for the write statement.