FNN 1.0.0
A toolbox for using neural networks (NNs) in Fortran.
fnn_activation_relu Module Reference

Module dedicated to the class reluactivation. More...

Data Types

type  reluactivation
 Implements a relu activation function. More...
 

Functions/Subroutines

type(reluactivation) function, public construct_relu_activation (self_size, batch_size)
 Constructor for class reluactivation. More...
 
subroutine relu_tofile (self, unit_num)
 Implements reluactivation::tofile. More...
 
subroutine relu_apply_forward (self, member, z, y)
 Implements reluactivation::apply_forward. More...
 

Detailed Description

Module dedicated to the class reluactivation.

Function/Subroutine Documentation

◆ construct_relu_activation()

type(reluactivation) function, public fnn_activation_relu::construct_relu_activation (integer(ik), intent(in) self_size, integer(ik), intent(in) batch_size)

Constructor for class reluactivation.

Parameters
    [in]  self_size   The value for linearactivation::self_size.
    [in]  batch_size  The value for linearactivation::batch_size.
Returns
The constructed activation function.
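As a usage sketch, the constructor could be called as below. Only construct_relu_activation, its two arguments, and the reluactivation type come from this page; the program wrapper, the argument values, and the assumption that default-kind integer literals are compatible with integer(ik) are illustrative.

```fortran
program relu_constructor_example
    ! Sketch only: module, type, and constructor names come from this page;
    ! the literal kinds are assumed compatible with integer(ik).
    use fnn_activation_relu, only: reluactivation, construct_relu_activation
    implicit none

    type(reluactivation) :: activation

    ! Build a ReLU activation for vectors of size 10 and a batch of 32.
    activation = construct_relu_activation(10, 32)
end program relu_constructor_example
```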

◆ relu_apply_forward()

subroutine fnn_activation_relu::relu_apply_forward (class(reluactivation), intent(inout) self, integer(ik), intent(in) member, real(rk), dimension(:), intent(in) z, real(rk), dimension(:), intent(out) y)
private

Implements reluactivation::apply_forward.

Applies and linearises the activation function.

The activation function reads

\[ \mathbf{y} = \mathcal{A}(\mathbf{z}) = \mathrm{relu}(\mathbf{z}),\]

and the associated linearisation reads

\[ \mathbf{A}(\mathbf{z}) = \mathrm{diag}(\mathrm{relu}'(\mathbf{z})) = \mathrm{diag}(H(\mathbf{z})),\]

where \(H\) is the elementwise Heaviside step function (1 for positive components of \(\mathbf{z}\), 0 otherwise).

Note

Input parameter member should be less than linearactivation::batch_size.

The linearisation is stored in nonlinearactivation::z_prime, which is why the intent of self is declared inout.

Parameters
    [in,out]  self    The activation function.
    [in]      member  The index inside the batch.
    [in]      z       The input of the activation function.
    [out]     y       The output of the activation function.
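Since relu_apply_forward is private, it is presumably reached through the type-bound procedure reluactivation::apply_forward that it implements; the binding name and the assumption that default real/integer kinds match rk/ik below are illustrative, not taken from this page.

```fortran
program relu_forward_example
    ! Sketch only: assumes apply_forward is the type-bound name bound to
    ! relu_apply_forward, and that default kinds match rk and ik.
    use fnn_activation_relu, only: reluactivation, construct_relu_activation
    implicit none

    type(reluactivation) :: activation
    real :: z(4), y(4)

    activation = construct_relu_activation(4, 1)
    z = [-2.0, -0.5, 0.5, 2.0]

    ! member = 1: first (and only) member of the batch.
    call activation%apply_forward(1, z, y)

    ! Mathematically, relu zeroes the negative entries,
    ! so y should be [0.0, 0.0, 0.5, 2.0].
    print *, y
end program relu_forward_example
```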

◆ relu_tofile()

subroutine fnn_activation_relu::relu_tofile (class(reluactivation), intent(in) self, integer(ik), intent(in) unit_num)
private

Implements reluactivation::tofile.

Saves the activation function.

Parameters
    [in]  self      The activation function to save.
    [in]  unit_num  The unit number for the write statement.
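A hedged sketch of saving an activation: unit_num is obtained from open(newunit=...) and passed to the save routine, presumably through the type-bound name reluactivation::tofile (the binding name and the file name are assumptions).

```fortran
program relu_tofile_example
    ! Sketch only: assumes tofile is the type-bound name bound to relu_tofile.
    use fnn_activation_relu, only: reluactivation, construct_relu_activation
    implicit none

    type(reluactivation) :: activation
    integer :: unit_num

    activation = construct_relu_activation(10, 32)

    ! newunit= picks a free unit number, which is then passed as unit_num.
    open(newunit=unit_num, file='relu_activation.txt', action='write')
    call activation%tofile(unit_num)
    close(unit_num)
end program relu_tofile_example
```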