Activations

Enums

enum transfer_t

Values:

enumerator logistic
enumerator loggy
enumerator relu
enumerator elu
enumerator relie
enumerator ramp
enumerator linear
enumerator Tanh
enumerator plse
enumerator leaky
enumerator stair
enumerator hardtan
enumerator lhtan
enumerator selu
enumerator elliot
enumerator symm_elliot
enumerator softplus
enumerator softsign
enumerator asymm_logistic
enumerator sigmoid
namespace transfer

Functions

float linear(const float &x)

Linear activation function.

The activation function follows the equation:

f(x) = x

Parameters

x – Input variable.

Returns

The activated input.

float g_linear(const float &x)

Gradient of the Linear activation function.

The gradient is equal to:

f'(x) = 1

Parameters

x – Input variable.

Returns

The gradient of the input.
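The two formulas above can be sketched directly; this is a minimal illustration of the documented equations, not the library source (names with a `_sketch` suffix are stand-ins):

```cpp
#include <cassert>

// f(x) = x : the identity activation.
inline float linear_sketch (const float & x)
{
  return x;
}

// f'(x) = 1 : the gradient is constant, independent of x.
inline float g_linear_sketch (const float & x)
{
  (void) x;  // unused, kept for a uniform signature
  return 1.f;
}
```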

float stair(const float &x)

Stair activation function.

The activation function follows the equation:

f(x) = ...

Parameters

x – Input variable.

Returns

The activated input.

float g_stair(const float &x)

Gradient of the Stair activation function.

The gradient is equal to:

f'(x) = ...

Parameters

x – Input variable.

Returns

The gradient of the input.

float hardtan(const float &x)

HardTan activation function.

The activation function follows the equation:

if x < -2.5:
  return 0.
elif x > 2.5:
  return 1.
else:
  return 0.2 * x + 0.5

Parameters

x – Input variable.

Returns

The activated input.

float g_hardtan(const float &x)

Gradient of the HardTan activation function.

The gradient is equal to:

if x > -2.5 and x < 2.5:
  return 0.2
else:
  return 0.0

Parameters

x – Input variable.

Returns

The gradient of the input.
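The piecewise definition above translates to a simple branch; this is a hedged sketch of the documented formula, not the library's implementation:

```cpp
#include <cassert>

// Clamp-and-scale activation: 0 below -2.5, 1 above 2.5,
// and the linear ramp 0.2 * x + 0.5 in between.
inline float hardtan_sketch (const float & x)
{
  if (x < -2.5f) return 0.f;
  if (x >  2.5f) return 1.f;
  return 0.2f * x + 0.5f;
}

// Gradient: the slope of the ramp (0.2) inside (-2.5, 2.5), zero outside.
inline float g_hardtan_sketch (const float & x)
{
  return (x > -2.5f && x < 2.5f) ? 0.2f : 0.f;
}
```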

float logistic(const float &x)

Logistic (sigmoid) activation function.

The activation function follows the equation:

f(x) = 1. / (1. + exp(-x))

Parameters

x – Input variable.

Returns

The activated input.

float g_logistic(const float &x)

Gradient of the Logistic activation function.

The gradient is equal to:

f'(x) = (1. - x) * x

Parameters

x – Input variable.

Returns

The gradient of the input.
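A sketch of the two formulas above. Note that the gradient `(1. - x) * x` only makes sense when its argument is the *already-activated* value y = f(x) (the usual Darknet-style convention), which is how it is written here:

```cpp
#include <cassert>
#include <cmath>

// f(x) = 1 / (1 + exp(-x)) : the logistic sigmoid.
inline float logistic_sketch (const float & x)
{
  return 1.f / (1.f + std::exp(-x));
}

// Gradient expressed in terms of the activated output y = f(x):
// f'(x) = y * (1 - y).
inline float g_logistic_sketch (const float & y)
{
  return (1.f - y) * y;
}
```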

float loggy(const float &x)

Loggy activation function.

The activation function follows the equation:

f(x) = 2. / (1. + exp(-x)) - 1.

Parameters

x – Input variable.

Returns

The activated input.

float g_loggy(const float &x)

Gradient of the Loggy activation function.

The gradient is equal to:

y = (x + 1.) * 0.5
f'(x) = 2. * (1. - y) * y

Parameters

x – Input variable.

Returns

The gradient of the input.
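The Loggy pair, sketched from the equations above (like the logistic gradient, the `x` fed to the gradient is the activated output, as the `y = (x + 1.) * 0.5` remapping implies):

```cpp
#include <cassert>
#include <cmath>

// f(x) = 2 / (1 + exp(-x)) - 1 : a sigmoid rescaled to (-1, 1).
inline float loggy_sketch (const float & x)
{
  return 2.f / (1.f + std::exp(-x)) - 1.f;
}

// Gradient in terms of the activated output: map back to (0, 1),
// then apply the scaled sigmoid derivative.
inline float g_loggy_sketch (const float & x)
{
  const float y = (x + 1.f) * 0.5f;
  return 2.f * (1.f - y) * y;
}
```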

float relu(const float &x)

ReLU activation function.

The activation function follows the equation:

f(x) = max(0, x)

Parameters

x – Input variable.

Returns

The activated input.

float g_relu(const float &x)

Gradient of the ReLU activation function.

The gradient is equal to:

f'(x) = 1 if x > 0 else 0

Parameters

x – Input variable.

Returns

The gradient of the input.
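The ReLU pair is a one-line branch each; a minimal sketch of the documented formulas:

```cpp
#include <cassert>

// f(x) = max(0, x)
inline float relu_sketch (const float & x)
{
  return x > 0.f ? x : 0.f;
}

// f'(x) = 1 if x > 0 else 0
inline float g_relu_sketch (const float & x)
{
  return x > 0.f ? 1.f : 0.f;
}
```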

float elu(const float &x)

ELU activation function.

The activation function follows the equation:

y = x >= 0
f(x) = y * x + ~y * (exp(x) - 1.)

Parameters

x – Input variable.

Returns

The activated input.

float g_elu(const float &x)

Gradient of the ELU activation function.

The gradient is equal to:

y = x >= 0
f'(x) = y + ~y * (x + 1.)

Parameters

x – Input variable.

Returns

The gradient of the input.
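The boolean-mask notation above (`y * x + ~y * (...)` with `y = x >= 0`) reduces to a plain branch. A hedged sketch, using the standard ELU definition exp(x) - 1 on the negative side; as with the logistic gradient, the gradient's argument is the activated output:

```cpp
#include <cassert>
#include <cmath>

// ELU: identity for x >= 0, exp(x) - 1 for x < 0.
inline float elu_sketch (const float & x)
{
  return x >= 0.f ? x : std::exp(x) - 1.f;
}

// Gradient in terms of the activated output f(x):
// 1 on the positive side; on the negative side exp(x) = f(x) + 1.
inline float g_elu_sketch (const float & x)
{
  return x >= 0.f ? 1.f : x + 1.f;
}
```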

float relie(const float &x)

Relie activation function.

The activation function follows the equation:

f(x) = x if x > 0 else 0.001 * x

Parameters

x – Input variable.

Returns

The activated input.

float g_relie(const float &x)

Gradient of the Relie activation function.

The gradient is equal to:

f'(x) = 1 if x > 0 else 0.001

Parameters

x – Input variable.

Returns

The gradient of the input.
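The Relie (leaky-style) pair, sketched from the equations above:

```cpp
#include <cassert>

// f(x) = x if x > 0 else 0.001 * x : a small leak on the negative side.
inline float relie_sketch (const float & x)
{
  return x > 0.f ? x : 0.001f * x;
}

// f'(x) = 1 if x > 0 else 0.001
inline float g_relie_sketch (const float & x)
{
  return x > 0.f ? 1.f : 0.001f;
}
```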

float ramp(const float &x)
float g_ramp(const float &x)
float leaky(const float &x)
float g_leaky(const float &x)
float tanhy(const float &x)
float g_tanhy(const float &x)
float plse(const float &x)
float g_plse(const float &x)
float lhtan(const float &x)
float g_lhtan(const float &x)
float selu(const float &x)
float g_selu(const float &x)
float elliot(const float &x)
float g_elliot(const float &x)
float symm_elliot(const float &x)
float g_symm_elliot(const float &x)
float softplus(const float &x)
float g_softplus(const float &x)
float softsign(const float &x)
float g_softsign(const float &x)
float asymm_logistic(const float &x)
float g_asymm_logistic(const float &x)
void swish_array(const float *x, const int &n, float *output_sigmoid, float *output)
void swish_gradient(const float *x, const int &n, const float *sigmoid, float *delta)
void mish_array(const float *x, const int &n, float *input_activation, float *output)
void mish_gradient(const int &n, const float *activation_input, float *delta)
std::function<float(const float&)> activate(const int &active)

Switch case between activation functions.

This function selects the desired activation function (returned as a pointer to function) from its “name” in the enum transfer_t. If the input integer is outside the enum range, a nullptr is returned.

Parameters

active – Integer from the enum activation types.

Returns

Pointer to the desired function.

std::function<float(const float&)> gradient(const int &active)

Switch case between gradient functions.

This function selects the desired gradient function (returned as a pointer to function) from its “name” in the enum transfer_t. If the input integer is outside the enum range, a nullptr is returned.

Parameters

active – Integer from the enum activation types.

Returns

Pointer to the desired function.
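The switch-case dispatch described above can be sketched as follows. This is a reduced illustration with a stand-in two-value enum and two activations, not the library's `activate`/`gradient`, which cover the full transfer_t list:

```cpp
#include <cassert>
#include <cmath>
#include <functional>

// Stand-in enum mirroring the structure of transfer_t.
enum sketch_transfer_t { sk_logistic, sk_relu };

inline float sk_logistic_fn (const float & x) { return 1.f / (1.f + std::exp(-x)); }
inline float sk_relu_fn     (const float & x) { return x > 0.f ? x : 0.f; }

// Select an activation by its enum value; out-of-range input
// yields a nullptr, as the documentation above specifies.
std::function<float(const float &)> activate_sketch (const int & active)
{
  switch (active)
  {
    case sk_logistic: return sk_logistic_fn;
    case sk_relu:     return sk_relu_fn;
    default:          return nullptr;
  }
}
```

The gradient dispatcher follows the identical pattern, returning the `g_*` counterpart of each activation.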

Variables

static const std::unordered_map<std::string, int> get_activation{{"logistic", logistic}, {"sigmoid", logistic}, {"loggy", loggy}, {"relu", relu}, {"elu", elu}, {"relie", relie}, {"ramp", ramp}, {"linear", linear}, {"tanh", Tanh}, {"plse", plse}, {"leaky", leaky}, {"stair", stair}, {"hardtan", hardtan}, {"lhtan", lhtan}, {"selu", selu}, {"elliot", elliot}, {"s_elliot", symm_elliot}, {"softplus", softplus}, {"softsign", softsign}, {"as_logistic", asymm_logistic}}

Utility for activation management: maps each activation name (string) to the corresponding transfer_t enum value.
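A sketch of how such a name→enum map is typically queried. The names mirror the table above; the integer values here are stand-ins for the real transfer_t enumerators:

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Reduced stand-in for the get_activation map; note the aliases
// ("sigmoid" resolving to the same value as "logistic").
static const std::unordered_map<std::string, int> get_activation_sketch {
  {"logistic", 0}, {"sigmoid", 0}, {"relu", 2}, {"tanh", 7}
};

// Look up an activation by name; return -1 for unknown names
// instead of throwing, so callers can validate user input.
inline int activation_from_name (const std::string & name)
{
  const auto it = get_activation_sketch.find(name);
  return it != get_activation_sketch.end() ? it->second : -1;
}
```

The resulting integer can then be passed to `activate`/`gradient` to obtain the corresponding function pointers.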