docs/api_guides/low_level/layers/activations_en.rst
29 additions & 38 deletions
@@ -6,44 +6,35 @@ Activation Function
 
 The activation function incorporates non-linearity properties into the neural network.
 
-PaddlePaddle Fluid supports most of the activation functions, including:
-
-:ref:`api_fluid_layers_relu`,
-:ref:`api_fluid_layers_tanh`,
-:ref:`api_fluid_layers_sigmoid`,
-:ref:`api_fluid_layers_elu`,
-:ref:`api_fluid_layers_relu6`,
-:ref:`api_fluid_layers_pow`,
-:ref:`api_fluid_layers_stanh`,
-:ref:`api_fluid_layers_hard_sigmoid`,
-:ref:`api_fluid_layers_swish`,
-:ref:`api_fluid_layers_prelu`,
-:ref:`api_fluid_layers_brelu`,
-:ref:`api_fluid_layers_leaky_relu`,
-:ref:`api_fluid_layers_soft_relu`,
-:ref:`api_fluid_layers_thresholded_relu`,
-:ref:`api_fluid_layers_maxout`,
-:ref:`api_fluid_layers_logsigmoid`,
-:ref:`api_fluid_layers_hard_shrink`,
-:ref:`api_fluid_layers_softsign`,
-:ref:`api_fluid_layers_softplus`,
-:ref:`api_fluid_layers_tanh_shrink`,
-:ref:`api_fluid_layers_softshrink`,
-:ref:`api_fluid_layers_exp`.
-
-
-**Fluid provides two ways to use the activation function:**
-
-- If a layer interface provides :code:`act` variables (default None), we can specify the type of layer activation function through this parameter. This mode supports common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`.
+PaddlePaddle supports most of the activation functions, including:
+
+**The way to apply activation functions in PaddlePaddle is as follows:**
+
+PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:
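
The convolution-plus-ReLU example referenced by the last added line is not included in this diff excerpt. A minimal sketch of what such an example could look like, assuming the paddle 2.x API (:code:`paddle.nn.Conv2D` and :code:`paddle.nn.functional.relu`) rather than the exact code from the updated document:

.. code-block:: python

    # Minimal sketch, assuming the paddle 2.x API; not the exact example from the updated doc.
    import paddle
    import paddle.nn.functional as F

    x = paddle.rand([1, 3, 32, 32])    # NCHW input tensor
    conv = paddle.nn.Conv2D(in_channels=3, out_channels=8, kernel_size=3)
    y = conv(x)                        # convolution output, no built-in activation
    y = F.relu(y)                      # apply the ReLU activation explicitly
    print(y.shape)                     # [1, 8, 30, 30]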