Commit de02280

[Docathon][Update Doc No.21] update activation (#7297)
1 parent c3eb9f3 commit de02280

2 files changed: 60 additions, 57 deletions
Lines changed: 31 additions & 19 deletions
@@ -1,28 +1,40 @@
 .. _api_guide_activations:

-####
+###################
 Activation Functions
-####
+###################

 Activation functions introduce non-linear properties into neural networks.

-PaddlePaddle Fluid supports most of the activation functions, including:
-
-:ref:`cn_api_fluid_layers_relu`, :ref:`cn_api_fluid_layers_tanh`, :ref:`cn_api_fluid_layers_sigmoid`, :ref:`cn_api_fluid_layers_elu`, :ref:`cn_api_fluid_layers_relu6`, :ref:`cn_api_fluid_layers_pow`, :ref:`cn_api_fluid_layers_stanh`, :ref:`cn_api_fluid_layers_hard_sigmoid`, :ref:`cn_api_fluid_layers_swish`, :ref:`cn_api_fluid_layers_prelu`, :ref:`cn_api_fluid_layers_brelu`, :ref:`cn_api_fluid_layers_leaky_relu`, :ref:`cn_api_fluid_layers_soft_relu`, :ref:`cn_api_fluid_layers_thresholded_relu`, :ref:`cn_api_fluid_layers_maxout`, :ref:`cn_api_fluid_layers_logsigmoid`, :ref:`cn_api_fluid_layers_hard_shrink`, :ref:`cn_api_fluid_layers_softsign`, :ref:`cn_api_fluid_layers_softplus`, :ref:`cn_api_fluid_layers_tanh_shrink`, :ref:`cn_api_fluid_layers_softshrink`, :ref:`cn_api_fluid_layers_exp`.
-
-
-**Fluid provides two ways to use activation functions:**
-
-- If a layer's interface provides an :code:`act` argument (default value None), the activation function type of that layer can be specified through it. This mode supports the common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`.
-
-.. code-block:: python
-
-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
-
-
-- Fluid provides an interface for each Activation, which can be called explicitly.
+PaddlePaddle supports most of the activation functions, including:
+
+* :ref:`cn_api_paddle_nn_functional_elu`
+* :ref:`cn_api_paddle_exp`
+* :ref:`cn_api_paddle_nn_functional_hardsigmoid`
+* :ref:`cn_api_paddle_nn_functional_hardshrink`
+* :ref:`cn_api_paddle_nn_functional_leaky_relu`
+* :ref:`cn_api_paddle_nn_functional_log_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_maxout`
+* :ref:`cn_api_paddle_pow`
+* :ref:`cn_api_paddle_nn_functional_prelu`
+* :ref:`cn_api_paddle_nn_functional_relu`
+* :ref:`cn_api_paddle_nn_functional_relu6`
+* :ref:`cn_api_paddle_nn_functional_sigmoid`
+* :ref:`cn_api_paddle_nn_functional_softplus`
+* :ref:`cn_api_paddle_nn_functional_softshrink`
+* :ref:`cn_api_paddle_nn_functional_softsign`
+* :ref:`cn_api_paddle_stanh`
+* :ref:`cn_api_paddle_nn_functional_swish`
+* :ref:`cn_api_paddle_tanh`
+* :ref:`cn_api_paddle_nn_functional_thresholded_relu`
+* :ref:`cn_api_paddle_nn_functional_tanhshrink`
+
+
+**How to apply activation functions in PaddlePaddle:**
+
+PaddlePaddle provides an interface for each Activation, which can be called explicitly. The following example shows how to apply the ReLU activation function after a convolution operation:

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = fluid.layers.relu(conv2d)
+    conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1)  # convolution
+    relu1 = paddle.nn.functional.relu(conv2d)  # apply the ReLU activation function
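
For readers who want to try the updated snippet on its own, a minimal runnable sketch is given below. The imports and the shapes of the input tensor :code:`x` and the convolution :code:`weight` are illustrative assumptions added here, not part of the commit:

.. code-block:: python

    import paddle
    import paddle.nn.functional as F

    # Hypothetical input and filter shapes, chosen only for illustration.
    x = paddle.randn([1, 3, 8, 8])       # NCHW input: batch 1, 3 channels, 8x8
    weight = paddle.randn([2, 3, 3, 3])  # 2 output channels, 3x3 kernels

    conv2d = F.conv2d(x, weight, stride=1, padding=1)  # convolution
    relu1 = F.relu(conv2d)                             # apply the ReLU activation
    print(relu1.shape)                                 # [1, 2, 8, 8]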

docs/api_guides/low_level/layers/activations_en.rst

Lines changed: 29 additions & 38 deletions
@@ -6,44 +6,35 @@ Activation Function

 The activation function incorporates non-linearity properties into the neural network.

-PaddlePaddle Fluid supports most of the activation functions, including:
-
-:ref:`api_fluid_layers_relu`,
-:ref:`api_fluid_layers_tanh`,
-:ref:`api_fluid_layers_sigmoid`,
-:ref:`api_fluid_layers_elu`,
-:ref:`api_fluid_layers_relu6`,
-:ref:`api_fluid_layers_pow`,
-:ref:`api_fluid_layers_stanh`,
-:ref:`api_fluid_layers_hard_sigmoid`,
-:ref:`api_fluid_layers_swish`,
-:ref:`api_fluid_layers_prelu`,
-:ref:`api_fluid_layers_brelu`,
-:ref:`api_fluid_layers_leaky_relu`,
-:ref:`api_fluid_layers_soft_relu`,
-:ref:`api_fluid_layers_thresholded_relu`,
-:ref:`api_fluid_layers_maxout`,
-:ref:`api_fluid_layers_logsigmoid`,
-:ref:`api_fluid_layers_hard_shrink`,
-:ref:`api_fluid_layers_softsign`,
-:ref:`api_fluid_layers_softplus`,
-:ref:`api_fluid_layers_tanh_shrink`,
-:ref:`api_fluid_layers_softshrink`,
-:ref:`api_fluid_layers_exp`.
-
-
-**Fluid provides two ways to use the activation function:**
-
-- If a layer interface provides :code:`act` variables (default None), we can specify the type of layer activation function through this parameter. This mode supports common activation functions :code:`relu`, :code:`tanh`, :code:`sigmoid`, :code:`identity`.
+PaddlePaddle supports most of the activation functions, including:
+
+* :ref:`api_paddle_nn_functional_elu`
+* :ref:`api_paddle_exp`
+* :ref:`api_paddle_nn_functional_hardsigmoid`
+* :ref:`api_paddle_nn_functional_hardshrink`
+* :ref:`api_paddle_nn_functional_leaky_relu`
+* :ref:`api_paddle_nn_functional_log_sigmoid`
+* :ref:`api_paddle_nn_functional_maxout`
+* :ref:`api_paddle_pow`
+* :ref:`api_paddle_nn_functional_prelu`
+* :ref:`api_paddle_nn_functional_relu`
+* :ref:`api_paddle_nn_functional_relu6`
+* :ref:`api_paddle_tensor_sigmoid`
+* :ref:`api_paddle_nn_functional_softplus`
+* :ref:`api_paddle_nn_functional_softshrink`
+* :ref:`api_paddle_nn_functional_softsign`
+* :ref:`api_paddle_stanh`
+* :ref:`api_paddle_nn_functional_swish`
+* :ref:`api_paddle_tanh`
+* :ref:`api_paddle_nn_functional_thresholded_relu`
+* :ref:`api_paddle_nn_functional_tanhshrink`
+
+
+**The way to apply activation functions in PaddlePaddle is as follows:**
+
+PaddlePaddle provides a dedicated interface for each activation function, allowing users to explicitly invoke them as needed. Below is an example of applying the ReLU activation function after a convolution operation:

 .. code-block:: python

-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3, act="relu")
-
-
-- Fluid provides an interface for each Activation, and we can explicitly call it.
-
-.. code-block:: python
-
-    conv2d = fluid.layers.conv2d(input=data, num_filters=2, filter_size=3)
-    relu1 = fluid.layers.relu(conv2d)
+    conv2d = paddle.nn.functional.conv2d(x, weight, stride=1, padding=1)  # Convolution operation
+    relu1 = paddle.nn.functional.relu(conv2d)  # Applying the ReLU activation function
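
Each entry in the list above maps to a functional API that can be applied directly to a tensor. A small hedged illustration follows; the sample values are made up for demonstration and are not part of the commit:

.. code-block:: python

    import paddle
    import paddle.nn.functional as F

    x = paddle.to_tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

    print(F.relu(x))       # zeroes out negative values
    print(F.elu(x))        # smooth exponential curve for negative inputs
    print(F.softplus(x))   # smooth approximation of ReLU
    print(paddle.tanh(x))  # squashes values into (-1, 1)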
