!37131 Fix RReLU API issue

Merge pull request !37131 from shaojunsong/fix/rrelu_api
This commit is contained in:
i-robot 2022-07-04 03:01:43 +00:00 committed by Gitee
commit 0b75102dc9
2 changed files with 8 additions and 8 deletions


@@ -9,16 +9,16 @@ mindspore.nn.RReLU
 .. math::
     \text{RReLU}(x_{ji}) = \begin{cases}x_{ji}, &\text{if } x_{ji} \geq 0; \cr
-    {\alpha_{ji}} * x, &\text{otherwise.}\end{cases}
+    {\alpha_{ji}} * x_{ji}, &\text{otherwise.}\end{cases}
-where :math:`\alpha_{ji}` ~ U(l, u), :math:`l \le u`.
+where :math:`\alpha_{ji}` ~ :math:`U(l, u)`, :math:`l \le u`.
 For more details, see `Empirical Evaluation of Rectified Activations in Convolution Network <https://arxiv.org/pdf/1505.00853.pdf>`_.
 **Parameters:**
-- **lower** (`Union[int, float]`) - Lower bound of the slope of the activation function when x < 0. Default: 0.125.
-- **upper** (`Union[int, float]`) - Upper bound of the slope of the activation function when x < 0. Default: 1/3.
+- **lower** (Union[int, float]) - Lower bound of the slope of the activation function when x < 0. Default: 1/8.
+- **upper** (Union[int, float]) - Upper bound of the slope of the activation function when x < 0. Default: 1/3.
 **Inputs:**


@@ -468,12 +468,12 @@ class RReLU(Cell):
     .. math::
         \text{RReLU}(x_{ji}) = \begin{cases}x_{ji}, &\text{if } x_{ji} \geq 0; \cr
-        {\alpha_{ji}} * x, &\text{otherwise.}\end{cases}
+        {\alpha_{ji}} * x_{ji}, &\text{otherwise.}\end{cases}
-    where :math:`\alpha_{ji}` ~ U(l, u), :math:`l \le u`.
+    where :math:`\alpha_{ji}` ~ :math:`U(l, u)`, :math:`l \le u`.
     Args:
-        lower (Union[int, float]): Slope of the activation function at x < 0. Default: 0.125.
+        lower (Union[int, float]): Slope of the activation function at x < 0. Default: 1/8.
         upper (Union[int, float]): Slope of the activation function at x < 0. Default: 1/3.
     Inputs:
@@ -505,7 +505,7 @@ class RReLU(Cell):
         [ 2. 0. ]]
         """
-    def __init__(self, lower=0.125, upper=float(1. / 3)):
+    def __init__(self, lower=1/8, upper=1/3):
         super(RReLU, self).__init__()
         validator.check_value_type('upper', upper, [float, int], self.cls_name)
         validator.check_value_type('lower', lower, [float, int], self.cls_name)
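For context, the formula being corrected can be sketched in plain NumPy. This `rrelu` helper is hypothetical and not part of MindSpore; it only illustrates the semantics of `mindspore.nn.RReLU` during training, where each element's slope :math:`\alpha_{ji}` is sampled from U(lower, upper):

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, rng=None):
    """Sketch of RReLU: x_ji if x_ji >= 0, else alpha_ji * x_ji,
    with alpha_ji ~ U(lower, upper) sampled per element."""
    rng = np.random.default_rng() if rng is None else rng
    alpha = rng.uniform(lower, upper, size=np.shape(x))  # one alpha per element
    return np.where(x >= 0, x, alpha * x)

x = np.array([[-1.0, 4.0], [2.0, 0.0]])
out = rrelu(x)
# Non-negative entries pass through unchanged; the negative entry is
# scaled by a factor in [1/8, 1/3], so it ends up in [-1/3, -1/8].
```

Note how the fix matters: the old formula `\alpha_{ji} * x` dropped the `ji` subscript, suggesting the whole tensor was scaled; the corrected `\alpha_{ji} * x_{ji}` matches the per-element sampling shown above.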