[OPS] add functional api docs for hardsigmoid

This commit is contained in:
yangruoqi713 2023-03-02 15:15:16 +08:00
parent 7e18f80e03
commit 9af7359ea9
6 changed files with 30 additions and 39 deletions


@@ -88,6 +88,7 @@ Compared with the previous version, the added, removed, and sup
mindspore.ops.glu
mindspore.ops.gumbel_softmax
mindspore.ops.hardshrink
mindspore.ops.hardsigmoid
mindspore.ops.hardswish
mindspore.ops.hardtanh
mindspore.ops.leaky_relu


@@ -5,20 +5,4 @@ mindspore.ops.HSigmoid
Piecewise approximation of the sigmoid activation function, computed element-wise. The input is a Tensor of any dimension. HSigmoid is defined as:

.. math::
    \text{hsigmoid}(x_{i}) = \max(0, \min(1, \frac{x_{i} + 3}{6})),

where :math:`x_i` is an element of the input Tensor.

Inputs:
    - **input_x** (Tensor) - The input Tensor, of shape :math:`(N, *)`, where :math:`*` means any number of additional dimensions.

Outputs:
    Tensor, with the same data type and shape as `input_x`.

Raises:
    - **TypeError** - If `input_x` is not a Tensor.

Refer to :func:`mindspore.ops.hardsigmoid` for more details.


@@ -0,0 +1,23 @@
mindspore.ops.hardsigmoid
=========================

.. py:function:: mindspore.ops.hardsigmoid(input_x)

    Hard Sigmoid activation function, computed element-wise. Hard Sigmoid is defined as:

    .. math::
        \text{hsigmoid}(x_{i}) = \max(0, \min(1, \frac{x_{i} + 3}{6})),

    where :math:`x_i` is an element of the input Tensor.

    Args:
        - **input_x** (Tensor) - The input to Hard Sigmoid, a Tensor of any dimension, with data type float16, float32, or float64.

    Returns:
        Tensor, with the same shape and data type as `input_x`.

    Raises:
        - **TypeError** - If `input_x` is not a Tensor.
        - **TypeError** - If the dtype of `input_x` is not float16, float32, or float64.
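As a quick sanity check of the formula above, hard sigmoid can be sketched in plain NumPy. This is an illustrative stand-in for the `mindspore.ops.hardsigmoid` operator, not its actual implementation:

```python
import numpy as np

def hard_sigmoid(x):
    # Element-wise hard sigmoid: max(0, min(1, (x + 3) / 6))
    return np.clip((x + 3.0) / 6.0, 0.0, 1.0)

x = np.array([-3.5, 0.0, 4.3], dtype=np.float32)
print(hard_sigmoid(x))  # inputs <= -3 saturate at 0, inputs >= 3 at 1
```

Note how the function is exactly 0 for inputs at or below -3 and exactly 1 for inputs at or above 3, with a linear ramp of slope 1/6 in between.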


@@ -88,6 +88,7 @@ Activation Functions
mindspore.ops.glu
mindspore.ops.gumbel_softmax
mindspore.ops.hardshrink
mindspore.ops.hardsigmoid
mindspore.ops.hardswish
mindspore.ops.hardtanh
mindspore.ops.leaky_relu


@@ -4457,8 +4457,8 @@ def hardsigmoid(input_x):
where :math:`x_i` is an element of the input Tensor.
Inputs:
- **input_x** (Tensor) - Tensor of shape :math:`(*)`, where :math:`*` means any number of
Args:
input_x (Tensor): Tensor of shape :math:`(*)`, where :math:`*` means any number of
dimensions, with float16, float32 or float64 data type.
Outputs:
@@ -4469,7 +4469,7 @@ def hardsigmoid(input_x):
TypeError: If dtype of `input_x` is not float16, float32 or float64.
Supported Platforms:
``Ascend`` ``CPU``
``Ascend`` ``GPU`` ``CPU``
Examples:
>>> x = Tensor(np.array([-3.5, 0, 4.3]), mindspore.float32)


@@ -882,25 +882,7 @@ class HSigmoid(Primitive):
r"""
Hard sigmoid activation function.
Applies hard sigmoid activation element-wise. The input is a Tensor with any valid shape.
Hard sigmoid is defined as:
.. math::
\text{hsigmoid}(x_{i}) = \max(0, \min(1, \frac{x_{i} + 3}{6})),
where :math:`x_i` is an element of the input Tensor.
Inputs:
- **input_x** (Tensor) - Tensor of shape :math:`(N, *)`, where :math:`*` means any number of
additional dimensions.
Outputs:
Tensor, with the same type and shape as the `input_x`.
Raises:
TypeError: If `input_x` is not a Tensor.
Refer to :func:`mindspore.ops.hardsigmoid` for more details.
Supported Platforms:
``Ascend`` ``GPU`` ``CPU``
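Since HSigmoid is described above as a piecewise approximation of the sigmoid, a short comparison against the smooth logistic sigmoid shows how close the two curves are. This is a hedged NumPy sketch; the `hard_sigmoid` helper below is illustrative, not MindSpore's kernel:

```python
import numpy as np

def sigmoid(x):
    # Smooth logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def hard_sigmoid(x):
    # Piecewise-linear approximation: max(0, min(1, (x + 3) / 6))
    return np.clip((x + 3.0) / 6.0, 0.0, 1.0)

xs = np.linspace(-4.0, 4.0, 9)
for v, s, h in zip(xs, sigmoid(xs), hard_sigmoid(xs)):
    print(f"x={v:+.1f}  sigmoid={s:.4f}  hard_sigmoid={h:.4f}")
```

Both functions agree exactly at x = 0 (value 0.5), and on this grid they differ by well under 0.1 everywhere; the hard variant trades smoothness for a cheap clip-and-scale computation.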