Update the documentation of the Mish operator.

wangshuide2020 2021-03-22 20:36:58 +08:00
parent f0016f5574
commit 6b0948c62b
1 changed file with 5 additions and 2 deletions


@@ -372,7 +372,7 @@ class ReLU(PrimitiveWithCheck):
 class Mish(PrimitiveWithInfer):
     r"""
-    Computes MISH of input tensors element-wise.
+    Computes MISH (A Self Regularized Non-Monotonic Neural Activation Function) of input tensors element-wise.
 
     The function is shown as follows:
@@ -380,6 +380,9 @@ class Mish(PrimitiveWithInfer):
         \text{output} = x * \tanh(\log(1 + \exp(x)))
 
+    See more details in `A Self Regularized Non-Monotonic Neural Activation Function
+    <https://arxiv.org/abs/1908.08681>`_.
+
     Inputs:
         - **x** (Tensor) - The input tensor. Only support float16 and float32.
@@ -390,7 +393,7 @@ class Mish(PrimitiveWithInfer):
         ``Ascend``
 
     Raise:
-        TypeError: If num_features data type not float16 and float32 Tensor.
+        TypeError: If dtype of `x` is neither float16 nor float32.
 
     Examples:
         >>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
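
For reference, the formula in the updated docstring can be checked numerically. The `mish` helper below is a hypothetical NumPy stand-in, not the MindSpore `Mish` primitive itself; it simply evaluates x * tanh(log(1 + exp(x))) on the same example input:

# Minimal NumPy sketch of the documented formula (not the MindSpore op).
import numpy as np

def mish(x):
    # np.log1p(np.exp(x)) is softplus(x); Mish is x scaled by tanh(softplus(x)).
    return x * np.tanh(np.log1p(np.exp(x)))

input_x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
print(mish(input_x))
# Approximately:
# [[-0.3034  3.9974 -0.0027]
#  [ 1.944  -0.0336  9.    ]]

Large positive inputs pass through nearly unchanged while large negative inputs are squashed toward zero, matching the non-monotonic shape described in the referenced paper.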