!49440 modify description of apis

Merge pull request !49440 from ZhidanLiu/code_docs_master
This commit is contained in:
i-robot 2023-02-28 02:27:10 +00:00 committed by Gitee
commit 3fba4e1477
No known key found for this signature in database
GPG Key ID: 173E9B9CA92EEF8F
9 changed files with 21 additions and 24 deletions


@@ -3,7 +3,8 @@ mindspore.nn.ChannelShuffle
 .. py:class:: mindspore.nn.ChannelShuffle(groups)
-    Divides the channels of a Tensor of shape :math:`(*, C, H, W)` into :math:`g` groups and rearranges them as :math:`(*, C \frac g, g, H, W)`, while keeping the original shape of the Tensor.
+    Divides the channels of a Tensor of shape :math:`(*, C, H, W)` into :math:`g` groups to obtain a Tensor of shape :math:`(*, C \frac g, g, H, W)`,
+    and transposes along the axes corresponding to :math:`C`, :math:`\frac g` and :math:`g` to restore the Tensor to its original shape.
     Parameters:
         - **groups** (int) - Number of groups to divide the channels into. The value range is :math:`(0, \inf)`. Denoted as :math:`g` in the above formula.


@@ -13,7 +13,7 @@ mindspore.nn.GLU
     where :math:`\sigma` is the sigmoid function and :math:`*` is the element-wise product of matrices.
     Parameters:
-        - **axis** (int) - The axis along which to split the input. The data type is int. Default: -1.
+        - **axis** (int) - The axis along which to split the input. The data type is int. Default: -1, the last dimension of the input `x`.
     Inputs:
         - **x** (Tensor) - Tensor of shape :math:`(\ast_1, N, \ast_2)`, where `*` means any number of additional dimensions.


@@ -3,9 +3,8 @@ mindspore.nn.MaxUnpool1d
 .. py:class:: mindspore.nn.MaxUnpool1d(kernel_size, stride=None, padding=0)
-    A partial inverse of `MaxPool1d`. `MaxPool1d` is not fully invertible, since the non-maximal values are lost.
-    `MaxUnpool1d` takes the output of `MaxPool1d` as input, including the indices of the maximal values. In computing the partial inverse of `MaxPool1d`, the non-maximal values are set to zero.
-    The supported input data format is :math:`(N, C, H_{in})` or :math:`(C, H_{in})`, and the output data format is :math:`(N, C, H_{out})`
+    Computes the inverse of :class:`mindspore.nn.MaxPool1d`.
+    `MaxUnpool1d` keeps the maximal values and sets all non-maximal values to zero. The supported input data format is :math:`(N, C, H_{in})` or :math:`(C, H_{in})`, and the output data format is :math:`(N, C, H_{out})`
+    or :math:`(C, H_{out})`. The formula is as follows:
     .. math::


@@ -3,9 +3,9 @@ mindspore.nn.Softmax2d
 .. py:class:: mindspore.nn.Softmax2d()
-    Applies SoftMax over the features at each spatial location.
+    Softmax function applied to 2D feature data.
-    When given a Tensor of shape :math:`(C, H, W)`, it applies `Softmax` to each location :math:`(c, h, w)`.
+    Applies `Softmax` to each location :math:`(c, h, w)` of an input Tensor of shape :math:`(C, H, W)`.
     Inputs:
         - **x** (Tensor) - Tensor of shape :math:`(N, C_{in}, H_{in}, W_{in})` or :math:`(C_{in}, H_{in}, W_{in})`.


@@ -13,7 +13,7 @@ mindspore.ops.glu
     Parameters:
         - **x** (Tensor) - Tensor to be split. Its dtype is number.Number, and its shape is :math:`(\ast_1, N, \ast_2)`, where `*` means any number of additional dimensions.
-        - **axis** (int, optional) - The axis along which to split the input. The data type is int. Default: -1.
+        - **axis** (int, optional) - The axis along which to split the input. The data type is int. Default: -1, the last dimension of the input `x`.
     Returns:
         Tensor, with the same dtype as the input `x`, and shape :math:`(\ast_1, M, \ast_2)`, where :math:`M=N/2`.


@@ -164,10 +164,9 @@ class Softmin(Cell):
 class Softmax2d(Cell):
     r"""
-    Applies SoftMax over features to each spatial location.
+    Softmax function applied to 2D feature data.
-    When given a Tensor with shape of :math:`(C, H, W)`, it will
-    apply `Softmax` to each location :math:`(c, h, w)`.
+    Applies `Softmax` to each location :math:`(c, h, w)` with an input Tensor of shape :math:`(C, H, W)`.
     Inputs:
         - **x** (Tensor) - Tensor of shape :math:`(N, C_{in}, H_{in}, W_{in})` or :math:`(C_{in}, H_{in}, W_{in})`.
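As a sanity check on the Softmax2d description above, here is a minimal NumPy sketch (an illustrative assumption, not MindSpore's implementation): for input of shape `(C, H, W)`, softmax is taken over the channel axis independently at every spatial location `(h, w)`.

```python
import numpy as np

def softmax2d(x: np.ndarray) -> np.ndarray:
    # Subtract the per-location max over channels for numerical stability,
    # then normalize over the channel axis (third from the end).
    shifted = x - x.max(axis=-3, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=-3, keepdims=True)

x = np.arange(24, dtype=np.float64).reshape(2, 3, 4)  # (C, H, W) with C=2
y = softmax2d(x)
# At every (h, w), the values across the channel axis sum to 1.
print(np.allclose(y.sum(axis=-3), 1.0))
```

The same code also covers the batched `(N, C, H, W)` layout, since the channel axis is addressed from the end.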
@@ -1417,7 +1416,7 @@ class Mish(Cell):
 class GLU(Cell):
     r"""
-    Applies the gated linear unit function.
+    The gated linear unit function.
     .. math::
         {GLU}(a, b)= a \otimes \sigma(b)
@@ -1427,7 +1426,7 @@ class GLU(Cell):
     Here :math:`\sigma` is the sigmoid function, and :math:`*` is the Hadamard product.
     Args:
-        axis (int): the dimension on which to split the input. Default: -1
+        axis (int): the axis along which to split the input. Default: -1, the last axis of `x`.
     Inputs:
         - **x** (Tensor) - :math:`(\ast_1, N, \ast_2)`, where `*` means any number of additional dimensions.
@@ -1443,7 +1442,7 @@ class GLU(Cell):
         >>> input = Tensor([[0.1,0.2,0.3,0.4],[0.5,0.6,0.7,0.8]])
         >>> output = m(input)
         >>> print(output)
-        [[0.05744425 0.11973753
+        [[0.05744425 0.11973753]
         [0.33409387 0.41398472]]
     """


@@ -21,9 +21,9 @@ __all__ = ['ChannelShuffle']
 class ChannelShuffle(Cell):
     r"""
-    Divide the channels in a tensor of shape :math:`(*, C , H, W)`
-    into g groups and rearrange them as :math:`(*, C \frac g, g, H, W)`,
-    while keeping the original tensor shape.
+    Divide the channels of a Tensor whose shape is :math:`(*, C, H, W)` into g groups to obtain a Tensor with
+    shape :math:`(*, C \frac g, g, H, W)`, and transpose along the corresponding axes of :math:`C`, :math:`\frac g` and
+    :math:`g` to restore the Tensor to its original shape.
     Args:
         groups (int): Number of groups to divide channels in. Refer to :math:`g`.
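The reshape-transpose-reshape steps described above can be sketched in NumPy (an illustrative assumption about the semantics, not MindSpore's source):

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    n, c, h, w = x.shape
    assert c % groups == 0, "C must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)  # (N, g, C/g, H, W)
    x = x.transpose(0, 2, 1, 3, 4)               # swap the g and C/g axes
    return x.reshape(n, c, h, w)                 # back to (N, C, H, W)

x = np.arange(4).reshape(1, 4, 1, 1)   # channels [0, 1, 2, 3]
y = channel_shuffle(x, groups=2)
print(y.reshape(-1))  # channels interleaved across groups: [0 2 1 3]
```

Shuffling interleaves channels from the different groups while the overall shape is unchanged, which is the behavior the docstring calls "restore the Tensor to its original shape".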


@@ -1390,14 +1390,12 @@ class FractionalMaxPool3d(Cell):
 class MaxUnpool1d(Cell):
     r"""
-    Computes a partial inverse of MaxPool1d.
+    Computes the inverse of :class:`mindspore.nn.MaxPool1d`.
-    MaxPool1d is not fully invertible, since the non-maximal values are lost.
+    MaxUnpool1d keeps the maximal values and sets all non-maximal positions to zero. Typically the input
+    is of shape :math:`(N, C, H_{in})` or :math:`(C, H_{in})`, and the output is of shape
+    :math:`(N, C, H_{out})` or :math:`(C, H_{out})`. The operation is as follows.
-    MaxUnpool1d takes in as input the output of MaxPool1d including the indices of the maximal values
-    and computes a partial inverse in which all non-maximal values are set to zero. Typically the input
-    is of shape :math:`(N, C, H_{in})` or :math:`(C, H_{in})`, and the output is of shape :math:`(N, C, H_{out})`
-    or :math:`(C, H_{out})`. The operation is as follows.
     .. math::
         \begin{array}{ll} \\
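The unpooling behavior described above (keep the maximal values, zero everywhere else) can be sketched in NumPy; this is a hypothetical illustration of the semantics, with the pooled values and indices supplied by hand rather than by a real `MaxPool1d`:

```python
import numpy as np

def max_unpool1d(values: np.ndarray, indices: np.ndarray,
                 output_size: int) -> np.ndarray:
    # Scatter each pooled maximum back to its original position;
    # every non-maximal position stays zero.
    out = np.zeros((values.shape[0], output_size), dtype=values.dtype)
    for c in range(values.shape[0]):
        out[c, indices[c]] = values[c]
    return out

# MaxPool1d(kernel_size=2, stride=2) over [[1, 2, 3, 4, 5, 6]] would yield
# values [[2, 4, 6]] at indices [[1, 3, 5]]; unpooling restores the length.
vals = np.array([[2.0, 4.0, 6.0]])
idx = np.array([[1, 3, 5]])
print(max_unpool1d(vals, idx, output_size=6))  # [[0. 2. 0. 4. 0. 6.]]
```

This also shows why the pool is only partially invertible: the 1, 3 and 5 that were not maxima come back as zeros.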


@@ -5096,7 +5096,7 @@ def glu(x, axis=-1):
     Args:
         x (Tensor): Tensor to be split. Its dtype is number.Number, and shape is :math:`(\ast_1, N, \ast_2)`
             where `*` means any number of additional dimensions.
-        axis (int, optional): the dimension on which to split the input. It must be int. Default: -1.
+        axis (int, optional): the axis along which to split the input. It must be int. Default: -1, the last axis of `x`.
     Returns:
         Tensor, the same dtype as the `x`, with the shape :math:`(\ast_1, M, \ast_2)` where :math:`M=N/2`.
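The `axis` parameter and the resulting :math:`M=N/2` shape change can be illustrated with a small NumPy sketch (an assumption about the functional form, not MindSpore's `ops.glu` source):

```python
import numpy as np

def glu(x: np.ndarray, axis: int = -1) -> np.ndarray:
    # The size N of `axis` must be even; it is halved to M = N/2.
    a, b = np.split(x, 2, axis=axis)
    return a * (1.0 / (1.0 + np.exp(-b)))

x = np.ones((4, 3))
print(glu(x, axis=0).shape)   # (2, 3): axis 0 halved from N=4 to M=2
print(glu(x.T, axis=-1).shape)  # (3, 2): the default last axis halved
```

Only the split axis shrinks; every other dimension (the :math:`\ast_1` and :math:`\ast_2` parts) passes through unchanged.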