add func sigmoid and modify docs

This commit is contained in:
huodagu 2022-12-28 15:54:40 +08:00
parent 7b46b59880
commit bf29fececa
5 changed files with 7 additions and 4 deletions

@@ -90,6 +90,7 @@ mindspore.ops
mindspore.ops.log_softmax
mindspore.ops.mish
mindspore.ops.selu
+ mindspore.ops.sigmoid
mindspore.ops.soft_shrink
mindspore.ops.softmax
mindspore.ops.softsign

@@ -24,7 +24,7 @@ mindspore.ops.SparseApplyAdagradV2
Inputs:
- **var** (Parameter) - The variable to be updated. It can be of any dimension, and its data type must be float16 or float32.
- **accum** (Parameter) - The accumulation to be updated. The shape and data type must be the same as `var`.
- - **grad** (Tensor) - The gradient, a Tensor. The shape and data type must be the same as `var`, and it must satisfy :math:`grad.shape[1:] = var.shape[1:] if var.shape > 1`.
+ - **grad** (Tensor) - The gradient, a Tensor. The shape and data type must be the same as `var`, and when `var.shape > 1`, it must satisfy :math:`grad.shape[1:] = var.shape[1:]`.
- **indices** (Tensor) - A vector of indices into the first dimension of `var` and `accum`. The data type must be int32, and :math:`indices.shape[0] = grad.shape[0]` must hold.
Outputs:

@@ -19,7 +19,7 @@ mindspore.ops.SparseApplyFtrlV2
- **var** (Parameter) - The weights to be updated. The data type must be float16 or float32. The shape is :math:`(N, *)`, where :math:`*` means any number of additional dimensions.
- **accum** (Parameter) - The accumulation to be updated; the shape and data type must be the same as `var`.
- **linear** (Parameter) - The linear coefficient to be updated; the shape and data type must be the same as `var`.
- - **grad** (Tensor) - The gradient, a Tensor. The data type must be the same as `var`, and it must satisfy :math:`grad.shape[1:] = var.shape[1:] if var.shape > 1`.
+ - **grad** (Tensor) - The gradient, a Tensor. The data type must be the same as `var`, and when `var.shape > 1`, it must satisfy :math:`grad.shape[1:] = var.shape[1:]`.
- **indices** (Tensor) - A vector of indices into the first dimension of `var` and `accum`. The data type must be int32, and :math:`indices.shape[0] = grad.shape[0]` must hold.
Outputs:

@@ -91,6 +91,7 @@ Activation Functions
mindspore.ops.log_softmax
mindspore.ops.mish
mindspore.ops.selu
+ mindspore.ops.sigmoid
mindspore.ops.softsign
mindspore.ops.soft_shrink
mindspore.ops.softmax
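For reference, the `sigmoid` entry added to the activation-function lists computes the logistic function :math:`1 / (1 + e^{-x})` element-wise. A minimal pure-Python sketch of that formula (illustrative only, not MindSpore's implementation):

```python
import math

def sigmoid(x):
    # Logistic sigmoid: maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))  # 0.5
```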

@@ -5965,7 +5965,7 @@ class SparseApplyAdagradV2(Primitive):
The shape is :math:`(N, *)` where :math:`*` means any number of additional dimensions.
- **accum** (Parameter) - Accumulation to be updated. The shape and data type must be the same as `var`.
- **grad** (Tensor) - Gradients, which have the same data type as `var` and
-     grad.shape[1:] = var.shape[1:] if var.shape > 1.
+     :math:`grad.shape[1:] = var.shape[1:]` if var.shape > 1.
- **indices** (Tensor) - A vector of indices into the first dimension of `var` and `accum`.
The type must be int32 and indices.shape[0] = grad.shape[0].
@@ -6820,7 +6820,8 @@ class SparseApplyFtrlV2(PrimitiveWithInfer):
The shape is :math:`(N, *)` where :math:`*` means any number of additional dimensions.
- **accum** (Parameter) - The accumulation to be updated; must be the same data type and shape as `var`.
- **linear** (Parameter) - The linear coefficient to be updated; must be the same data type and shape as `var`.
- - **grad** (Tensor) - A tensor of the same type as `var` and grad.shape[1:] = var.shape[1:] if var.shape > 1.
+ - **grad** (Tensor) - A tensor of the same type as `var` and
+   :math:`grad.shape[1:] = var.shape[1:]` if var.shape > 1.
- **indices** (Tensor) - A vector of indices in the first dimension of `var` and `accum`.
The type must be int32 and indices.shape[0] = grad.shape[0].
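The shape constraints these hunks rephrase apply to both `SparseApplyAdagradV2` and `SparseApplyFtrlV2` and can be checked mechanically. A hedged sketch using plain Python tuples to stand in for Tensor shapes (the helper name is illustrative, not MindSpore API; `var.shape > 1` is read here as "var has more than one dimension"):

```python
def check_sparse_apply_shapes(var_shape, grad_shape, indices_shape):
    # grad.shape[1:] must equal var.shape[1:] when var has more than one
    # dimension (the docs' "if var.shape > 1", read as a rank condition).
    if len(var_shape) > 1 and grad_shape[1:] != var_shape[1:]:
        raise ValueError("grad.shape[1:] must equal var.shape[1:]")
    # indices must be a 1-D vector whose length matches grad's first dimension.
    if len(indices_shape) != 1:
        raise ValueError("indices must be a 1-D vector")
    if indices_shape[0] != grad_shape[0]:
        raise ValueError("indices.shape[0] must equal grad.shape[0]")
    return True

# e.g. var of shape (8, 4), a gradient updating 3 rows, indices of length 3
print(check_sparse_apply_shapes((8, 4), (3, 4), (3,)))  # True
```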