!4732 Modified description about reduction of SoftmaxCrossEntropyWithLogits.

Merge pull request !4732 from liuxiao93/fix-SoftmaxCrossEntropyWithLogits-attr-reduction
This commit is contained in:
mindspore-ci-bot 2020-08-20 09:10:56 +08:00 committed by Gitee
commit 2e936a27a6
1 changed file with 3 additions and 3 deletions


@@ -215,8 +215,8 @@ class SoftmaxCrossEntropyWithLogits(_Loss):
     Args:
         is_grad (bool): Specifies whether calculate grad only. Default: True.
         sparse (bool): Specifies whether labels use sparse format or not. Default: False.
-        reduction (Union[str, None]): Type of reduction to be applied to loss. Support 'sum' and 'mean'. If None,
-            do not perform reduction. Default: None.
+        reduction (str): Type of reduction to be applied to loss. The optional values are "mean", "sum", and "none".
+            If "none", do not perform reduction. Default: "none".
         smooth_factor (float): Label smoothing factor. It is a optional input which should be in range [0, 1].
             Default: 0.
         num_classes (int): The number of classes in the task. It is a optional input Default: 2.
@@ -240,7 +240,7 @@ class SoftmaxCrossEntropyWithLogits(_Loss):
     def __init__(self,
                  is_grad=True,
                  sparse=False,
-                 reduction=None,
+                 reduction='none',
                  smooth_factor=0,
                  num_classes=2):
         super(SoftmaxCrossEntropyWithLogits, self).__init__(reduction)
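The change replaces the `None` sentinel with the string `"none"` as the default reduction. A minimal sketch of the "mean"/"sum"/"none" semantics the updated docstring describes (`apply_reduction` is a hypothetical helper for illustration, not MindSpore's implementation):

```python
def apply_reduction(per_sample_loss, reduction="none"):
    """Reduce a list of per-sample losses.

    Mirrors the 'mean' / 'sum' / 'none' options from the docstring:
    illustrative sketch only, not the MindSpore internals.
    """
    if reduction == "mean":
        return sum(per_sample_loss) / len(per_sample_loss)
    if reduction == "sum":
        return sum(per_sample_loss)
    if reduction == "none":
        # "none" leaves the per-sample losses unreduced.
        return per_sample_loss
    raise ValueError(f"unsupported reduction: {reduction!r}")

losses = [0.5, 1.5, 2.0]
print(apply_reduction(losses, "sum"))   # 4.0
print(apply_reduction(losses))          # [0.5, 1.5, 2.0]
```

Using a string for every option (rather than mixing `str` and `None`) keeps the parameter a single type, which is what the docstring change from `Union[str, None]` to `str` reflects.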