!3422 modify annotation: fix typo wegith_decay -> weight_decay

Merge pull request !3422 from lilei/modify_annotation_0724
Commit 8d0be0ae1a by mindspore-ci-bot, 2020-07-24 20:06:50 +08:00, committed by Gitee
2 changed files with 2 additions and 2 deletions


@@ -144,7 +144,7 @@ class FTRL(Optimizer):
l2 (float): l2 regularization strength, must be greater than or equal to zero. Default: 0.0.
use_locking (bool): If True use locks for update operation. Default: False.
loss_scale (float): Value for the loss scale. It should be equal to or greater than 1.0. Default: 1.0.
- wegith_decay (float): Weight decay value to multiply weight, should be in range [0.0, 1.0]. Default: 0.0.
+ weight_decay (float): Weight decay value to multiply weight, should be in range [0.0, 1.0]. Default: 0.0.
Inputs:
- **grads** (tuple[Tensor]) - The gradients of `params` in optimizer, the shape is as same as the `params`
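
For reference, a minimal usage sketch of the corrected keyword. The call below assumes the nn.FTRL signature of this MindSpore release; the network (nn.Dense) and the hyperparameter values are illustrative only:

    import mindspore.nn as nn

    # Any Cell with trainable parameters works as the network here.
    net = nn.Dense(16, 10)

    # The keyword argument has always been weight_decay; only the
    # docstring carried the "wegith_decay" misspelling fixed here.
    opt = nn.FTRL(net.trainable_params(),
                  learning_rate=0.001,
                  l2=0.0,
                  use_locking=False,
                  loss_scale=1.0,
                  weight_decay=0.0)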


@@ -99,7 +99,7 @@ class ProximalAdagrad(Optimizer):
l2 (float): l2 regularization strength, must be greater than or equal to zero. Default: 0.0.
use_locking (bool): If True use locks for update operation. Default: False.
loss_scale (float): Value for the loss scale. It should be not less than 1.0. Default: 1.0.
- wegith_decay (float): Weight decay value to multiply weight, should be in range [0.0, 1.0]. Default: 0.0.
+ weight_decay (float): Weight decay value to multiply weight, should be in range [0.0, 1.0]. Default: 0.0.
Inputs:
- **grads** (tuple[Tensor]) - The gradients of `params` in optimizer, the shape is as same as the `params`
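
The same corrected keyword applies to ProximalAdagrad; a matching sketch under the same assumptions:

    import mindspore.nn as nn

    net = nn.Dense(16, 10)
    opt = nn.ProximalAdagrad(net.trainable_params(),
                             learning_rate=0.001,
                             l2=0.0,
                             loss_scale=1.0,
                             weight_decay=0.0)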