!19189 [doc] rm space after ':'

Merge pull request !19189 from chenfei_mindspore/code_docs_api_clean
i-robot 2021-07-01 02:35:58 +00:00 committed by Gitee
commit 2613322966
2 changed files with 1 addition and 3 deletions


@@ -132,7 +132,7 @@ def build_train_network(network, optimizer, loss_fn=None, level='O0', **kwargs):
O2 is recommended on GPU, O3 is recommended on Ascend. Property of `keep_batchnorm_fp32`, `cast_model_type`
and `loss_scale_manager` determined by `level` setting may be overwritten by settings in `kwargs`.
-cast_model_type (:class: `mindspore.dtype`): Supports `mstype.float16` or `mstype.float32`. If set, the
+cast_model_type (:class:`mindspore.dtype`): Supports `mstype.float16` or `mstype.float32`. If set, the
network will be casted to `cast_model_type` (`mstype.float16` or `mstype.float32`), but not to be casted
to the type determined by `level` setting.
keep_batchnorm_fp32 (bool): Keep Batchnorm run in `float32` when the network is set to cast to `float16`. If
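Not part of this diff, but for context on the docstring above: a minimal sketch of how an explicit kwarg such as `cast_model_type` or `loss_scale_manager` overrides what the chosen `level` would otherwise configure in `mindspore.amp.build_train_network`. The placeholder network, optimizer, loss-scale value and exact import paths are assumptions for illustration and may differ across MindSpore versions.

# Illustrative sketch only: explicit kwargs take precedence over the
# defaults implied by level='O2' (the GPU recommendation above).
from mindspore import nn, amp                      # amp path as referenced in the docstring
from mindspore import dtype as mstype
from mindspore.train.loss_scale_manager import FixedLossScaleManager

net = nn.Dense(16, 10)                             # placeholder network
loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')
opt = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)

train_net = amp.build_train_network(
    net, opt, loss_fn,
    level='O2',                                    # O2: cast to float16, keep batchnorm in float32
    cast_model_type=mstype.float16,                # kwargs like these override the level's defaults
    loss_scale_manager=FixedLossScaleManager(1024.0),
)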


@@ -76,8 +76,6 @@ class Model:
elements, including the positions of loss value, predicted value and label. The loss
value would be passed to the `Loss` metric, the predicted value and label would be passed
to other metric. Default: None.
-Args:
amp_level (str): Option for argument `level` in `mindspore.amp.build_train_network`, level for mixed
precision training. Supports ["O0", "O2", "O3", "auto"]. Default: "O0".
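For context (not part of this diff): a minimal sketch of passing `amp_level` through `Model`, which forwards it as `level` to `mindspore.amp.build_train_network` described above. The placeholder network, metric, hyperparameters and import paths are illustrative assumptions and may differ across MindSpore versions.

# Illustrative sketch only: amp_level on Model maps to the `level` argument
# of mindspore.amp.build_train_network.
from mindspore import nn
from mindspore.train import Model
from mindspore.nn.metrics import Accuracy

net = nn.Dense(16, 10)                             # placeholder network
loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')
opt = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)

# "auto" would pick the device-recommended level (O2 on GPU, O3 on Ascend);
# here the level is pinned explicitly.
model = Model(net, loss_fn=loss_fn, optimizer=opt,
              metrics={'acc': Accuracy()}, amp_level="O2")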