!5507 vgg16 readme update

Merge pull request !5507 from caojian05/ms_vgg_readme_update
This commit is contained in:
mindspore-ci-bot 2020-08-30 20:19:13 +08:00 committed by Gitee
commit 8088d1916a
1 changed file with 2 additions and 1 deletion


@@ -79,6 +79,7 @@ here basic modules mainly include basic operation like: **3×3 conv** and **2×
## Mixed Precision
The [mixed precision](https://www.mindspore.cn/tutorial/zh-CN/master/advanced_use/mixed_precision.html) training method accelerates the deep learning neural network training process by using both the single-precision and half-precision data formats, and maintains the network precision achieved by the single-precision training at the same time. Mixed precision training can accelerate the computation process, reduce memory usage, and enable a larger model or batch size to be trained on specific hardware.
For FP16 operators, if the input data type is FP32, the MindSpore backend automatically handles it with reduced precision. Users can check which operators ran at reduced precision by enabling the INFO log and searching for `reduce precision`.
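The precision gap that makes this handling necessary is easy to demonstrate: FP16 has only a 10-bit significand (machine epsilon ≈ 9.8e-4), so a small increment that FP32 preserves is rounded away entirely in FP16. A minimal NumPy sketch (illustrative only, not MindSpore-specific):

```python
import numpy as np

# FP32 keeps the small increment; FP16 rounds it away,
# because 1e-4 is below FP16's spacing (~9.8e-4) around 1.0.
fp32_sum = np.float32(1.0) + np.float32(1e-4)  # increment preserved
fp16_sum = np.float16(1.0) + np.float16(1e-4)  # increment lost: result is 1.0

print(fp32_sum, fp16_sum)
```

This is why mixed-precision schemes typically keep FP32 master copies of weights while running bulk computation in FP16.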
@@ -370,4 +371,4 @@ after allreduce eval: top5_correct=45582, tot=50000, acc=91.16%
In dataset.py, we set the seed inside the `create_dataset` function. We also use a random seed in train.py.
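Fixing the seed makes the shuffle and augmentation order reproducible across runs. A generic sketch of the idea in plain NumPy (the helper name `make_epoch_order` is hypothetical, not the actual `create_dataset` code):

```python
import numpy as np

def make_epoch_order(num_samples, seed):
    """Hypothetical helper: deterministic shuffle of sample indices."""
    rng = np.random.default_rng(seed)  # seeded generator, as in create_dataset
    order = np.arange(num_samples)
    rng.shuffle(order)
    return order

# Two runs with the same seed visit samples in identical order.
run_a = make_epoch_order(8, seed=0)
run_b = make_epoch_order(8, seed=0)
assert (run_a == run_b).all()
```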
# [ModelZoo Homepage](#contents)
Please check the official [homepage](https://gitee.com/mindspore/mindspore/tree/master/model_zoo).