!5504 modify README.md

Merge pull request !5504 from wukesong/modify_read
mindspore-ci-bot 2020-08-29 19:44:34 +08:00 committed by Gitee
commit 037b8e9a96
4 changed files with 9 additions and 10 deletions


@@ -71,8 +71,7 @@ sh run_standalone_eval_ascend.sh [DATA_PATH] [CKPT_NAME]
 ## [Script and Sample Code](#contents)
 ```
 ├── model_zoo
 ├── README.md // descriptions about all the models
 ├── cv
 ├── alexnet
 ├── README.md // descriptions about alexnet
 ├── requirements.txt // package needed
@@ -116,8 +115,8 @@ sh run_standalone_train_ascend.sh cifar-10-batches-bin ckpt
 After training, the loss value will be achieved as follows:
-# grep "loss is " train.log
 ```
+# grep "loss is " train.log
 epoch: 1 step: 1, loss is 2.2791853
 ...
 epoch: 1 step: 1536, loss is 1.9366643
@@ -171,7 +170,7 @@ You can view the results through the file "log.txt". The accuracy of the test da
 # [Description of Random Situation](#contents)
-In dataset.py, we set the seed inside “create_dataset" function.
+In dataset.py, we set the seed inside ```create_dataset``` function.
 # [ModelZoo Homepage](#contents)
 Please check the official [homepage](https://gitee.com/mindspore/mindspore/tree/master/model_zoo).
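The hunk above only changes how `create_dataset` is quoted; the behavior it describes, a seed fixed inside the function so shuffling is reproducible, can be sketched in plain Python. The helper below is a hypothetical stand-in, not the actual `dataset.py` code:

```python
import random

# Hypothetical stand-in for dataset.py's create_dataset: fixing the
# shuffle seed inside the function makes every run see the same order.
def create_dataset(samples, seed=1):
    rng = random.Random(seed)  # the seed set "inside create_dataset"
    shuffled = list(samples)
    rng.shuffle(shuffled)
    return shuffled

# Same seed, same shuffle: runs are reproducible.
assert create_dataset(range(10)) == create_dataset(range(10))
```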


@@ -77,8 +77,7 @@ sh run_standalone_eval_ascend.sh [DATA_PATH] [CKPT_NAME]
 ## [Script and Sample Code](#contents)
 ```
 ├── model_zoo
 ├── README.md // descriptions about all the models
 ├── cv
 ├── lenet
 ├── README.md // descriptions about lenet
 ├── requirements.txt // package needed
@@ -181,7 +180,7 @@ You can view the results through the file "log.txt". The accuracy of the test da
 # [Description of Random Situation](#contents)
-In dataset.py, we set the seed inside “create_dataset" function.
+In dataset.py, we set the seed inside ```create_dataset``` function.
 # [ModelZoo Homepage](#contents)
 Please check the official [homepage](https://gitee.com/mindspore/mindspore/tree/master/model_zoo).


@@ -175,7 +175,7 @@ result: {'acc': 0.71976314102564111} ckpt=/path/to/checkpoint/mobilenet-200_625.
 | Parameters | | | |
 | -------------------------- | ----------------------------- | ------------------------- | -------------------- |
 | Model Version | V1 | | |
-| Resource | Huawei 910 | NV SMX2 V100-32G | Huawei 310 |
+| Resource | Ascend 910 | NV SMX2 V100-32G | Ascend 310 |
 | uploaded Date | 05/06/2020 | 05/22/2020 | |
 | MindSpore Version | 0.2.0 | 0.2.0 | 0.2.0 |
 | Dataset | ImageNet, 1.2W | ImageNet, 1.2W | ImageNet, 1.2W |


@@ -47,7 +47,8 @@ Dataset used: [imagenet](http://www.image-net.org/)
 ## [Mixed Precision](#contents)
 The [mixed precision](https://www.mindspore.cn/tutorial/zh-CN/master/advanced_use/mixed_precision.html) training method accelerates the deep learning neural network training process by using both the single-precision and half-precision data formats, and maintains the network precision achieved by the single-precision training at the same time. Mixed precision training can accelerate the computation process, reduce memory usage, and enable a larger model or batch size to be trained on specific hardware.
+For FP16 operators, if the input data type is FP32, the backend of MindSpore will automatically handle it with reduced precision. Users could check the reduced-precision operators by enabling INFO log and then searching reduce precision.
 # [Environment Requirements](#contents)
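The mixed-precision paragraph in the hunk above can be made concrete with a small NumPy sketch of the usual pattern, FP32 master weights with FP16 compute; this illustrates the idea only and is not MindSpore's actual implementation:

```python
import numpy as np

# Keep master weights in FP32; run the compute-heavy step in FP16.
w_master = np.ones(4, dtype=np.float32)   # FP32 master copy
x = np.full(4, 0.5, dtype=np.float16)     # FP16 activations

w_half = w_master.astype(np.float16)      # downcast for the FP16 step
grad_half = w_half * x                    # stand-in for an FP16 gradient

# Upcast the gradient before updating, so the FP32 weights keep precision.
w_master -= 0.1 * grad_half.astype(np.float32)
```

With a learning rate of 0.1 and a gradient of 0.5, this leaves the master weights near 0.95, still stored in FP32.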
@@ -228,7 +229,7 @@ acc=93.88%(TOP5)
 | Parameters | | | |
 | -------------------------- | ----------------------------- | ------------------------- | -------------------- |
-| Resource | Huawei 910 | NV SMX2 V100-32G | Huawei 310 |
+| Resource | Ascend 910 | NV SMX2 V100-32G | Ascend 310 |
 | uploaded Date | 06/30/2020 | 07/23/2020 | 07/23/2020 |
 | MindSpore Version | 0.5.0 | 0.6.0 | 0.6.0 |
 | Dataset | ImageNet, 1.2W | ImageNet, 1.2W | ImageNet, 1.2W |