From ee47bd13ce4fdf901becf344660f6a24f1f5deb3 Mon Sep 17 00:00:00 2001
From: wukesong
Date: Tue, 27 Oct 2020 21:01:46 +0800
Subject: [PATCH] modify readme

---
 model_zoo/official/cv/alexnet/README.md | 10 ++++++----
 model_zoo/official/cv/lenet/README.md   | 10 ++++++----
 2 files changed, 12 insertions(+), 8 deletions(-)

diff --git a/model_zoo/official/cv/alexnet/README.md b/model_zoo/official/cv/alexnet/README.md
index d682836985a..0e27ed8ec0a 100644
--- a/model_zoo/official/cv/alexnet/README.md
+++ b/model_zoo/official/cv/alexnet/README.md
@@ -30,6 +30,8 @@ AlexNet composition consists of 5 convolutional layers and 3 fully connected lay
 
 # [Dataset](#contents)
 
+Note that you can run the scripts with the dataset mentioned in the original paper or one widely used in the relevant domain/network architecture. In the following sections, we will introduce how to run the scripts using the related dataset below.
+
 Dataset used: [CIFAR-10]()
 
 - Dataset size:175M,60,000 32*32 colorful images in 10 classes
@@ -195,15 +197,15 @@ Before running the command below, please check the checkpoint path used for eval
 | -------------------------- | ------------------------------------------------------------| -------------------------------------------------|
 | Resource | Ascend 910; CPU 2.60GHz, 192cores; Memory, 755G | NV SMX2 V100-32G |
 | uploaded Date | 06/09/2020 (month/day/year) | 17/09/2020 (month/day/year) |
-| MindSpore Version | 0.5.0-beta | 0.7.0-beta |
+| MindSpore Version | 1.0.0 | 0.7.0-beta |
 | Dataset | CIFAR-10 | CIFAR-10 |
 | Training Parameters | epoch=30, steps=1562, batch_size = 32, lr=0.002 | epoch=30, steps=1562, batch_size = 32, lr=0.002 |
 | Optimizer | Momentum | Momentum |
 | Loss Function | Softmax Cross Entropy | Softmax Cross Entropy |
 | outputs | probability | probability |
-| Loss | 0.0016 | 0.01 |
-| Speed | 21 ms/step | 16.8 ms/step |
-| Total time | 17 mins | 14 mins |
+| Loss | 0.08 | 0.01 |
+| Speed | 7.3 ms/step | 16.8 ms/step |
+| Total time | 6 mins | 14 mins |
 | Checkpoint for Fine tuning | 445M (.ckpt file) | 445M (.ckpt file) |
 | Scripts | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/alexnet | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/alexnet |
 
diff --git a/model_zoo/official/cv/lenet/README.md b/model_zoo/official/cv/lenet/README.md
index a3f12d75f0e..3ad0d844367 100644
--- a/model_zoo/official/cv/lenet/README.md
+++ b/model_zoo/official/cv/lenet/README.md
@@ -30,6 +30,8 @@ LeNet is very simple, which contains 5 layers. The layer composition consists of
 
 # [Dataset](#contents)
 
+Note that you can run the scripts with the dataset mentioned in the original paper or one widely used in the relevant domain/network architecture. In the following sections, we will introduce how to run the scripts using the related dataset below.
+
 Dataset used: [MNIST]()
 
 - Dataset size:52.4M,60,000 28*28 in 10 classes
@@ -165,16 +167,16 @@ You can view the results through the file "log.txt". The accuracy of the test da
 | Parameters | LeNet |
 | -------------------------- | ----------------------------------------------------------- |
 | Resource | Ascend 910 ;CPU 2.60GHz,192cores;Memory,755G |
-| uploaded Date | 06/09/2020 (month/day/year) |
-| MindSpore Version | 0.5.0-beta |
+| uploaded Date | 09/16/2020 (month/day/year) |
+| MindSpore Version | 1.0.0 |
 | Dataset | MNIST |
 | Training Parameters | epoch=10, steps=1875, batch_size = 32, lr=0.01 |
 | Optimizer | Momentum |
 | Loss Function | Softmax Cross Entropy |
 | outputs | probability |
 | Loss | 0.002 |
-| Speed | 1.70 ms/step |
-| Total time | 43.1s | |
+| Speed | 1.071 ms/step |
+| Total time | 32.1s |
 | Checkpoint for Fine tuning | 482k (.ckpt file) |
 | Scripts | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/lenet |
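
The performance tables touched by this patch list a common training configuration (Momentum optimizer, softmax cross-entropy loss, batch_size = 32, lr = 0.002 for AlexNet / 0.01 for LeNet). As a rough, hedged illustration only, the sketch below shows how such a configuration is typically assembled with MindSpore's high-level `Model` API; the momentum value, epoch count, and the `AlexNet`/`create_dataset` names in the usage comment are placeholders, not values or APIs taken from this patch.

```python
# Hedged sketch: wiring together the optimizer, loss, and metric named in the
# tables above with MindSpore's Model API. Network and dataset construction are
# assumed to come from the repository's own scripts and are not shown here.
import mindspore.nn as nn
from mindspore.train import Model
from mindspore.nn.metrics import Accuracy


def build_model(network, learning_rate=0.002):
    """Wrap `network` with the loss/optimizer listed in the performance table."""
    loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
    # momentum=0.9 is a common default and an assumption here; the README does not state it
    optimizer = nn.Momentum(network.trainable_params(), learning_rate, momentum=0.9)
    return Model(network, loss_fn=loss, optimizer=optimizer, metrics={"Accuracy": Accuracy()})


# Hypothetical usage, assuming the repo's scripts provide AlexNet and create_dataset:
# model = build_model(AlexNet(num_classes=10), learning_rate=0.002)
# model.train(epoch=30, train_dataset=create_dataset("./cifar-10-batches-bin", batch_size=32))
```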