edit readme

Author: yoonlee666, 2020-10-28 10:36:17 +08:00
Parent: 5b28016b4d
Commit: acb44b91b9
2 changed files with 10 additions and 8 deletions


@@ -450,7 +450,7 @@ The result will be as follows:
| Model Version | BERT_base | BERT_base |
| Resource | Ascend 910, cpu:2.60GHz 192cores, memory:755G | NV SMX2 V100-32G |
| uploaded Date | 08/22/2020 | 05/06/2020 |
-| MindSpore Version | 0.6.0 | 0.3.0 |
+| MindSpore Version | 1.0.0 | 1.0.0 |
| Dataset | cn-wiki-128(4000w) | ImageNet |
| Training Parameters | src/config.py | src/config.py |
| Optimizer | Lamb | Momentum |
@@ -463,14 +463,14 @@ The result will be as follows:
| Total time | 73h | |
| Params (M) | 110M | |
| Checkpoint for Fine tuning | 1.2G(.ckpt file) | |
| Scripts | [BERT_base](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/nlp/bert) | |
| Parameters | Ascend | GPU |
| -------------------------- | ---------------------------------------------------------- | ------------------------- |
| Model Version | BERT_NEZHA | BERT_NEZHA |
| Resource | Ascend 910, cpu:2.60GHz 192cores, memory:755G | NV SMX2 V100-32G |
| uploaded Date | 08/20/2020 | 05/06/2020 |
-| MindSpore Version | 0.6.0 | 0.3.0 |
+| MindSpore Version | 1.0.0 | 1.0.0 |
| Dataset | cn-wiki-128(4000w) | ImageNet |
| Training Parameters | src/config.py | src/config.py |
| Optimizer | Lamb | Momentum |
@@ -483,6 +483,7 @@ The result will be as follows:
| Total time | 200h | |
| Params (M) | 340M | |
| Checkpoint for Fine tuning | 3.2G(.ckpt file) | |
+| Scripts | [BERT_NEZHA](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/nlp/bert) | |
#### Inference Performance
@@ -491,7 +492,7 @@ The result will be as follows:
| Model Version | | |
| Resource | Ascend 910 | NV SMX2 V100-32G |
| uploaded Date | 08/22/2020 | 05/22/2020 |
-| MindSpore Version | 0.6.0 | 0.2.0 |
+| MindSpore Version | 1.0.0 | 1.0.0 |
| Dataset | cola, 1.2W | ImageNet, 1.2W |
| batch_size | 32(1P) | 130(8P) |
| Accuracy | 0.588986 | ACC1[72.07%] ACC5[90.90%] |


@@ -354,7 +354,7 @@ The best acc is 0.891176
| Model Version | TinyBERT | TinyBERT |
| Resource | Ascend 910, cpu:2.60GHz 192cores, memory:755G | NV SMX2 V100-32G, cpu:2.10GHz 64cores, memory:251G |
| uploaded Date | 08/20/2020 | 08/24/2020 |
-| MindSpore Version | 0.6.0 | 0.7.0 |
+| MindSpore Version | 1.0.0 | 1.0.0 |
| Dataset | cn-wiki-128 | cn-wiki-128 |
| Training Parameters | src/gd_config.py | src/gd_config.py |
| Optimizer | AdamWeightDecay | AdamWeightDecay |
@@ -365,6 +365,7 @@ The best acc is 0.891176
| Total time | 17.3h (3 epochs, 8p) | 48h (3 epochs, 8p) |
| Params (M) | 15M | 15M |
| Checkpoint for task distill| 74M(.ckpt file) | 74M(.ckpt file) |
+| Scripts | [TinyBERT](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/nlp/tinybert) | |
#### Inference Performance
@@ -373,7 +374,7 @@ The best acc is 0.891176
| Model Version | | |
| Resource | Ascend 910 | NV SMX2 V100-32G |
| uploaded Date | 08/20/2020 | 08/24/2020 |
-| MindSpore Version | 0.6.0 | 0.7.0 |
+| MindSpore Version | 1.0.0 | 1.0.0 |
| Dataset | SST-2 | SST-2 |
| batch_size | 32 | 32 |
| Accuracy | 0.902777 | 0.9086 |
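
The tables above reference saved checkpoints (for example, the 1.2G and 3.2G .ckpt files kept for fine tuning). Below is a minimal sketch of restoring such a checkpoint with MindSpore's serialization API; the BertModel/BertConfig names and the src.bert_model import path are assumptions based on the linked model zoo scripts, and the checkpoint filename is illustrative.

```python
# Minimal sketch (not part of this commit): load a saved .ckpt so its weights
# can be reused for fine-tuning. BertModel/BertConfig and the import path are
# assumed from the model zoo's src/bert_model.py; adjust to the actual layout.
from mindspore.train.serialization import load_checkpoint, load_param_into_net
from src.bert_model import BertModel, BertConfig  # assumed module path

config = BertConfig(batch_size=32)         # other fields keep their defaults
net = BertModel(config, is_training=True)

# load_checkpoint returns a dict mapping parameter names to Parameter objects
param_dict = load_checkpoint("bert_base.ckpt")  # illustrative filename

# copy the loaded parameters into the freshly built network
load_param_into_net(net, param_dict)
```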