fix README link

This commit is contained in:

parent 5e5489d59f
commit 289f856955
@@ -84,7 +84,7 @@ other datasets need to use the same format as WiderFace.
 - Hardware(Ascend)
     - Prepare hardware environment with Ascend processor. If you want to try Ascend, please send the [application form](https://obs-9be7.obs.cn-east-2.myhuaweicloud.com/file/other/Ascend%20Model%20Zoo%E4%BD%93%E9%AA%8C%E8%B5%84%E6%BA%90%E7%94%B3%E8%AF%B7%E8%A1%A8.docx) to ascend@huawei.com. Once approved, you can get the resources.
 - Framework
-    - [MindSpore](https://cmc-szv.clouddragon.huawei.com/cmcversion/index/search?searchKey=Do-MindSpore%20V100R001C00B622)
+    - [MindSpore](https://www.mindspore.cn/install/en)
 - For more information, please check the resources below:
     - [MindSpore tutorials](https://www.mindspore.cn/tutorial/training/en/master/index.html)
     - [MindSpore Python API](https://www.mindspore.cn/doc/api_python/en/master/index.html)
@@ -188,7 +188,7 @@ class 1 precision is 88.01%, recall is 82.77%
 | Loss | ~0.008 |
 | Accuracy (8p) | precision=0.8854, recall=0.8024 |
 | Total time (8p) | 4h |
-| Scripts | [deeptext script](https://gitee.com/mindspore/mindspore/tree/r1.1/mindspore/official/cv/deeptext) |
+| Scripts | [deeptext script](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/deeptext) |

 #### Inference Performance
@@ -197,7 +197,7 @@ Calculated!{"precision": 0.814796668299853, "recall": 0.8006740491092923, "hmean
 | Total time | 1pc: 75.48 h; 8pcs: 10.01 h |
 | Parameters (M) | 27.36 |
 | Checkpoint for Fine tuning | 109.44M (.ckpt file) |
-| Scripts | <https://gitee.com/mindspore/mindspore/tree/master/model_zoo/psenet> |
+| Scripts | <https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/psenet> |

 ### Inference Performance
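The `hmean` in the evaluation output quoted in the hunk header above is presumably the harmonic mean of precision and recall (the F1 score); a minimal sketch, not code from the repository:

```python
def hmean(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (F1 score)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Values taken from the evaluation output quoted above.
print(hmean(0.814796668299853, 0.8006740491092923))  # ~0.8076
```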
@@ -195,7 +195,7 @@ Calculated!{"precision": 0.8147966668299853,"recall":0.8006740491092923,"h
 | Total time | 1pc: 75.48 h; 4pcs: 18.87 h |
 | Parameters (M) | 27.36 |
 | Checkpoint for Fine tuning | 109.44M (.ckpt file) |
-| Scripts | <https://gitee.com/mindspore/mindspore/tree/master/model_zoo/psenet> |
+| Scripts | <https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/psenet> |

 ### Inference Performance
@@ -1,4 +1,4 @@
-![](https://www.mindspore.cn/static/img/logo.a3e472c9.png)
+![](https://www.mindspore.cn/static/img/logo_black.6a5c850d.png)

 <!-- TOC -->
@@ -1,4 +1,4 @@
-![](https://www.mindspore.cn/static/img/logo.a3e472c9.png)
+![](https://www.mindspore.cn/static/img/logo_black.6a5c850d.png)

 <!-- TOC -->
@@ -47,7 +47,7 @@ BERT (Devlin et al., 2018) pre-trains a Transformer on rich text from masked corpora

 Inspired by BERT, GPT, and other language models, Microsoft set out to build on them with [Masked Sequence to Sequence (MASS) pre-training for language generation](https://www.microsoft.com/en-us/research/uploads/prod/2019/06/MASS-paper-updated-002.pdf). MASS's parameter k is important: it controls the length of the masked fragment. BERT and GPT are special cases, with k equal to 1 or to the sentence length.

-[Introducing MASS – A pre-training method that outperforms BERT and GPT in sequence-to-sequence language generation tasks](https://www.microsoft.com/en-us/research/blog/introduction-mass-a-pre-training-method-thing-forts-bert-and-gpt-in-sequence-to-sequence-language-generate-tasks/)
+[Introducing MASS – A pre-training method that outperforms BERT and GPT in sequence-to-sequence language generation tasks](https://www.microsoft.com/en-us/research/blog/introducing-mass-a-pre-training-method-that-outperforms-bert-and-gpt-in-sequence-to-sequence-language-generation-tasks/)

 [Paper](https://www.microsoft.com/en-us/research/uploads/prod/2019/06/MASS-paper-updated-002.pdf): Song, Kaitao, Xu Tan, Tao Qin, Jianfeng Lu and Tie-Yan Liu. "MASS: Masked Sequence to Sequence Pre-training for Language Generation." ICML (2019).
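As a hedged illustration of the k parameter described in the diff context above (a sketch of the idea, not code from this repository): MASS masks one contiguous fragment of length k, which degenerates to BERT-style single-token masking at k=1 and to GPT-style whole-sentence masking at k equal to the sentence length.

```python
import random

MASK = "[MASK]"

def mass_mask(tokens, k):
    """Mask a contiguous fragment of length k, as in MASS.

    k=1 reduces to BERT-style single-token masking;
    k=len(tokens) reduces to GPT-style full-sentence masking.
    """
    start = random.randrange(len(tokens) - k + 1)
    masked = tokens[:start] + [MASK] * k + tokens[start + k:]
    fragment = tokens[start:start + k]  # the decoder's prediction target
    return masked, fragment

tokens = "the quick brown fox jumps over the lazy dog".split()
print(mass_mask(tokens, k=3))
```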
@@ -655,4 +655,4 @@ The model has been validated on Ascend environment, not validated on CPU and GPU

 # ModelZoo Homepage

-[Link](https://gitee.com/mindspore/mindspore/tree/master/mindspore/model_zoo)
+[Link](https://gitee.com/mindspore/mindspore/tree/master/model_zoo)
@@ -192,7 +192,7 @@ Parameters for both training and evaluation can be set in config.py
 | Speed | 1pc: 160 samples/sec; |
 | Total time | 1pc: 20 mins; |
 | Checkpoint for Fine tuning | 198.73M(.ckpt file) |
-| Scripts | [music_auto_tagging script](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/audio/fcn-4) |
+| Scripts | [music_auto_tagging script](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/audio/fcn-4) |

 ## [ModelZoo Homepage](#contents)
@@ -79,7 +79,7 @@ Dataset used: [COCO2017](https://cocodataset.org/)
 - Hardware(Ascend)
     - Prepare hardware environment with Ascend processor. If you want to try Ascend, please send the [application form](https://obs-9be7.obs.cn-east-2.myhuaweicloud.com/file/other/Ascend%20Model%20Zoo%E4%BD%93%E9%AA%8C%E8%B5%84%E6%BA%90%E7%94%B3%E8%AF%B7%E8%A1%A8.docx) to ascend@huawei.com. Once approved, you can get the resources.
 - Framework
-    - [MindSpore](https://cmc-szv.clouddragon.huawei.com/cmcversion/index/search?searchKey=Do-MindSpore%20V100R001C00B622)
+    - [MindSpore](https://www.mindspore.cn/install/en)
 - For more information, please check the resources below:
     - [MindSpore tutorials](https://www.mindspore.cn/tutorial/training/en/master/index.html)
     - [MindSpore Python API](https://www.mindspore.cn/doc/api_python/en/master/index.html)