!21559 Add faq in modelzoo
Merge pull request !21559 from chenhaozhe/code_docs_add_modelzoo_fqa
commit 74eefbd390
@ -113,3 +113,9 @@ MindSpore is Apache 2.0 licensed. Please see the LICENSE file.
## License
[Apache License 2.0](https://gitee.com/mindspore/mindspore/blob/master/LICENSE)
## FAQ
- **Q: How to resolve out-of-memory errors such as *Failed to alloc memory pool memory* while using `PYNATIVE_MODE`?**
**A**: `PYNATIVE_MODE` usually requires more memory than `GRAPH_MODE`, especially during training, which has to keep activations for back propagation. You could try using a smaller batch size.
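The effect of batch size on memory can be sketched with a back-of-the-envelope estimate. The sizes below are hypothetical, not measured from any real model; the point is only that activation memory grows roughly linearly with batch size, so halving the batch roughly halves the pool that training has to allocate.

```python
# Hypothetical sizes: a fixed cost for weights/optimizer state plus a
# per-sample cost for activations kept around for back propagation.

def activation_memory_mb(batch_size, per_sample_mb=48.0, fixed_mb=1024.0):
    """Rough training-memory estimate in MB (illustrative numbers only)."""
    return fixed_mb + batch_size * per_sample_mb

for batch in (64, 32, 16):
    print(f"batch={batch:3d} -> ~{activation_memory_mb(batch):.0f} MB")
```

Only the per-sample term shrinks when the batch does, which is why reducing batch size helps with *Failed to alloc memory pool memory* but cannot shrink memory below the fixed cost of the model itself.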
@ -113,3 +113,9 @@ MindSpore is licensed under Apache 2.0. Please see the LICENSE file.
## License
[Apache License 2.0](https://gitee.com/mindspore/mindspore/blob/master/LICENSE)
## FAQ
- **Q: How to resolve out-of-memory errors such as *Failed to alloc memory pool memory* when running a model in `PYNATIVE_MODE`?**
**A**: `PYNATIVE_MODE` usually uses more memory than `GRAPH_MODE`, especially in training graphs that need back-propagation computation. You could try a smaller batch size.
@ -789,6 +789,8 @@ Please check the official [homepage](https://gitee.com/mindspore/mindspore/tree/
# FAQ
Refer to the [ModelZoo FAQ](https://gitee.com/mindspore/mindspore/tree/master/model_zoo#FAQ) for some common questions.
- **Q: How to resolve continual overflow during training?**
**A**: Continual overflow is usually caused by a learning rate that is too high.
@ -747,6 +747,8 @@ A random seed is set in run_pretrain.py to ensure that each node in distributed training
# FAQ
Refer first to the [ModelZoo FAQ](https://gitee.com/mindspore/mindspore/tree/master/model_zoo#FAQ) to look up common questions.
- **Q: What should I do when continual overflow occurs during training?**
**A**: Continual overflow is usually caused by a learning rate that is too high to let training converge. Consider editing the parameters in the yaml configuration file: lower `learning_rate` to reduce the initial learning rate, or raise `power` to speed up learning-rate decay.
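The interaction between these two knobs can be sketched with a standard polynomial-decay schedule. This is a sketch under the assumption that `learning_rate` and `power` feed a schedule of this shape; the exact schedule used by the model may differ.

```python
# Polynomial learning-rate decay (assumed form):
#   lr(step) = (base_lr - end_lr) * (1 - step/total_steps)**power + end_lr

def poly_decay_lr(step, total_steps, base_lr, end_lr=0.0, power=1.0):
    """Learning rate at a given step under polynomial decay."""
    frac = 1.0 - min(step, total_steps) / total_steps
    return (base_lr - end_lr) * frac ** power + end_lr

mid, total = 5000, 10000
# Lowering base_lr scales the whole schedule down; raising power makes
# the rate fall off faster. Either reduces the step size early in
# training, where continual overflow typically shows up.
print(poly_decay_lr(mid, total, base_lr=1e-3, power=1.0))
print(poly_decay_lr(mid, total, base_lr=1e-3, power=2.0))
print(poly_decay_lr(mid, total, base_lr=5e-4, power=1.0))
```

At the halfway step, doubling `power` or halving `learning_rate` both cut the effective rate in half, so either change is a reasonable first response to persistent overflow.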