modify readme

This commit is contained in:
meixiaowei 2020-07-29 16:46:52 +08:00
parent 924a34acb8
commit e5b9776b86
1 changed files with 5 additions and 4 deletions


@@ -94,7 +94,8 @@ sh run_distribute_train.sh [MINDSPORE_HCCL_CONFIG_PATH] [PRETRAINED_MODEL]
sh run_standalone_train.sh [PRETRAINED_MODEL]
```
> About rank_table.json, you can refer to the [distributed training tutorial](https://www.mindspore.cn/tutorial/en/master/advanced_use/distributed_training.html).
> hccl.json, which is specified by MINDSPORE_HCCL_CONFIG_PATH, is needed when you are running a distributed task. You can generate it by using the [hccl_tools](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/utils/hccl_tools).
> As for PRETRAINED_MODEL, if it is not set, the model will be trained from the very beginning. Ready-made pretrained models are not available now. Stay tuned.
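For illustration, a distributed launch might look like the sketch below. The `hccl_tools.py` invocation and all paths are assumptions for this example, not part of this repository's documented interface; check the hccl_tools README for the exact arguments.

```
# assumed: generate hccl.json for an 8-device host with the hccl_tools helper
python hccl_tools.py --device_num "[0,8)"
# placeholder paths: point to the generated hccl.json and an optional pretrained checkpoint
sh run_distribute_train.sh /path/to/hccl.json /path/to/pretrained.ckpt
```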
#### Result
@@ -118,10 +119,10 @@ epoch: 12 step: 7393 ,rpn_loss: 0.06482, rcnn_loss: 0.47681, rpn_cls_loss: 0.047
```
# infer
sh run_eval.sh [VALIDATION_DATASET_PATH] [CHECKPOINT_PATH]
sh run_eval.sh [VALIDATION_ANN_FILE_JSON] [CHECKPOINT_PATH]
```
> checkpoint can be produced in the training process.
> As for the COCO2017 dataset, VALIDATION_ANN_FILE_JSON refers to annotations/instances_val2017.json in the dataset directory.
> checkpoints can be produced and saved during training, in folders whose names begin with "train/checkpoint" or "train_parallel*/checkpoint".
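As a hedged usage sketch, an evaluation run on COCO2017 might be invoked as follows; the annotation path and the checkpoint file name are placeholders, adjust them to your own dataset and training output locations.

```
# placeholder paths: COCO2017 validation annotations and a checkpoint saved during training
sh run_eval.sh /path/to/coco2017/annotations/instances_val2017.json train_parallel0/checkpoint/faster_rcnn-12_7393.ckpt
```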
#### Result