From 0d20b99ad16d6dfdbc20f7f9f45dbb4c9a943494 Mon Sep 17 00:00:00 2001
From: yuchaojie
Date: Tue, 27 Oct 2020 19:16:42 +0800
Subject: [PATCH] update transformer's ms version

---
 model_zoo/official/nlp/transformer/README.md | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/model_zoo/official/nlp/transformer/README.md b/model_zoo/official/nlp/transformer/README.md
index dbdd8037607..607b6474f9d 100644
--- a/model_zoo/official/nlp/transformer/README.md
+++ b/model_zoo/official/nlp/transformer/README.md
@@ -33,6 +33,8 @@ Specifically, Transformer contains six encoder modules and six decoder modules.
 
 # [Dataset](#contents)
 
+Note that you can run the scripts based on the dataset mentioned in the original paper or one widely used in the relevant domain/network architecture. In the following sections, we will introduce how to run the scripts using the related dataset below.
+
 - *WMT Englis-German* for training.
 - *WMT newstest2014* for evaluation.
 
@@ -236,8 +238,8 @@ Parameters for learning rate:
 | Parameters                 | Ascend                                                         |
 | -------------------------- | -------------------------------------------------------------- |
 | Resource                   | Ascend 910                                                     |
-| uploaded Date              | 06/09/2020 (month/day/year)                                    |
-| MindSpore Version          | 0.5.0-beta                                                     |
+| uploaded Date              | 09/15/2020 (month/day/year)                                    |
+| MindSpore Version          | 1.0.0                                                          |
 | Dataset                    | WMT Englis-German                                              |
 | Training Parameters        | epoch=52, batch_size=96                                        |
 | Optimizer                  | Adam                                                           |
@@ -255,8 +257,8 @@
 | Parameters          | Ascend                      |
 | ------------------- | --------------------------- |
 | Resource            | Ascend 910                  |
-| Uploaded Date       | 06/09/2020 (month/day/year) |
-| MindSpore Version   | 0.5.0-beta                  |
+| Uploaded Date       | 09/15/2020 (month/day/year) |
+| MindSpore Version   | 1.0.0                       |
 | Dataset             | WMT newstest2014            |
 | batch_size          | 1                           |
 | outputs             | BLEU score                  |